Hi, thanks for the info. I've started making changes :).
I would like to get the project robust and performant enough to handle a set of 100,000 images. To make this practical, there seem to be a few areas to attack:
1). The O(n^2) number of comparisons (n(n-1)/2 pairs), which quickly becomes a very large number -- for n = 100,000 that's roughly 5 billion pairs. However, I have a couple of ideas on how to reduce it.
2). Loading images. Even though this is an O(n) process, the LoadImage API being used currently is very slow for this usage.
3). The SSIM calculation. I'm guessing this could be a lot faster by using a SIMD helper like Mono.Simd, which uses Intel SSE3 to do batch floating-point ops.
4). Robustness. Since this is such a long-running operation for n = 100,000, it really helps to have detailed progress info and possibly the ability to restart after a crash from where it left off.
Thanks for offering to make me the maintainer but for now I'm just experimenting.
If you have any ideas, please drop me a note anytime.
- I don't know of any other forks.
- Yeah, work stopped--not necessarily because it was school related (there were other things I wanted to do with this project), but mostly because everyone who worked on it is busy.
Feel free to fork away, or if you want to I can make you a maintainer on this project and you're welcome to hack away. Let me know if you have any questions and I'll be happy to answer.