I have a couple of JavaScript libraries (angular-ahdin, J-I-C) that I can use to compress an image the user uploaded before I submit it to the back end.
All the libraries I've seen take a quality parameter and use JPEG compression to reduce the file. The problem is that there's no way to know, before compressing, what file size a given quality value will produce.
My idea is to use a "binary search" style algorithm, trying different quality percentages until I get an image that is just under the target maximum file size.
It would start at 50% JPEG quality. If the compressed image is under the target file size, try 75% quality; otherwise try 25%, and so on. It would narrow the quality down to within about 1% granularity in at most seven iterations, at which point I would stop.
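Something along these lines is what I have in mind, sketched against the plain canvas API rather than either library (their exact signatures may differ), with quality expressed as 0–1 instead of a percentage:

```javascript
// Sketch only: binary-search the JPEG quality until the blob fits under maxBytes.
// Returns the best-fitting Blob found, or null if even the lowest quality tried
// is still too large.
function compressToTarget(img, maxBytes, iterations = 7) {
  const canvas = document.createElement('canvas');
  canvas.width = img.naturalWidth;
  canvas.height = img.naturalHeight;
  canvas.getContext('2d').drawImage(img, 0, 0);

  // Wrap canvas.toBlob in a promise so each attempt can be awaited.
  const toBlob = quality =>
    new Promise(resolve => canvas.toBlob(resolve, 'image/jpeg', quality));

  return (async () => {
    let lo = 0, hi = 1;        // quality search bounds
    let best = null;           // largest blob found that still fits
    for (let i = 0; i < iterations; i++) {
      const q = (lo + hi) / 2; // 0.5 first, then 0.25 or 0.75, and so on
      const blob = await toBlob(q);
      if (blob.size <= maxBytes) {
        best = blob;           // fits: push quality higher
        lo = q;
      } else {
        hi = q;                // too big: push quality lower
      }
    }
    return best;
  })();
}
```

Usage would be something like `compressToTarget(imgElement, 500 * 1024).then(blob => upload(blob))`, keeping the original file as a fallback if no quality setting fits.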
Assuming there is not a library which already has this feature, is there a better way than binary search? Is there any image research which indicates a better seed value than 50%?