I'm interested in calculating what percentage of the area of an image is represented by an irregular shape. There's probably some really easy way of calculating it, but I've not spotted it.

My simplest solution is to make the area black and the rest of the image white, then average that out to a single grey. An alternative would be to create a black-and-white image and use the histogram to give an approximation: increase exposure until all the pixels turn white, then fill the cropped area with black. I've already worked out how to get a monochrome image where the used pixels are white and the unused pixels are black, so using the histogram would be easy.

What I've done so far is apply an average filter, so the whole image becomes a uniform pale grey. For a perfect lens with no distortion, all the values would be at 255. The first, moderately distorted lens I tried this morning came out at 236; the second, very distorted one, at only 192. It's quite an easy way of coming up with a score.

This is what it looks like before applying the average filter: the white area represents sensor pixels that contributed to the final OOC JPEG, and the black area represents pixels that were not used.

In the end, I went with the average method. For example, this Canon G5 X II image leaves 21% of the sensor's pixels unused.
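The average method above can be sketched in a few lines. This is a minimal illustration, not the exact tool used for the post: it assumes a mask where used pixels are 255 (white) and unused pixels are 0 (black), and the function name `unused_fraction` and the 4×4 example mask are made up for demonstration.

```python
def unused_fraction(mask):
    """Estimate the fraction of unused pixels from a binary mask.

    mask: 2D list of pixel values where 255 = used (white) and 0 = unused
    (black). Averaging the whole mask down to one grey level and comparing
    it to 255 gives the proportion of black (unused) pixels -- the same
    idea as applying an average filter and reading off the resulting grey.
    """
    total = sum(sum(row) for row in mask)
    count = sum(len(row) for row in mask)
    mean = total / count          # the "pale grey" value, 0..255
    return 1.0 - mean / 255.0     # perfect lens: mean 255, fraction 0.0

# Hypothetical 4x4 mask: 12 used pixels, 4 unused (cropped corners).
mask = [
    [255, 255, 255, 255],
    [255, 255, 255, 255],
    [0,   255, 255,   0],
    [0,   255, 255,   0],
]
print(round(unused_fraction(mask) * 100))  # 25 (% of pixels unused)
```

The same conversion turns the grey scores quoted above into percentages: a grey of 236 corresponds to roughly (255 − 236) / 255 ≈ 7.5% unused, and 192 to roughly 24.7%. The histogram approach is equivalent: counting the black pixels in the histogram and dividing by the total pixel count gives the same fraction.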