Better image coloring logic/algorithm
I am developing an iOS app in which the user can change the color of part of an image, say a Tea Cup, by touching it. I am using a flood-fill algorithm to fill colors, so the user only has to tap the Tea Cup to change its color. That works fine. But the final color looks a little different from the replacement color. I am having trouble finding better logic to convert the object's (Tea Cup's) color to the selected color while taking its saturation and lightness into account. I am using the following logic to get the result color. I am representing a color as (hue, saturation, value).

touchedColor = (tchd_h, tchd_s, tchd_v)
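For reference, here is a minimal Swift sketch of one way to build the result color per pixel, assuming the result keeps the pixel's own saturation and brightness (value) and only takes the hue from the replacement color; the helper name `resultColor(forPixel:replacement:)` is just for illustration, not part of my actual code:

    import UIKit

    // Builds the color written back into a flood-filled pixel.
    // Assumption: keep the pixel's saturation and brightness so shading and
    // highlights survive, and only move its hue to the replacement hue.
    func resultColor(forPixel pixel: UIColor, replacement: UIColor) -> UIColor {
        var pxlH: CGFloat = 0, pxlS: CGFloat = 0, pxlV: CGFloat = 0, pxlA: CGFloat = 0
        var repH: CGFloat = 0, repS: CGFloat = 0, repV: CGFloat = 0, repA: CGFloat = 0

        _ = pixel.getHue(&pxlH, saturation: &pxlS, brightness: &pxlV, alpha: &pxlA)
        _ = replacement.getHue(&repH, saturation: &repS, brightness: &repV, alpha: &repA)

        return UIColor(hue: repH, saturation: pxlS, brightness: pxlV, alpha: pxlA)
    }

With this approach the recolored cup keeps its original shading, but because the replacement's saturation and value are ignored, the overall result can still look noticeably different from the chosen replacement color.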