So I am attempting to hard-code a Saturation filter based on a built-in function from another language (Graphics Mode Saturation).
It is all to do with luminance / HSB and colour space etc., with no record of the algorithmic process.
This thread will be dedicated to my findings, if anyone wishes to contribute to ripping this function apart!
Note: all values are between 0 and 1.
The image shows a background colour set to an initial random R1 = G1 = B1 (the middle 3 values).
A rectangle with random R2, G2, B2 values is drawn over this background using the application's built-in screen filter.
Those rectangle values are completely irrelevant at this stage and should be ignored for now (the top 3 values).
The results always show R3 = G3, but B3 is dependent on the initial R1 = G1 = B1 value, and B3 always comes out greater than R1 = G1. (The bottom 3 values are the readings.)
I have plotted random B1 (background) values against B3 (result) values.
Thankfully it's linear! And any value of (R1 + G1) > 0.5 results in B3 = 1.
To make the scenario easier:
If I set the background to 0.2, 0.2, 0.2, the result is R3 = G3 = 0.176471 and B3 = 0.376471.
If I set the background to 0.4, 0.4, 0.4, the result is R3 = G3 = 0.356863 and B3 = 0.756863.
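Treating those two greyscale samples as points on a straight line (as the plot suggests), the slope and intercept of both mappings can be recovered directly. This is just arithmetic on the numbers quoted above, not the application's actual code, so treat the coefficients as a working hypothesis:

```python
# Two measured samples from the post: (background grey B1, result R3=G3, result B3).
samples = [(0.2, 0.176471, 0.376471),
           (0.4, 0.356863, 0.756863)]

(x0, r0, b0), (x1, r1, b1) = samples

def line_through(xa, ya, xb, yb):
    """Slope and intercept of the straight line through two points."""
    slope = (yb - ya) / (xb - xa)
    return slope, ya - slope * xa

slope_r, icpt_r = line_through(x0, r0, x1, r1)
slope_b, icpt_b = line_through(x0, b0, x1, b1)

# Express the coefficients in 1/255 steps (the colour depth): both slopes and
# both intercepts land almost exactly on whole multiples of 1/255.
print(slope_r * 255, icpt_r * 255)   # roughly 230 and -1
print(slope_b * 255, icpt_b * 255)   # roughly 485 and -1
```

If the fit is right, R3 ≈ (230·B1 − 1)/255 and B3 ≈ (485·B1 − 1)/255 for this greyscale case, which would explain why B3 climbs faster and saturates to 1 first.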
If I set the background >~ 0.52745094895362, then R3 = G3 = B3 = 1.
If I set the background <~ 0.0019607840804383156516, the result is R3 = G3 = B3 = 0,
while 0.0019607840804383156517 results in R3 = G3 = 0.00392157 and B3 = 0.00784314.
Note that 0.00392157 = 1/255, the colour depth.
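As a quick sanity check on that 1/255 observation, every reading quoted in this post multiplies out to an (almost exact) whole 8-bit level, which suggests the filter is quantising to 255 steps internally:

```python
# Readings quoted above; each should sit on a 1/255 step of an 8-bit channel.
readings = [0.176471, 0.376471, 0.356863, 0.756863, 0.00392157, 0.00784314]

for v in readings:
    level = v * 255
    # Each scaled value is within rounding error of a whole 8-bit level.
    print(f"{v:>10} -> {level:10.5f} ~ level {round(level)}")
```

(The readings map to levels 45, 96, 91, 193, 1 and 2 respectively.)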