I have a 2D bitmap-like array of, let's say, 500×500 values. I'm trying to create a linear gradient on the array, so the resulting bitmap would look something like this (in grayscale):
This is really a math question, so it might be debatable whether it belongs on Stack Overflow, but anyway: you need to project the coordinates of each point in the image onto the axis of your gradient and use that coordinate to determine the color.
Mathematically, what I mean is: let the gradient run from a starting point (x1, y1) to an ending point (x2, y2). Compute

A = x2 - x1
B = y2 - y1

then

C1 = A * x1 + B * y1

for the starting point and

C2 = A * x2 + B * y2

for the ending point (C2 should be larger than C1). For each pixel (x, y), compute

C = A * x + B * y

If C <= C1, use the starting color; if C >= C2, use the ending color; otherwise, use a weighted average:

(start_color * (C2 - C) + end_color * (C - C1)) / (C2 - C1)
I did some quick tests to check that this works in practice.
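The question doesn't specify a language, so here's a minimal sketch of the projection idea in Python with NumPy; the function name, the NumPy dependency, and the use of np.clip (a vectorized way of expressing the three cases above) are my own choices, not anything from the original post:

    import numpy as np

    def linear_gradient(width, height, p1, p2, start_color, end_color):
        """Fill a (height, width) array with a linear gradient running
        from start_color at p1 = (x1, y1) to end_color at p2 = (x2, y2).
        Assumes p1 != p2, so the projection interval has nonzero length."""
        x1, y1 = p1
        x2, y2 = p2
        a = x2 - x1               # A = x2 - x1
        b = y2 - y1               # B = y2 - y1
        c1 = a * x1 + b * y1      # projection of the starting point
        c2 = a * x2 + b * y2      # projection of the ending point

        # Project every pixel (x, y) onto the gradient axis: C = A*x + B*y
        y, x = np.mgrid[0:height, 0:width]
        c = a * x + b * y

        # Clamping t to [0, 1] covers the C <= C1 and C >= C2 cases;
        # in between, this is exactly the weighted average above.
        t = np.clip((c - c1) / (c2 - c1), 0.0, 1.0)
        return start_color * (1 - t) + end_color * t

    # Example: a 500x500 grayscale gradient from black (0) at the
    # top-left corner to white (255) at the bottom-right corner.
    bitmap = linear_gradient(500, 500, (0, 0), (499, 499), 0.0, 255.0)

For grayscale the colors are plain scalars, as in the example; for RGB you'd apply the same weighted average per channel.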