Creating a linear gradient in a 2D array

北海茫月 2021-02-06 13:05

I have a 2D bitmap-like array of, let's say, 500*500 values. I'm trying to create a linear gradient on the array, so the resulting bitmap would look something like this (in grayscale):

4 Answers
  •  感动是毒
    2021-02-06 13:21

    In your example image, it looks like you have a radial gradient. Here's my impromptu math explanation of the steps you'll need. Sorry for the math; the other answers are better in terms of implementation.

    1. Define a linear function (like y = x + 1) whose domain (i.e. x) runs from the colour you want to start with to the colour you want to end with. You can think of this in terms of a range within 0x0 to 0xFFFFFF (for 24-bit colour). If you want to handle things like brightness, you'll have to do some tricks with the range (i.e. the y value).
    2. Next you need to map a vector across the matrix you have, as this defines the direction in which the colours will change. The colour values defined by your linear function are assigned at each point along the vector, and the start and end points of the vector define the min and max of the domain in step 1. You can think of the vector as one line of your gradient.
    3. For each cell in the matrix, assign the colour from the point on the vector where a perpendicular line from the cell intersects it. See the diagram below, where c is the position of the cell and . is the point of intersection; if you pretend that the colour at . is red, then that's what you'll assign to the cell. A code sketch of all three steps follows the diagram.
                 |
                 c
                 |
                 |
        Vect:____.______________
                 |
                 |
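
    Putting the three steps together, here's a rough sketch in Python (using NumPy; the function name and parameters are my own, not from the question). It projects every cell onto the gradient vector, clamps the projection to [0, 1], and linearly interpolates a greyscale value:

        import numpy as np

        def linear_gradient(width, height, start, end,
                            start_value=0.0, end_value=255.0):
            """Fill a (height, width) array with a linear greyscale gradient.

            start, end             -- (x, y) endpoints of the gradient vector (step 2)
            start_value, end_value -- values mapped to the two endpoints (step 1)
            """
            sx, sy = start
            ex, ey = end
            dx, dy = ex - sx, ey - sy          # direction of the gradient
            length_sq = dx * dx + dy * dy

            # Coordinates of every cell in the array.
            ys, xs = np.mgrid[0:height, 0:width]

            # Step 3: project each cell onto the vector; t = 0 at `start`,
            # t = 1 at `end`. Cells beyond the endpoints are clamped.
            t = ((xs - sx) * dx + (ys - sy) * dy) / length_sq
            t = np.clip(t, 0.0, 1.0)

            # Interpolate the value along the vector.
            return start_value + t * (end_value - start_value)

        # 500*500 bitmap, gradient running diagonally from top-left to bottom-right.
        img = linear_gradient(500, 500, start=(0, 0), end=(499, 499))

    Note that this interpolates a single scalar value. For a 24-bit RGB gradient you would interpolate the three channels separately rather than the packed 0x0..0xFFFFFF value, otherwise the intermediate colours come out wrong.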
    
    
