This is basic graphics geometry and/or trig, and I feel dumb for asking it, but I can't remember how this goes. So:
A useful rule of thumb in this kind of computational geometry is that you should work with vectors as long as you can, switching to Cartesian coordinates only as a last resort. So let's solve this using vector algebra. Suppose your line goes from p to p + r, and the other point is q.
Now, any point on the line, including the point you are trying to find (call it s), can be expressed as s = p + λ r for a scalar parameter λ.
The vector from q to s must be perpendicular to r. Therefore
(q − (p + λ r)) · r = 0
where · is the dot product operator. Expanding the product and rearranging:
(q − p) · r = λ (r · r)
And divide:
λ = ((q − p) · r) / (r · r)
When you come to implement this, check whether r · r = 0 before dividing, to avoid division by zero: it is zero only when r is the zero vector, i.e. when the two points defining the line coincide and the line is degenerate.
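Here's a minimal sketch of that computation in Python (the function name and the tuple-based point representation are just illustrative choices, not anything from the question):

```python
def closest_point_on_line(p, r, q):
    """Return s = p + lambda*r, the point on the line through p with
    direction r that is closest to q. Points/vectors are (x, y) tuples."""
    rr = r[0] * r[0] + r[1] * r[1]              # r . r
    if rr == 0:
        # Degenerate case: r is the zero vector, so the "line" is just p.
        return p
    qp = (q[0] - p[0], q[1] - p[1])             # q - p
    lam = (qp[0] * r[0] + qp[1] * r[1]) / rr    # lambda = ((q - p) . r) / (r . r)
    return (p[0] + lam * r[0], p[1] + lam * r[1])

# Example: line from (0, 0) in direction (10, 0), point q = (3, 4)
# -> nearest point on the line is (3, 0), with lambda = 0.3.
print(closest_point_on_line((0, 0), (10, 0), (3, 4)))
```

If you actually want the nearest point on the segment from p to p + r rather than on the infinite line, clamp λ to the range [0, 1] before computing s.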