I am having problems implementing the function described here here.
This is my Java implementation:
private static double[] pointRadialDistance(doubl
Thanks for your python code I tried setting it up in my use case where I'm trying to find the lat lon of a point in between two others at a set distance from the first point so it's quite similare to your code appart that my bearing is dynamically calculated
startpoint(lat1) lon1/lat1 = 55.625541,-21.142463
end point (lat2) lon2/lat2 = 55.625792,-22.142248
my result should be a point in between these two at lon3/lat3 unfortunetly I get lon3/lat3 = 0.0267695450609,0.0223553243666
I thought this might be a difference in lat lon but no when I add or sub it it's not good
any advice would be really great Thanks
here's my implementation
distance = 0.001 epsilon = 0.000001
y = math.sin(distance) * math.cos(lat2);
x = math.cos(lat1)*math.sin(lat2) - math.sin(lat1)*math.cos(lat2)*math.cos(distance);
bearing = math.atan2(y, x)
rlat1 = (lat1 * 180) / math.pi
rlon1 = (lon1 * 180) / math.pi
rbearing = (bearing * 180) / math.pi
rdistance = distance / R # normalize linear distance to radian angle
rlat = math.asin( math.sin(rlat1) * math.cos(rdistance) + math.cos(rlat1) * math.sin(rdistance) * math.cos(rbearing) )
if math.cos(rlat) == 0 or abs(math.cos(rlat)) < epsilon: # Endpoint a pole
rlon=rlon1
else:
rlon = ( (rlon1 + math.asin( math.sin(rbearing)* math.sin(rdistance) / math.cos(rlat) ) + math.pi ) % (2*math.pi) ) - math.pi
lat3 = (rlat * math.pi)/ 180
lon3 = (rlon * math.pi)/ 180