geohashing

Finding geohashes of certain length within radius from a point

倖福魔咒の submitted on 2019-12-03 06:08:15
Question: I have points with a given lat/long and a distance around them, e.g. { 40.6826048,-74.0288632 : 20 miles, 51.5007825,-0.1258957 : 100 miles }. If I pick a fixed geohash length (say one whose cells are roughly 1x1 mile), how can I find all the geohash entries of that length that are within the given radius of each point? To add some background - the reason I want to do that is so I can save a cache keyed by the geohash id, with a value of the list of points for which the given geohash is within radius (and also…
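
A minimal sketch of one way to do this, assuming the python-geohash package (import geohash) and a hand-rolled haversine; the function names and the half-cell step size are my own choices, not from the question. It scans a lat/lon grid covering the circle at the chosen precision and keeps every distinct geohash whose cell centre falls inside the radius.

import math
import geohash  # pip install python-geohash

def haversine_miles(lat1, lon1, lat2, lon2):
    r = 3958.8  # mean Earth radius in miles
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def cell_size_degrees(precision):
    # 5 bits per geohash character, interleaved longitude-first
    lon_bits = (5 * precision + 1) // 2
    lat_bits = (5 * precision) // 2
    return 180.0 / (2 ** lat_bits), 360.0 / (2 ** lon_bits)

def geohashes_within_radius(lat, lon, radius_miles, precision=6):
    cell_lat, cell_lon = cell_size_degrees(precision)
    dlat = radius_miles / 69.0  # ~69 miles per degree of latitude
    dlon = radius_miles / (69.0 * max(math.cos(math.radians(lat)), 0.01))
    found = set()
    la = lat - dlat
    while la <= lat + dlat:
        lo = lon - dlon
        while lo <= lon + dlon:
            h = geohash.encode(la, lo, precision)
            c_lat, c_lon = geohash.decode(h)
            # centre test: cells that merely touch the circle's edge may be skipped
            if haversine_miles(lat, lon, c_lat, c_lon) <= radius_miles:
                found.add(h)
            lo += cell_lon / 2.0  # half-cell steps so no cell is missed
        la += cell_lat / 2.0
    return found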

Python calculate lots of distances quickly

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-03 04:06:51
I have an input of 36,742 points, which means that if I wanted to calculate the lower triangle of a distance matrix (using the Vincenty approximation) I would need to generate 36,742 × 36,741 × 0.5 ≈ 674,968,911 distances. I want to keep the pair combinations which are within 50 km of each other. My current set-up is as follows:

shops = [[id, lat, lon], ...]

def lower_triangle_mat(points):
    for i in range(len(shops) - 1):
        for j in range(i + 1, len(shops)):
            yield [shops[i], shops[j]]

def return_stores_cutoff(points, cutoff_km=0):
    below_cut = []
    counter = 0
    for x in lower_triangle_mat(points):
        dist_km = vincenty(x[0]…
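
A minimal sketch of one common way to avoid generating the full lower triangle, assuming NumPy and scikit-learn are available (the function and variable names below are illustrative, not the asker's): a BallTree with the haversine metric returns, for each shop, only the neighbours within the 50 km cutoff.

import numpy as np
from sklearn.neighbors import BallTree

EARTH_RADIUS_KM = 6371.0

def pairs_within_cutoff(shops, cutoff_km=50.0):
    # shops is a list of [id, lat, lon]; returns (id_a, id_b) pairs within cutoff_km
    coords = np.radians([[lat, lon] for _, lat, lon in shops])
    tree = BallTree(coords, metric="haversine")
    # the haversine metric works in radians, so convert the cutoff: arc = km / R
    neighbours = tree.query_radius(coords, r=cutoff_km / EARTH_RADIUS_KM)
    pairs = []
    for i, idx in enumerate(neighbours):
        for j in idx:
            if j > i:  # keep each unordered pair exactly once
                pairs.append((shops[i][0], shops[j][0]))
    return pairs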

Spatial data with mongodb or cassandra

好久不见. submitted on 2019-12-02 16:17:54
I am considering a proof of concept for handling large volumes of data (> 10 GB) which requires at least 200+ writes per second and about 50+ reads per second of spatial-related data. This is a growing system as well. Currently I am considering moving this big volume of data into a NoSQL big-table kind of DB for performance reasons. I have considered and taken a closer look at MongoDB and Cassandra. As far as my reading goes, MongoDB:
- seems to have a writer lock problem
- one of the posts on Stack Overflow suggested this DB if there is no need for multiple servers
- indexes are kept in memory.
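
As a hedged illustration of the MongoDB side only (the collection name, URI and coordinates are made-up placeholders, not from the question): a 2dsphere index plus a $nearSphere query covers the basic "points within a radius" read pattern described above.

from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb://localhost:27017")
places = client.poc_db.places

# GeoJSON points indexed with a 2dsphere index; coordinates are [lon, lat]
places.create_index([("loc", GEOSPHERE)])
places.insert_one({"name": "demo", "loc": {"type": "Point", "coordinates": [-74.0288632, 40.6826048]}})

# all documents within ~20 miles (32187 m) of the point, nearest first
cursor = places.find({
    "loc": {
        "$nearSphere": {
            "$geometry": {"type": "Point", "coordinates": [-74.0288632, 40.6826048]},
            "$maxDistance": 32187,
        }
    }
})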

Ordering Firestore GeoHash query from closest to furthest?

|▌冷眼眸甩不掉的悲伤 submitted on 2019-11-29 17:00:07
Currently, I'm using parts of the GeoFirebase library along with Firestore to allow for geoquerying. When I set the geohash of a post, I do it as such:

if let geoHash = GFGeoHash(location: location.coordinate).geoHashValue {

However, to make the geohash querying less specific, I'm planning on truncating part of the geohash when I query; currently, the query looks similar to this:

var geoQuerySpecific = GFGeoHashQuery()
let geoQueryHash = GFGeoHashQuery.queries(forLocation: (lastLocation?.coordinate)!, radius: (30)) as! Set<GFGeoHashQuery>
for query in geoQueryHash {
    geoQuerySpecific = query…
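
A minimal sketch, in Python rather than the question's Swift, of the usual second step: a geohash prefix query only returns an unordered bucket of candidates, so the fetched documents are re-sorted client-side by real distance to the query point. The candidates list and its 'lat'/'lon' keys are hypothetical stand-ins for whatever the Firestore query returns.

import math

def haversine_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # mean Earth radius in km
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def order_closest_to_furthest(candidates, centre_lat, centre_lon):
    # candidates: iterable of dicts with 'lat' and 'lon' keys from the geohash query
    return sorted(
        candidates,
        key=lambda doc: haversine_km(centre_lat, centre_lon, doc["lat"], doc["lon"]),
    )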

In solr dih import two double in one location

柔情痞子 submitted on 2019-11-29 07:12:48
What I have now is the two double fields:

<field name="x_geo_x_coordinate" type="double" indexed="true" stored="true" default="0"/>
<field name="x_geo_y_coordinate" type="double" indexed="true" stored="true" default="0"/>

and what I want is the two double values in one location field:

<field name="x_geo" type="location" indexed="true" stored="true" default="0.0,0.0"/>

What I tried so far, and what doesn't work:

<copyField source="*_coordinate" dest="x_geo"/>
<copyField source="x_geo_str" dest="x_geo"/>

Any simple solution? Thanks in advance! Well, you were right @nikhil500. ScriptTransformer is one answer…
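
A hedged sketch of the ScriptTransformer idea mentioned above, in DIH's own data-config.xml format (the entity name and SQL query are placeholders, not taken from the original config): a small JavaScript function concatenates the two doubles into the "lat,lon" string the location field type expects.

<dataConfig>
  <script><![CDATA[
    function makeLocation(row) {
      // build "lat,lon" from the two double columns and store it in x_geo
      row.put('x_geo', row.get('x_geo_x_coordinate') + ',' + row.get('x_geo_y_coordinate'));
      return row;
    }
  ]]></script>
  <document>
    <entity name="item" transformer="script:makeLocation" query="SELECT ...">
      <!-- existing field mappings stay unchanged -->
    </entity>
  </document>
</dataConfig>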

Google App Engine Geohashing

ぃ、小莉子 submitted on 2019-11-28 19:42:21
I am writing a web application using GWT and App Engine. My application will need to post and query items based on their latitude and longitude. As a result of Google's distributed database design you can't simply query on a set of inequalities. Instead they suggest doing geohashing. The method is described on this page: http://code.google.com/appengine/articles/geosearch.html Essentially you precompute a bounding box so that you can query items that have been tagged with that bounding box. There is one part of the process that I don't understand. What does the "slice" attribute mean? Thanks for…
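
A minimal sketch of the tag-then-equality-filter pattern the question describes, not the geobox/slice code from the linked article (the precision constant and the python-geohash package are assumptions): each item is tagged at write time with a coarse geohash cell plus its neighbours, so a lookup becomes a single equality filter instead of two inequality filters on lat/lng.

import geohash  # pip install python-geohash

COARSE_PRECISION = 4  # roughly a few tens of kilometres per cell; tune to your box size

def tags_for_item(lat, lon):
    # store the item's cell plus the 8 surrounding cells, so queries issued
    # near a cell edge still find it with one equality filter
    centre = geohash.encode(lat, lon, COARSE_PRECISION)
    return [centre] + geohash.neighbors(centre)

def query_tag(lat, lon):
    # the single value to filter on, e.g. Item.query(Item.geo_tags == query_tag(lat, lon))
    return geohash.encode(lat, lon, COARSE_PRECISION)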

geohash string length and accuracy

天大地大妈咪最大 submitted on 2019-11-28 17:07:06
The longer the geohash string, the more accurate it is. But is there any direct relationship, e.g. does a length of 7 provide roughly 100-metre accuracy, i.e. if two geohashes (and either of their bounding boxes) match on their first 7 characters, should both points be within about 100 metres of each other? I am using geohashes for finding all nearby locations for a given geohash, along with their distances. Also, is there any direct way to calculate the distance between two geohashes? (One way is to decode them to lat/lng and then calculate the distance.) Thanks. Saw a lot of confusion around geohashing so I am posting my understanding so far. The principle…
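
A minimal sketch of both parts of the question (the helper names are mine, and python-geohash is an assumed dependency): the cell size implied by a geohash length follows directly from its bit count, and hash-to-hash distance is normally done exactly as suggested above, decode then haversine; a shared prefix only bounds the distance, it does not give an exact figure.

import math
import geohash  # pip install python-geohash

def cell_size_km(length, at_latitude=0.0):
    # a geohash of n characters carries 5n bits, interleaved longitude-first
    lon_bits = (5 * length + 1) // 2
    lat_bits = (5 * length) // 2
    lat_deg = 180.0 / (2 ** lat_bits)
    lon_deg = 360.0 / (2 ** lon_bits)
    km_per_deg = 111.32
    return lat_deg * km_per_deg, lon_deg * km_per_deg * math.cos(math.radians(at_latitude))

def distance_km(hash1, hash2):
    (lat1, lon1), (lat2, lon2) = geohash.decode(hash1), geohash.decode(hash2)
    r = 6371.0
    a = (math.sin(math.radians(lat2 - lat1) / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# cell_size_km(7) is roughly (0.15, 0.15): a matching 7-character prefix puts two
# points within a couple of hundred metres at the equator, not exactly 100 m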
