I have recently created a Google Map using V3 of the API (latest version). One of my requirements is that I am able to render over 1 million markers (in a reasonable time). A reasonable time would be under 15 seconds.
I know that it is fairly crazy to render all 1 million markers at once, which is why I have investigated performance options. One of the options I came across and used is the MarkerClusterer: https://developers.google.com/maps/articles/toomanymarkers
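For context, MarkerClusterer works by bucketing nearby markers into grid cells and drawing a single cluster icon per occupied cell, so the map only renders one marker per cluster instead of one per point. Here is a minimal sketch of that grid idea in plain JavaScript. This is not the library's actual code: the fixed cell size and clustering on raw lat/lng degrees (rather than projected screen pixels, which is what the real library does per zoom level) are simplifications.

```javascript
// Minimal grid-clustering sketch: bucket points into fixed-size cells so
// only one centroid marker per cell needs to be rendered.
// Simplification: clusters on raw lat/lng degrees, not projected pixels.
function clusterByGrid(points, cellSizeDeg) {
  const cells = new Map();
  for (const p of points) {
    const key =
      Math.floor(p.lat / cellSizeDeg) + ':' + Math.floor(p.lng / cellSizeDeg);
    if (!cells.has(key)) cells.set(key, { count: 0, latSum: 0, lngSum: 0 });
    const c = cells.get(key);
    c.count += 1;
    c.latSum += p.lat;
    c.lngSum += p.lng;
  }
  // One centroid per occupied cell -- these are the markers actually drawn.
  return [...cells.values()].map(c => ({
    lat: c.latSum / c.count,
    lng: c.lngSum / c.count,
    count: c.count,
  }));
}
```

Even with this reduction, the library still has to iterate over every input marker on load, which is why the cost grows with the raw point count rather than the cluster count.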
However, I am now seeing performance issues when testing the MarkerClusterer with over 100,000 markers: it takes a long time (1 min+) to render the map and markers. Eventually I managed to crash the page with around 200,000 markers.
Is there any way to improve the performance of the map when using this many markers?
Thanks in advance for any help.
I have used Google Fusion Tables successfully, and it is very fast and quite simple, once you work out how to use OAuth2.
The tables are limited to 100,000 rows each, and you upload them from a CSV file, either via Google Drive in a browser or programmatically with curl or Perl.
To get beyond the 100,000-row limit, you can add up to five Fusion Tables layers to your map, but that still only gets you to 500,000 points. I can't suggest anything beyond that.
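The five-layer approach above amounts to constructing one `google.maps.FusionTablesLayer` per table ID and attaching each to the same map. A minimal sketch, with the layer constructor injected as a parameter so the same function works in tests (in a browser you would pass `google.maps.FusionTablesLayer`); the table IDs and the `location` column name are placeholder assumptions:

```javascript
// Attach one FusionTablesLayer per table ID to the same map, working
// around the 100,000-row-per-table limit (5 tables -> up to 500,000 points).
// FusionTablesLayerCtor is injected: google.maps.FusionTablesLayer in a browser.
function addFusionLayers(map, tableIds, FusionTablesLayerCtor) {
  return tableIds.map(id => {
    const layer = new FusionTablesLayerCtor({
      query: {
        select: 'location', // assumed name of the location column
        from: id,           // placeholder table ID
      },
    });
    layer.setMap(map);
    return layer;
  });
}
```

Rendering stays fast because each layer is drawn server-side as tiles; the browser never handles the individual points.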
My project is here if you want a look.
Source: https://stackoverflow.com/questions/22450927/google-maps-v3-rendering-over-1-million-markers-in-a-reasonable-time