Neo4j & Spring Data Neo4j 4.0.0 : Importing large datasets

Submitted by 非 Y 不嫁゛ on 2021-02-10 11:53:33

Question


I want to insert real-time logging data into Neo4j 2.2.1 through Spring Data Neo4j 4.0.0. The logging data is very large and may reach hundreds of thousands of records. What is the best way to implement this kind of functionality? Is it safe to just call the .save(Iterable) method after all of the node entity objects have been created? Is there a batch-insertion mechanism in Spring Data Neo4j 4.0.0? Thanks in advance!


Answer 1:


As SDN4 works directly with existing databases, you can use neo4j-import for the initial import.
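For reference, an invocation of the Neo4j 2.2 import tool might look like the sketch below. The file names and target path are placeholders; the tool writes into an empty store directory while the database is stopped, and the CSV files must carry the header format the tool expects.

```shell
# Hypothetical CSV files; the target store directory must not yet exist.
neo4j-import --into /var/lib/neo4j/data/graph.db \
             --nodes logs.csv \
             --relationships rels.csv
```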

Since Neo4j 2.2, the database can also sustain highly concurrent write loads of parameterized Cypher, so you should be able to multi-thread adding data to Neo4j using SDN4: create batches of, say, 1,000 to 10,000 objects each and send them off.
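The batching itself can be sketched with plain Java, independent of SDN4. The snippet below only shows how to split a large record list into fixed-size chunks; the `repository.save(batch)` call mentioned in the comment is the hypothetical SDN4 repository method you would invoke per chunk, ideally from a small thread pool.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchImport {

    // Split a large list into fixed-size batches so that each
    // save() call stays within a manageable transaction size.
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(new ArrayList<>(
                    items.subList(i, Math.min(i + batchSize, items.size()))));
        }
        return batches;
    }

    public static void main(String[] args) {
        // 2,500 dummy records stand in for the real log entities.
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 2500; i++) {
            records.add(i);
        }
        List<List<Integer>> batches = partition(records, 1000);
        System.out.println(batches.size());        // 3 batches
        System.out.println(batches.get(2).size()); // last batch holds 500
        // Per batch you would then call repository.save(batch)
        // (hypothetical SDN4 repository), e.g. from an ExecutorService.
    }
}
```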

Otherwise, you can send parameterized Cypher statements directly and concurrently to Neo4j.
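A parameterized statement for this use case could look like the following sketch, which sends one batch of log records as a single parameter (here named `events`, with assumed `timestamp` and `message` fields) and unwinds it server-side. Neo4j 2.2 uses the `{param}` placeholder syntax shown here.

```
UNWIND {events} AS e
CREATE (l:LogEntry { timestamp: e.timestamp, message: e.message })
```

Sending one `UNWIND`-based statement per batch keeps the number of round trips and transactions small, which matters at the volumes described in the question.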



Source: https://stackoverflow.com/questions/30494097/neo4j-spring-data-neo4j-4-0-0-importing-large-datasets
