Python + MongoDB - Cursor iteration too slow

Submitted by 萝らか妹 on 2019-12-31 14:34:13

Question


I'm currently working on a search engine project.
We are working with Python + MongoDB.
I'm having the following problem:

I have a PyMongo cursor after executing a find() command against MongoDB.
The cursor holds around 20k results.

I have noticed that iterating over the PyMongo cursor is really slow compared with a normal iteration over, for example, a list of the same size.

I did a little benchmark:

  • iteration over a list of 20k strings: 0.001492 seconds
  • iteration over a pymongo cursor with 20k results: 1.445343 seconds
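The benchmark above can be reproduced with a small timing helper. This is a sketch: the collection name is hypothetical, and the cursor line assumes a running mongod, so it is left commented out.

```python
import time

def time_iteration(iterable):
    """Time one full pass over an iterable, discarding the items."""
    start = time.time()
    for _ in iterable:
        pass
    return time.time() - start

# Pure in-memory baseline: a list of 20k strings.
list_seconds = time_iteration(["some document"] * 20000)

# The cursor case would look like this (assumes a running mongod
# with a collection holding ~20k documents):
# cursor_seconds = time_iteration(db.my_collection.find())
```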

The difference is huge. Maybe it isn't a problem with this number of results, but with millions of results the time would be unacceptable.

Has anyone got an idea of why pymongo cursors are so slow to iterate?
Any idea how I can iterate the cursor in less time?

Some extra info:

  • Python v2.6
  • PyMongo v1.9
  • MongoDB v1.6 32 bits

Answer 1:


Remember that the PyMongo driver is not giving you back all 20k results at once. It makes network calls to the MongoDB backend for more items as you iterate, so of course it won't be as fast as a list of strings. However, I'd suggest trying to adjust the cursor batch_size as outlined in the API docs.
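A minimal sketch of what adjusting the batch size looks like (the collection name and a running mongod are assumptions). The small helper just illustrates why larger batches help: fewer round trips to the server for the same number of documents.

```python
from math import ceil

def round_trips(total_docs, batch_size):
    """Roughly how many network round trips a cursor needs."""
    return ceil(total_docs / float(batch_size))

# 20k results fetched 100 at a time means 200 trips to the server;
# fetching 5000 at a time brings that down to 4.

# With pymongo (collection name is hypothetical):
# cursor = db.my_collection.find().batch_size(5000)
# for doc in cursor:
#     handle(doc)
```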




Answer 2:


Is your pymongo installation using the included C extensions?

>>> import pymongo
>>> pymongo.has_c()
True

I spent most of last week trying to debug a moderate-sized query and its corresponding processing, which took 20 seconds to run. Once the C extensions were installed, the same process took roughly a second.

To install the C extensions on Debian, install the Python development headers before running easy_install. In my case I also had to remove the old version of pymongo. Note that this compiles a binary from C, so you need the usual build tools (GCC, etc.).

# on ubuntu with pip
$ sudo pip uninstall pymongo
$ sudo apt-get install python-dev build-essential
$ sudo pip install pymongo



Answer 3:


The default cursor batch size is 4 MB, and the maximum it can grow to is 16 MB. You can try increasing your batch size until that limit is reached and see whether you get an improvement, but it also depends on what your network can handle.




Answer 4:


Sorry, but this is a very wild claim without much evidence. You don't provide any information about the overall document sizes. Fetching that many documents requires both network traffic and I/O on the database server. Is the performance consistently "bad" even in a "hot" state with warm caches? You can use "mongosniff" to inspect the "wire" activity and system tools like "iostat" to monitor disk activity on the server. In addition, "mongostat" gives a bunch of valuable information.



Source: https://stackoverflow.com/questions/5480340/python-mongodb-cursor-iteration-too-slow
