Python, Mechanize - request disallowed by robots.txt even after set_handle_robots and add_headers

andrean

Ok, so the same problem appeared in this question:

Why is mechanize throwing a HTTP 403 error?

Sending all of the request headers a normal browser would send, and accepting and sending back the cookies the server sets, should resolve the issue.
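As a rough sketch, a mechanize setup along those lines might look like the following. The URL and the header values are placeholders, not taken from the original question:

```python
import mechanize

br = mechanize.Browser()

# Keep the cookies the server sets and send them back on later requests.
cj = mechanize.CookieJar()
br.set_cookiejar(cj)

# Ignore robots.txt (check the site's terms of use before doing this).
br.set_handle_robots(False)
br.set_handle_refresh(False)
br.set_handle_redirect(True)
br.set_handle_equiv(True)

# Send headers a normal browser would send.
br.addheaders = [
    ("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) "
                   "Chrome/91.0 Safari/537.36"),
    ("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"),
    ("Accept-Language", "en-US,en;q=0.5"),
    ("Connection", "keep-alive"),
]

response = br.open("https://example.com/")  # placeholder URL
print(response.code)
```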
