Dynamically assembling scrapy GET request string

Submitted by 旧时模样 on 2021-02-07 04:14:00

Question


I've been working with Firebug and I've got the following dictionaries to query an API.

url = "htp://my_url.aspx#top"

querystring = {"dbkey":"x1","stype":"id","s":"27"}

headers = {
    'accept': "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
    'upgrade-insecure-requests': "1",
    'user-agent': "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125"
    }

With Python requests, using these is as simple as:

import requests
response = requests.request("GET", url, headers=headers, params=querystring)
print(response.text)

How can I use these in Scrapy? I've been reading http://doc.scrapy.org/en/latest/topics/request-response.html and I know that the following works for POST:

        r = Request(my_url, method="post",  headers= headers, body=payload, callback=self.parse_method)

I've tried:

    r = Request("GET", url, headers=headers, body=querystring, callback=self.parse_third_request)

I'm getting:

r = Request("GET", url, headers=headers, body=querystring, callback=self.parse_third_request)
TypeError: __init__() got multiple values for keyword argument 'callback'

edit:

changed to:

    r = Request(method="GET", url=url, headers=headers, body=querystring, callback=self.parse_third_request)

now getting:

  File "C:\envs\r2\tutorial\tutorial\spiders\parker_spider.py", line 90, in parse_second_request
    r = Request(method="GET", url=url, headers=headers, body=querystring, callback=self.parse_third_request)
  File "C:\envs\virtalenvs\teat\lib\site-packages\scrapy\http\request\__init__.py", line 26, in __init__
    self._set_body(body)
  File "C:\envs\virtalenvs\teat\lib\site-packages\scrapy\http\request\__init__.py", line 68, in _set_body
    self._body = to_bytes(body, self.encoding)
  File "C:\envs\virtalenvs\teat\lib\site-packages\scrapy\utils\python.py", line 117, in to_bytes
    'object, got %s' % type(text).__name__)
TypeError: to_bytes must receive a unicode, str or bytes object, got dict

edit 2:

I now have:

    yield Request(method="GET", url=url, headers=headers, body=urllib.urlencode(querystring), callback=self.parse_third_request)

def parse_third_request(self, response):
    from scrapy.shell import inspect_response
    inspect_response(response, self)
    print("hi")
    return None

There are no errors, but in the shell when I do "response.url" I only get the base URL with no GET parameters.


Answer 1:


Look at the signature of the Request initialization method:

class scrapy.http.Request(url[, callback, method='GET', headers, body, cookies, meta, encoding='utf-8', priority=0, dont_filter=False, errback])

The "GET" string in your case is consumed as the positional url argument, which shifts your actual URL into the callback position — and that collides with the callback keyword argument you also pass, hence the "multiple values for keyword argument 'callback'" error.

Use a keyword argument for the method instead (though "GET" is already the default):

r = Request(url, method="GET", headers=headers, body=querystring, callback=self.parse_third_request)
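Note also that a GET request carries its parameters in the URL, not in the body — which is why in your "edit 2" the response.url showed only the base URL. A minimal sketch of assembling the full URL first, using Python 3's urllib.parse.urlencode (the base URL and parameters below are the hypothetical ones from the question, with the #top fragment omitted for simplicity):

```python
from urllib.parse import urlencode

# Hypothetical base URL and query parameters from the question
url = "http://my_url.aspx"
querystring = {"dbkey": "x1", "stype": "id", "s": "27"}

# Encode the parameters into the URL itself, since GET requests
# transmit them in the request line rather than the body
full_url = url + "?" + urlencode(querystring)
print(full_url)  # http://my_url.aspx?dbkey=x1&stype=id&s=27
```

You would then yield `Request(url=full_url, headers=headers, callback=self.parse_third_request)` with no body argument, and response.url should show the query string.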


Source: https://stackoverflow.com/questions/37632965/dynamically-assembling-scrapy-get-request-string
