I am running Scrapy's log in my spider like this:

    from scrapy import log

    class MySpider(BaseSpider):
        name = "myspider"

        def __init__(self, name=None, **kwargs):
            ...
Just let logging do the job. Try to use PythonLoggingObserver instead of DefaultObserver: configure two loggers (one for INFO and one for ERROR messages) directly in Python, or via fileConfig, or via dictConfig (see the logging docs), then start the observer in the spider's __init__:
    def __init__(self, name=None, **kwargs):
        # TODO: configure logging: e.g. logging.config.fileConfig("logging.conf")
        observer = log.PythonLoggingObserver()
        observer.start()
        super(MySpider, self).__init__(name, **kwargs)
Let me know if you need help with configuring loggers.
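For reference, the "two loggers" setup mentioned above can be done declaratively with the standard library's logging.config.dictConfig, which is what PythonLoggingObserver forwards Scrapy/Twisted messages into. A minimal sketch, assuming you want one file for INFO-and-above and one for ERROR-only (file names are just placeholders):

```python
import logging
import logging.config

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "plain": {"format": "%(levelname)s %(message)s"},
    },
    "handlers": {
        # receives everything at INFO level and above
        "info_file": {
            "class": "logging.FileHandler",
            "filename": "spider.log",
            "mode": "w",
            "level": "INFO",
            "formatter": "plain",
        },
        # receives only ERROR level and above
        "error_file": {
            "class": "logging.FileHandler",
            "filename": "spider_error.log",
            "mode": "w",
            "level": "ERROR",
            "formatter": "plain",
        },
    },
    "root": {"level": "INFO", "handlers": ["info_file", "error_file"]},
}

logging.config.dictConfig(LOGGING)
logging.info("reached page 1")    # goes to spider.log only
logging.error("request failed")   # goes to both files
```

The same structure can live in a `logging.conf` file and be loaded with `logging.config.fileConfig("logging.conf")` instead.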
EDIT:
Another option is to start two file log observers in the spider's __init__:
    import logging

    from scrapy.log import ScrapyFileLogObserver
    from scrapy import log

    class MySpider(BaseSpider):
        name = "myspider"

        def __init__(self, name=None, **kwargs):
            # everything at INFO and above goes to spider.log,
            # only ERROR and above goes to spider_error.log
            ScrapyFileLogObserver(open("spider.log", 'w'), level=logging.INFO).start()
            ScrapyFileLogObserver(open("spider_error.log", 'w'), level=logging.ERROR).start()

            super(MySpider, self).__init__(name, **kwargs)
            ...
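The reason spider_error.log stays clean is plain handler-level filtering: a handler whose level is ERROR silently drops INFO records. A stdlib-only illustration of the same idea, with hypothetical file and logger names (not Scrapy APIs):

```python
import logging

logger = logging.getLogger("myspider")  # hypothetical logger name
logger.setLevel(logging.INFO)

# handler that accepts INFO and above
all_handler = logging.FileHandler("all.log", mode="w")
all_handler.setLevel(logging.INFO)

# handler that accepts only ERROR and above
err_handler = logging.FileHandler("errors.log", mode="w")
err_handler.setLevel(logging.ERROR)

logger.addHandler(all_handler)
logger.addHandler(err_handler)

logger.info("crawled 10 pages")    # written to all.log only
logger.error("HTTP 500 on /item")  # written to both files
```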