I want to execute my Scrapy crawler from a cron job.
I created a bash file getdata.sh in the directory where the Scrapy project and its spiders are located:
#!/bin/bash
cd /
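A complete version of such a wrapper script typically changes into the project directory, makes sure scrapy is on the PATH, and then runs the spider. The sketch below is only an illustration; the project path and spider name are placeholders reused from the crontab examples later in this post, not taken from the original script:

#!/bin/bash
# Hypothetical getdata.sh sketch: adjust the path and spider name to your setup.
cd /myfolder/crawlers/ || exit 1
export PATH=$PATH:/usr/local/bin
scrapy crawl my_spider_name_1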
Another option is to skip the shell script and chain the two commands together directly in the cron job. Just make sure the PATH variable is set before the first Scrapy entry in the crontab. Run:
crontab -e
to edit the crontab and have a look. I have several Scrapy crawlers that run at various times: some every 5 minutes, others twice a day.
PATH=/usr/local/bin
*/5 * * * * cd /myfolder/crawlers/ && scrapy crawl my_spider_name_1
0 1,13 * * * cd /myfolder/crawlers/ && scrapy crawl my_spider_name_2
All jobs listed after the PATH variable will find scrapy. Here the first one runs every 5 minutes and the second twice a day, at 1am and 1pm (note there is no username field, because a crontab edited with crontab -e already runs as that user). I find this easier to manage. If you run other binaries, you may need to add their locations to the PATH as well.
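If you also want to keep each run's output for debugging, you can redirect it to a log file in the same crontab entry. This is just a sketch: the log file location is an assumption, not part of the setup above:

# Append stdout and stderr of every run to a log file (log path is an assumed example)
*/5 * * * * cd /myfolder/crawlers/ && scrapy crawl my_spider_name_1 >> /myfolder/crawlers/cron.log 2>&1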