I have made a Scrapy spider that can be successfully run from a script located in the root directory of the project. As I need to run multiple spiders from different projects...
It should work. Can you share your Scrapy log file?
Edit: your approach will not work, because when you execute the script it will look for your default settings in the directory you are executing it from.
Solution 1: create a scrapy.cfg file inside that directory (outside the project folder) and point it to the valid settings.py file.
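A minimal sketch of what that cfg file could contain (the package name project1 is an assumption; use your actual settings module):

    # scrapy.cfg -- placed in the directory you run the script from
    [settings]
    default = project1.settings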
Solution 2: make your parent directory a package, so that an absolute path is not required and you can use a relative module path, e.g.

    python -m cron.project1
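The layout this assumes could look roughly like this (the names cron, project1.py, and the project folders are hypothetical, not taken from your setup):

    # hypothetical layout -- run `python -m cron.project1` from parent_dir
    parent_dir/
        cron/
            __init__.py     # makes cron a package
            project1.py     # runner script for the first project's spider
            project2.py     # runner script for the second project's spider
        project1/           # first Scrapy project
        project2/           # second Scrapy project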
Solution 3
Alternatively, you can try something like this:
Leave the script where it is, inside the project directory, where it is already working.
Create a .sh file that runs the spider from inside that directory.
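A minimal sketch of such a wrapper (the path, file name, and spider name are assumptions):

    #!/bin/sh
    # run_project1.sh -- hypothetical wrapper; cd into the project so Scrapy finds its scrapy.cfg and settings
    cd /path/to/project1 || exit 1
    python run_spiders.py   # or: scrapy crawl example_spider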
Now you can execute the spiders via this .sh file whenever Django requests it.
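One way Django could trigger it is with subprocess (the view name and script path below are assumptions, not part of the original answer):

    # views.py -- hypothetical Django view that starts the wrapper without blocking the request
    import subprocess

    from django.http import HttpResponse

    def run_spider(request):
        subprocess.Popen(["/path/to/run_project1.sh"])  # fire and forget
        return HttpResponse("Spider started")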