This is Windows 7 with Python 2.7.
I have a Scrapy project in a directory called caps (this is where scrapy.cfg is).
My spider is located in caps\caps\spiders\c
You have to give your spider a name.
Note that BaseSpider is deprecated; use Spider instead.
from scrapy.spiders import Spider

class campSpider(Spider):
    name = 'campSpider'
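For reference, a minimal runnable spider file (saved under the spiders/ directory) could look like the sketch below; the start URL and the parse logic are placeholders I'm assuming here, not taken from your project:

from scrapy.spiders import Spider


class campSpider(Spider):
    # Scrapy looks the spider up by this name when you run "scrapy crawl campSpider"
    name = 'campSpider'
    # placeholder start URL; replace it with the site you actually want to crawl
    start_urls = ['http://example.com']

    def parse(self, response):
        # placeholder callback: yield the page title as a simple item
        yield {'title': response.css('title::text').extract_first()}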
The project should have been created by the startproject command:
scrapy startproject project_name
Which gives you the following directory tree:
project_name/
    scrapy.cfg            # deploy configuration file
    project_name/         # project's Python module, you'll import your code from here
        __init__.py
        items.py          # project items file
        pipelines.py      # project pipelines file
        settings.py       # project settings file
        spiders/          # a directory where you'll later put your spiders
            __init__.py
            ...
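Once the layout matches, run Scrapy from the directory that contains scrapy.cfg (the outer caps folder in your case). The scrapy list command is a quick way to confirm the spider is discoverable before crawling:

scrapy list              # should print campSpider
scrapy crawl campSpider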
Make sure that settings.py defines your spider module, e.g.:
BOT_NAME = 'bot_name' # usually the same as your project_name
SPIDER_MODULES = ['project_name.spiders']
NEWSPIDER_MODULE = 'project_name.spiders'
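In your case, assuming the inner caps directory is the project's Python module, that would be:

BOT_NAME = 'caps'

SPIDER_MODULES = ['caps.spiders']
NEWSPIDER_MODULE = 'caps.spiders'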
You should then have no problem running your spider locally or on Scrapinghub.