I can't get my head around it. I want to insert the values of a dictionary into an SQLite database.
url = "https://api.flickr.com/services/rest/?method=flickr.ph
You can use named parameters and insert all rows at once with executemany().
As a bonus, you get a clean separation between the HTML-parsing and the data-pipelining logic:
data = [{"id_p": photo.get('id'),
         "title_p": photo.get('title'),
         "tags_p": photo.get('tags'),
         "latitude_p": photo.get('latitude'),
         "longitude_p": photo.get('longitude')} for photo in soup.find_all('photo')]
connector.executemany("""
    INSERT INTO
        DATAGERMANY
        (id_photo, title, tags, latitude, longitude)
    VALUES
        (:id_p, :title_p, :tags_p, :latitude_p, :longitude_p)""", data)
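Since executemany() accepts any iterable of parameter sets, you can also pass a generator expression instead of building the full list in memory first. A minimal sketch on a throwaway table (the sample dicts stand in for the parsed `<photo>` elements):

```python
import sqlite3

connector = sqlite3.connect(":memory:")
connector.execute("CREATE TABLE photos (id_photo INTEGER, title TEXT)")

# Fake parsed photos standing in for soup.find_all('photo')
photos = [{"id": "1", "title": "a"}, {"id": "2", "title": "b"}]

# Generator expression: rows are produced lazily as executemany() consumes them
connector.executemany(
    "INSERT INTO photos (id_photo, title) VALUES (:id_p, :title_p)",
    ({"id_p": p["id"], "title_p": p["title"]} for p in photos))
connector.commit()

count = connector.execute("SELECT COUNT(*) FROM photos").fetchone()[0]
print(count)  # 2
```

Whether this matters depends on the result size; for 250 photos per page the list is fine.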
Also, don't forget to actually call the close() method:
connector.close()
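Alternatively, the connection object can be used as a context manager: it commits the transaction on success and rolls back on an exception. Note that it does NOT close the connection for you, so close() is still needed. A minimal sketch on a throwaway table:

```python
import sqlite3

connector = sqlite3.connect(":memory:")
connector.execute("CREATE TABLE t (x INTEGER)")

# Commits on success, rolls back on an exception -- but does not close
with connector:
    connector.executemany("INSERT INTO t (x) VALUES (?)", [(1,), (2,)])

rows = connector.execute("SELECT x FROM t ORDER BY x").fetchall()
print(rows)  # [(1,), (2,)]
connector.close()
```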
FYI, the complete code:
import sqlite3
from urllib2 import urlopen
from bs4 import BeautifulSoup
url = "https://api.flickr.com/services/rest/?method=flickr.photos.search&api_key=5f...1b&per_page=250&accuracy=1&has_geo=1&extras=geo,tags,views,description"
soup = BeautifulSoup(urlopen(url), 'xml')  # the Flickr REST API returns XML; requires lxml
connector = sqlite3.connect(":memory:")
cursor = connector.cursor()
cursor.execute('''CREATE TABLE DATAGERMANY
                  (id_db INTEGER PRIMARY KEY AUTOINCREMENT,
                   id_photo INTEGER NOT NULL,
                   title TEXT,
                   tags TEXT,
                   latitude NUMERIC NOT NULL,
                   longitude NUMERIC NOT NULL);''')
data = [{"id_p": photo.get('id'),
         "title_p": photo.get('title'),
         "tags_p": photo.get('tags'),
         "latitude_p": photo.get('latitude'),
         "longitude_p": photo.get('longitude')} for photo in soup.find_all('photo')]
cursor.executemany("""
    INSERT INTO
        DATAGERMANY
        (id_photo, title, tags, latitude, longitude)
    VALUES
        (:id_p, :title_p, :tags_p, :latitude_p, :longitude_p)""", data)
connector.commit()
cursor.close()
connector.close()
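To sanity-check the inserts, you can query the table back before closing. Here is a self-contained sketch with a made-up sample row (the real values come from the parsed Flickr response):

```python
import sqlite3

connector = sqlite3.connect(":memory:")
cursor = connector.cursor()
cursor.execute("""CREATE TABLE DATAGERMANY
                  (id_db INTEGER PRIMARY KEY AUTOINCREMENT,
                   id_photo INTEGER NOT NULL,
                   title TEXT,
                   tags TEXT,
                   latitude NUMERIC NOT NULL,
                   longitude NUMERIC NOT NULL)""")

# Sample row standing in for a parsed <photo> element
data = [{"id_p": 1, "title_p": "Brandenburg Gate", "tags_p": "berlin",
         "latitude_p": 52.516, "longitude_p": 13.377}]
cursor.executemany("""INSERT INTO DATAGERMANY
                      (id_photo, title, tags, latitude, longitude)
                      VALUES (:id_p, :title_p, :tags_p, :latitude_p, :longitude_p)""",
                   data)
connector.commit()

rows = cursor.execute("SELECT id_photo, title FROM DATAGERMANY").fetchall()
print(rows)  # [(1, 'Brandenburg Gate')]

cursor.close()
connector.close()
```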