I would like to store Python objects into a SQLite database. Is that possible?
If so what would be some links / examples for it?
As others have mentioned, the answer is yes... but the object needs to be serialized first. I'm the author of a package called klepto that is built to seamlessly store Python objects in SQL databases, HDF archives, and other types of key-value stores.
It provides a simple dictionary interface, like this:
>>> from klepto.archives import sqltable_archive as sql_archive
>>> d = sql_archive(cached=False)
>>> d['a'] = 1
>>> d['b'] = '1'
>>> d['c'] = min
>>> squared = lambda x:x*x
>>> d['d'] = squared
>>> class Foo(object):
...     def __init__(self, x):
...         self.x = x
...     def __call__(self):
...         return squared(self.x)
... 
>>> f = Foo(2)
>>> d['e'] = Foo
>>> d['f'] = f
>>>
>>> d
sqltable_archive('sqlite:///:memory:?table=memo', {'a': 1, 'b': '1', 'c': <built-in function min>, 'd': <function <lambda> at 0x10f631268>, 'e': <class '__main__.Foo'>, 'f': <__main__.Foo object at 0x10f63d908>}, cached=False)
>>>
>>> # min(squared(2), 1)
>>> d['c'](d['f'](), d['a'])
1
>>>
The cached keyword in the archive constructor signifies whether you want to use a local in-memory cache with the archive set as the cache backend (cached=True), or use the archive directly (cached=False). Under the covers, it can use pickle, json, dill, or other serializers to serialize the objects. Looking at the archive's internals, you can see it's leveraging SQLAlchemy:
>>> d._engine
Engine(sqlite://)
>>> d.__state__
{'serialized': True, 'root': 'sqlite:///:memory:', 'id': Table('memo', MetaData(bind=None), Column('Kkey', String(length=255), table=<memo>, primary_key=True, nullable=False), Column('Kval', PickleType(), table=<memo>), schema=None), 'protocol': 3, 'config': {}}
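If you'd rather not take on a dependency, here's a minimal sketch of the same idea using only the standard library: serialize each object with pickle and store the bytes in a SQLite key-value table. (The table and function names here are just illustrative; this is not klepto's internal code.)

```python
# Minimal key-value store: pickle objects into a SQLite BLOB column.
import pickle
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS memo (key TEXT PRIMARY KEY, val BLOB)")

def put(key, obj):
    # REPLACE handles both insert and update for an existing key.
    conn.execute("REPLACE INTO memo (key, val) VALUES (?, ?)",
                 (key, pickle.dumps(obj)))
    conn.commit()

def get(key):
    row = conn.execute("SELECT val FROM memo WHERE key = ?", (key,)).fetchone()
    return pickle.loads(row[0])

put('a', 1)
put('c', min)                 # built-ins pickle by reference
print(get('c')(get('a'), 2))  # -> 1
```

Note that stdlib pickle can't handle lambdas or classes defined interactively, which is exactly why klepto can swap in dill as the serializer for objects like `squared` and `Foo` above.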