I'd like a single dict (key/value) database to be accessible from multiple Python scripts running at the same time.
If script1.py updates
I would use a pub/sub WebSocket framework like Autobahn|Python, with one script acting as the "server" that handles all the communication. Depending on the scale, though, this could be overkill.
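Autobahn specifics aside, the one-script-as-server idea can be sketched with only the standard library: `multiprocessing.managers` lets a server script own the dict and expose it over a local socket that the other scripts connect to. The names, port, and auth key below are illustrative, not from the original answer:

```python
from multiprocessing.managers import BaseManager

ADDRESS = ("127.0.0.1", 50007)  # illustrative host/port
AUTHKEY = b"change-me"          # shared secret between the scripts

store = {}  # the single dict, owned by the server script


class DictManager(BaseManager):
    """Manager that exposes the shared dict to other processes."""


DictManager.register(
    "get_dict",
    callable=lambda: store,
    # Dunder methods must be listed explicitly to be proxied.
    exposed=("__getitem__", "__setitem__", "__delitem__",
             "__contains__", "get", "copy"),
)


def run_server():
    # In the "server" script: blocks and serves requests forever.
    server = DictManager(address=ADDRESS, authkey=AUTHKEY).get_server()
    server.serve_forever()


def connect():
    # In script1.py / script2.py: returns a proxy to the shared dict.
    mgr = DictManager(address=ADDRESS, authkey=AUTHKEY)
    mgr.connect()
    return mgr.get_dict()
```

Since every operation executes inside the server process, an update made through one script's proxy is immediately visible to the others.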
I'd consider two options; both are embedded databases.
As answered here and here, it should be fine for concurrent access.
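Assuming the first embedded option is SQLite (the usual companion to BDB in this comparison; the stdlib `sqlite3` module relies on the database's own file locking to coordinate processes), a minimal dict-like wrapper could look like this. The helper names are my own:

```python
import sqlite3


def kv_connect(path="store.db"):
    # Each script opens its own connection; SQLite serializes writers
    # across processes via file locking on the database file.
    conn = sqlite3.connect(path, timeout=10)  # wait up to 10 s if locked
    conn.execute(
        "CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)"
    )
    conn.commit()
    return conn


def kv_set(conn, key, value):
    # Upsert: replaces the row if the key already exists.
    conn.execute("INSERT OR REPLACE INTO kv (key, value) VALUES (?, ?)",
                 (key, value))
    conn.commit()


def kv_get(conn, key, default=None):
    row = conn.execute("SELECT value FROM kv WHERE key = ?",
                       (key,)).fetchone()
    return row[0] if row else default
```

Each script simply calls `kv_connect()` on the same path; a value committed by one connection is visible to the others on their next read.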
Berkeley DB (BDB) is a software library intended to provide a high-performance embedded database for key/value data.
It has been designed exactly for your purpose.
BDB can support thousands of simultaneous threads of control or concurrent processes manipulating databases as large as 256 terabytes, on a wide variety of operating systems, including most Unix-like and Windows systems as well as real-time operating systems.
It is robust and has been around for years, if not decades.
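BDB itself is reachable from Python via the third-party `berkeleydb` (formerly `bsddb3`) package. The basic embedded key/value usage pattern can be illustrated with the stdlib `dbm` module, which shares the same classic open/read/write interface (note that BDB adds the concurrent-access and transaction features on top, which plain `dbm` backends lack). The file path here is just a scratch location for the demo:

```python
import dbm
import os
import tempfile

# Scratch location for the demo; a real script would use a fixed path.
path = os.path.join(tempfile.mkdtemp(), "kv.db")

# "c" opens the database, creating it if needed. Keys/values are bytes.
with dbm.open(path, "c") as db:
    db[b"greeting"] = b"hello"

# A later open (e.g. from another script, one at a time) sees the data.
with dbm.open(path, "r") as db:
    assert db[b"greeting"] == b"hello"
```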
Bringing up Redis, memcached, or any other full-fledged socket-based server that requires sysops involvement is, IMO, overkill for the task of exchanging data between two scripts sitting on the same box.