Question
I need to log data from a sensor as fast as possible, so I start a thread in Python that continuously writes the data to a MySQL database.
While this thread is running, I need to frequently fetch some of the data it has already inserted.
The issue is that I can't access that data while the thread is running; only after I stop the thread can I read it out of the database. I also tried committing the write on every row, but that did not help.
Here is my test script:
import datetime
import json
import threading

import mysql.connector
from mysql.connector import pooling

mydb = mysql.connector.pooling.MySQLConnectionPool(
    pool_size=3,
    host="localhost",
    user="user",
    passwd="password",
    database="any",
)

connection = mydb.get_connection()
connection_1 = mydb.get_connection()
my_cursor = connection.cursor()
my_cursor_1 = connection_1.cursor()

def continous_process():
    while True:
        cmd = "INSERT INTO config (config_json, date) VALUES (%s, %s)"
        date = datetime.datetime.now()
        values = (json.dumps("something"), date)
        my_cursor.execute(cmd, values)
        connection.commit()  # commit each row right away

def get_last_config():
    my_cursor_1.execute("SELECT entry_id FROM config ORDER BY entry_id DESC LIMIT 1")
    print(my_cursor_1.fetchall()[0])

my_thread = threading.Thread(target=continous_process)
my_thread.start()
print(my_thread.is_alive())

while True:
    print(get_last_config())
Is it necessary to release the connection with connection.release() after each entry?
I'm looking for a performant solution that is almost real-time.
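For context, the behaviour described above is consistent with InnoDB's default REPEATABLE READ isolation: the reader connection keeps the snapshot taken by its first SELECT, so rows committed later by the writer thread stay invisible until that read transaction ends. A minimal sketch of a possible workaround, assuming a mysql.connector connection taken from the pool above (the helper names `make_reader` and `get_last_entry` are illustrative, not from the question):

```python
# Sketch only: `conn` is assumed to be a mysql.connector connection
# obtained from the pool in the question; not verified against a live server.

def make_reader(conn):
    """Prepare a cursor whose SELECTs see rows committed by other sessions."""
    cur = conn.cursor()
    # InnoDB defaults to REPEATABLE READ, which pins the snapshot taken by
    # the first read in a transaction. READ COMMITTED instead takes a fresh
    # snapshot for every statement.
    cur.execute("SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED")
    return cur

def get_last_entry(conn, cur):
    """Fetch the newest entry_id; returns None while the table is empty."""
    cur.execute("SELECT entry_id FROM config ORDER BY entry_id DESC LIMIT 1")
    row = cur.fetchone()
    # End the read transaction so the next SELECT starts from a new snapshot.
    conn.commit()
    return row
```

Alternatively, keeping the default isolation level but calling connection_1.commit() after every fetch should have the same effect, since ending the read transaction discards the stale snapshot.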
Source: https://stackoverflow.com/questions/62166481/mysql-multiprocess-fetch-data-while-writing