
# Python sending data to a MySQL DB

Andy
1#
Published in 2017-12-07 21:45:39Z
I have a script running that updates certain values in a DB once a second. At the start of the script I connect to the DB:

```python
conn = pymysql.connect(host="server", user="user", passwd="pw", db="db", charset='utf8')
x = conn.cursor()
```

I leave the connection open for the running time of the script (around 30 min). With this code I update certain values once every second:

```python
query = "UPDATE missionFilesDelDro SET landLat = '%s', landLon='%s', landHea='%s' WHERE displayName='%s'" % (lat, lon, heading, mission)
x.execute(query)
conn.ping(True)
```

However, when my Internet connection breaks, the script crashes because it can't update the values. The connection normally re-establishes within one minute. (The script runs on a moving vehicle; the Internet connection is established via a GSM modem.) Is it better to re-open the connection to the server before every update, so I can check that the connection is established, or is there a better way?
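As a side note, building the query with `%` string interpolation is fragile (quoting errors, SQL injection). Here is a hedged sketch of the same UPDATE using parameterized placeholders, where pymysql escapes the values itself. The table and column names come from the question; the sample values are made up.

```python
# The UPDATE from the question, rewritten with %s placeholders that
# pymysql fills in and escapes itself (per PEP 249 / pyformat style).
query = (
    "UPDATE missionFilesDelDro "
    "SET landLat = %s, landLon = %s, landHea = %s "
    "WHERE displayName = %s"
)
params = (48.1374, 11.5755, 270.0, "mission_1")  # hypothetical example values

# With an open connection this would run as:
# x.execute(query, params)
# conn.commit()
print(query.count("%s"))
```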
Jeremy Jones
2#
Replied at 2017-12-08 12:22:44Z
You could just ping the connection first, instead of after the query, as that should reconnect if necessary. Setup:

```python
conn = pymysql.connect(host="server", user="user", passwd="pw", db="db", charset='utf8')
```

and every second:

```python
query = "UPDATE missionFilesDelDro SET landLat = '%s', landLon='%s', landHea='%s' WHERE displayName='%s'" % (lat, lon, heading, mission)
conn.ping()  # reconnects if necessary (reconnect defaults to True)
x = conn.cursor()
x.execute(query)
```

Ref: https://github.com/PyMySQL/PyMySQL/blob/master/pymysql/connections.py#L872

It's still possible that the connection drops after the `ping()` but before the `execute()`, which would then fail. To handle that you would need to trap the error, something like:

```python
from time import sleep

MAX_ATTEMPTS = 10

# every second:
query = "UPDATE missionFilesDelDro SET landLat = '%s', landLon='%s', landHea='%s' WHERE displayName='%s'" % (lat, lon, heading, mission)

inserted = False
attempts = 0
while (not inserted) and attempts < MAX_ATTEMPTS:
    attempts += 1
    try:
        conn.ping()
        x = conn.cursor()
        x.execute(query)
        inserted = True
    except pymysql.MySQLError:  # trap the specific error, not a bare Exception
        sleep(10)  # however long is appropriate between tries
        # you could also do a full re-connection here if you wanted

if not inserted:
    # do something, e.g.
    # raise RuntimeError("Couldn't insert the record after {} attempts.".format(MAX_ATTEMPTS))
    pass
```

(Note: the original suggestion used `StandardError`, which only exists in Python 2; `pymysql.MySQLError` is the library's base exception.)
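The retry loop above can be factored into a small reusable helper. This is a generic sketch, not pymysql API: `execute_with_retry`, `flaky_query`, and the `reconnect` callback are names I made up, and the demo uses stand-in functions instead of a real database so the pattern can be seen in isolation. In real code the `except` would name `pymysql.MySQLError`.

```python
import time

def execute_with_retry(run_query, reconnect, max_attempts=10, delay=0.0):
    """Call run_query() up to max_attempts times; after each failure,
    wait and call reconnect() before the next attempt.
    Returns the attempt number that succeeded, or 0 if all failed."""
    for attempt in range(1, max_attempts + 1):
        try:
            run_query()
            return attempt
        except Exception:               # in real code: pymysql.MySQLError
            if attempt < max_attempts:
                time.sleep(delay)
                try:
                    reconnect()
                except Exception:
                    pass                # link still down; loop will retry

    return 0

# Demo with stand-ins: the query fails twice, then succeeds.
calls = {"n": 0}

def flaky_query():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("connection lost")

attempts_used = execute_with_retry(flaky_query, reconnect=lambda: None, delay=0)
print(attempts_used)
```

With a real connection, `run_query` would be a closure doing `cursor.execute(...)` and `reconnect` would call `conn.ping()` or re-open the connection.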
Jarvis Cochrane
3#
Replied at 2017-12-07 22:31:19Z
I'm guessing the script fails with an exception at the line `x.execute(query)` when the connection drops. You could trap the exception and retry opening the connection. The following sketch demonstrates the general technique, but will obviously need to be adapted to the real function, method, and exception names:

```python
def open_connection(retries, delay):
    for _ in range(retries):
        try:
            return pymysql.connect(host="server", user="user",
                                   passwd="pw", db="db", charset='utf8')
        except pymysql.MySQLError:
            sleep(delay)
    return None

conn = open_connection(30, 3)
x = conn.cursor()

while conn is not None and more_data:
    # read data here
    query = ...
    while conn is not None:   # loop until the data is definitely saved
        try:
            x.execute(query)
            break             # data saved, exit inner loop
        except pymysql.MySQLError:
            conn = open_connection(30, 3)
            if conn is not None:
                x = conn.cursor()
```

The general idea is that you need to loop and retry until either the data is definitely saved, or until you encounter an unrecoverable error.

Hm. If you're sampling or receiving data at a constant rate, but are only able to send it irregularly because of network failures, you've created a classic producer-consumer problem. You'll need one thread to read or receive the data, a queue to hold any backlog, and another thread to store the data. Fun! ;-)