I'm planning to add some functionality to my Django app that will write a row of data to a CSV file when certain views are called:
session_dict = collectUserDataFromRequest(request)  # collect parameters from the request
if os.path.isfile(filename):
    # file already exists: append a row
    with open(filename, 'a', newline='') as csv_file:
        w = csv.DictWriter(csv_file, session_dict.keys())
        w.writerow(session_dict)
else:
    # first write: create the file and write the header row
    with open(filename, 'w', newline='') as csv_file:
        w = csv.DictWriter(csv_file, session_dict.keys())
        w.writeheader()
        w.writerow(session_dict)
# then proceed to render the html
A problem I am facing is dealing with potential race conditions when multiple users access my app at the same time. The data can be added to the CSV file in any order, but I don't want some users to wait longer for a view to load while a lock on the file is held. As such, I think an asynchronous queue of row data, drained by a background writer, sounds like a good workaround.
I have been told that Celery is a good framework for adding asynchronous processing to a Django app. But looking through several tutorials, I see that additional infrastructure such as Redis is usually required, and I worry that this may be overkill for what I aim to accomplish. Can I get away with a standard multiprocessing approach, such as the one explained here, or are there other drawbacks to using multithreading in a Django production environment?