I made a management command that populates one of my models from a CSV file.
I need to run this update quite frequently, and the CSV files have tens of thousands of lines.
Use Celery
Roughly, it might look like this:
from celery import Celery

app = Celery('my_app')  # broker/backend config goes here

@app.task(name='my_task')
def my_task():
    do_stuff()

def my_view(*args, **kwargs):
    result = process_request()
    app.send_task('my_task')
You'll need to create the task, register it with Celery (there is some autodiscover magic you can use), then run the task asynchronously from your Django app.
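For the autodiscover part, a common Django integration looks roughly like this (myproject is a stand-in for your actual project package):

# myproject/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
# pull CELERY_*-prefixed settings from Django's settings module
app.config_from_object('django.conf:settings', namespace='CELERY')
# discover tasks.py modules across all installed apps
app.autodiscover_tasks()

With this in place, any function decorated with @app.task or @shared_task in an app's tasks.py is registered automatically.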
In production, you may want to run the Celery worker as a daemon. Older Celery releases shipped a celeryd command for this; on current releases you run celery -A myproject worker under a process supervisor such as systemd or supervisord.
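Tying this back to the question: since the import already exists as a management command, the task body can simply invoke it. A sketch, where import_csv is a hypothetical name standing in for your actual command:

# myapp/tasks.py
from celery import shared_task
from django.core.management import call_command

@shared_task(name='import_csv_task')
def import_csv_task(csv_path):
    # run the existing management command inside the worker process
    call_command('import_csv', csv_path)

A view can then fire import_csv_task.delay('/path/to/file.csv') and return to the user immediately while the worker churns through the file.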
You can do the same with Django Background Task. It's a database-backed work queue for Django, and it's easier to set up than Celery.
from background_task import background

@background(schedule=60)  # run at the earliest 60 seconds from now
def your_task():
    # do your cool work here
    pass
This converts your_task into a background task function. When you call it from regular code, it doesn't run inline; instead it creates a Task object and stores it in the database for a worker to pick up.
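A minimal usage sketch (the call site could be a view or a cron-triggered command; a separate worker executes the queued tasks):

from datetime import timedelta

# queues the task and returns immediately after writing a Task row
your_task()

# the delay can also be overridden per call
your_task(schedule=timedelta(minutes=5))

# a worker must be running to execute queued tasks:
#   python manage.py process_tasks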