zulip/tools/run-dev-queue-processors
Tim Abbott cd2348e9ae Run queue processors multithreaded in development.
This change drops the memory used for Python processes run by Zulip in
development from about 1GB to 300MB on my laptop.

On the safety front, http://pika.readthedocs.org/en/latest/faq.html
explains: "Pika does not have any notion of threading in the code. If
you want to use Pika with threading, make sure you have a Pika
connection per thread, created in that thread. It is not safe to share
one Pika connection across threads."  Since this code only connects
to RabbitMQ inside the individual threads, I believe this should be
safe.
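
For illustration, the connection-per-thread pattern the FAQ describes
looks roughly like the sketch below. The queue names, the polling loop,
and the connection parameters are placeholders, not Zulip's actual
consumer code.

import threading
import time

import pika

def run_worker(queue_name):
    # Each thread creates and owns its own connection and channel; no
    # Pika object is ever shared across threads.
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue=queue_name, durable=True)
    try:
        while True:
            method, properties, body = channel.basic_get(queue_name)
            if method is None:
                time.sleep(0.1)
                continue
            # ... hand `body` off to this queue's consumer here ...
            channel.basic_ack(method.delivery_tag)
    finally:
        connection.close()

for name in ('user_activity', 'signups'):
    threading.Thread(target=run_worker, args=(name,)).start()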

Progress towards #32.
2016-03-20 18:04:24 -07:00

21 lines · 702 B · Python · Executable File

#!/usr/bin/env python2.7
# This script is only meant to be run from run-dev.py, which sets up the
# environment correctly and passes the correct arguments for manage.py. It is a
# separate script so that the import from zerver.worker.queue_processors (which
# is slow) can be done in parallel with the rest of the work in bringing up the
# dev server.
import sys
import os
import subprocess
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
from zerver.worker.queue_processors import get_active_worker_queues
queues = get_active_worker_queues()
args = sys.argv[1:]
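# Start a single manage.py process_queue process that handles all active
# queues in one (multithreaded) process.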
subprocess.Popen(['python', 'manage.py', 'process_queue', '--all'] + args,
                 stderr=subprocess.STDOUT)
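
For context, this script is meant to be launched by run-dev.py, which
passes along the manage.py arguments. A minimal sketch of such an
invocation follows; the argument value and the relative path are
assumptions for illustration, not run-dev.py's actual code.

import subprocess

# Hypothetical manage.py arguments; run-dev.py computes the real ones.
manage_py_args = ['--settings=zproject.settings']
subprocess.Popen(['./tools/run-dev-queue-processors'] + manage_py_args,
                 stderr=subprocess.STDOUT)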