I am trying to create a web application with Django that lets users run C programs on the server. The steps I am currently following are:

  • User types in the code and clicks the 'Submit' button
  • The code is sent to the server, which puts it in a temporary file
  • The code is compiled via a call to subprocess.Popen()
  • The output (or error) is returned to the browser

This is a simple model, and it is working fine. However, I am unsure whether this is the "right" model, and I have some concerns about its scalability (and security):

  1. Would it be better to use threading?
  2. Would it be better to use multiprocessing?
  3. The fork bomb problem was mentioned in a related question I raised here -- how should that be dealt with? Would the following settings in apache2.conf be able to handle such situations?

    # prefork MPM
    # StartServers: number of server processes to start
    # MinSpareServers: minimum number of server processes which are kept spare
    # MaxSpareServers: maximum number of server processes which are kept spare
    # MaxClients: maximum number of server processes allowed to start
    # MaxRequestsPerChild: maximum number of requests a server process serves
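
For concreteness, a prefork section might look like the fragment below (the values are purely illustrative, not a recommendation):

```apache
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients           50
    MaxRequestsPerChild 500
</IfModule>
```

Note that these directives only cap the number of Apache worker processes; they do not limit children spawned by a submitted program itself.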

I am considering the case when, say, 5, 15, or 50 users try to run their code in parallel. Also, for the moment I am assuming that no malicious code will be submitted.

Just to mention: mod_wsgi and mpm-prefork with Apache2 are being used.

Thanks for your suggestions!

Maybe the Celery project can help you.

Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.

The execution units, called tasks, are executed concurrently on a single or more worker servers using multiprocessing, Eventlet, or gevent. Tasks can execute asynchronously (in the background) or synchronously (wait until ready).