There's a lot of trash talk among professional web programmers about using vanilla CGI, like Stevan Little's recent comment: "There is no excuse to still use vanilla CGI as it is simply just a waste of resources".
As an experienced professional website developer myself, I find that CGI has its place. First, let's recap what we're talking about.
In "vanilla CGI", a Perl interpreter is loaded each time a request is made, the code is compiled and run on the fly, the content is delivered, and the whole process shuts down again.
In FastCGI or mod_perl, the interpreter along with some compiled application code can remain persistent in memory between requests, so just the run-time effort needs to be done for each request.
For anything that's particularly high traffic, or for a complex application that is expensive to load, the persistence feature will be a win: the interpreter is loaded once and the compile-time effort is not repeated for every request, so performance is dramatically improved.
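To make that lifecycle concrete, a vanilla CGI script can be as small as this sketch (the filename and markup are just my illustration):

```perl
#!/usr/bin/perl
# hello.cgi -- a minimal vanilla CGI script (illustrative sketch).
# The web server launches a fresh perl process for every request:
# this file is compiled, the response is printed, and the process
# exits, freeing all of its memory.
use strict;
use warnings;

print "Content-Type: text/html\r\n\r\n";
print "<html><body><h1>Hello from vanilla CGI</h1></body></html>\n";
```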
However, there are other considerations besides performance that still make vanilla CGI the better option for some applications, particularly on low to moderate traffic applications.
In case it's hard to imagine applications that will always be small or low traffic, here are some examples:
- A company provides a few dozen sales reps a private area to access dynamically generated spreadsheets. As a private application with limited users, traffic is always low. Beyond this, the company has a tiny web presence.
- A college has a customized online application form for an off-campus program. Fewer than 100 people submit the application each year.
- A small business needs a customized contact form and a custom event registration form. Each is expected to be used 10 times per day or less.
- A university needs a public search engine for a database on a niche topic.
I have helped build and deploy applications like each of the above in vanilla CGI, with reasonable performance. There are many website needs out there which are unique enough to merit custom programming but are neither large, nor destined to become high traffic attractions.
This brings us to some specific benefits of vanilla CGI:
Hosting Availability. Googling for "CGI hosting" today I get 166,000 results, but searching for "FastCGI hosting" I get just 225. Plain CGI hosting is far more widely available. If you just need to add one lightly used dynamic form to your site, do you want to switch hosts just for FastCGI support? For low traffic applications, a FastCGI setup keeps chewing memory while the persistent process sits idle, whereas the equivalent CGI script frees up its memory completely between requests. A host can therefore support more plain CGI customers (again, assuming low to moderate traffic), because memory is constantly being freed for other customers, while a persistent FastCGI process carries a persistent memory cost for the host.
Persistence-specific bugs. Generally, well written code will run in a persistent environment with no problems. Still, there are a number of hang-ups you have to be aware of which aren't a concern in CGI scripts. Just look at the extensive detail documented for people moving from vanilla CGI to mod_perl. It's not that writing Perl for persistent environments has to be hard or time consuming, but there are clearly extra considerations.
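As a sketch of one such hang-up (my own illustration, not taken from the mod_perl docs): a file-scoped variable that is reinitialized on every request under vanilla CGI quietly accumulates state once the code stays resident.

```perl
use strict;
use warnings;

# Compiled once. Under vanilla CGI the process exits after each request,
# so $request_count is always 1. Under mod_perl or FastCGI the compiled
# code persists, so this counter keeps climbing -- and worse, the state
# is shared across requests from different users.
my $request_count = 0;

sub handle_request {
    $request_count++;
    return "You are visitor number $request_count\n";
}
```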
You can easily scale up, but not down. Using CGI::Application with its built-in support for CGI.pm and HTML::Template, you've got a great framework of lightweight components for building applications that will perform well in vanilla CGI. They will generally also work in a persistent environment without any changes. This is what I do myself. When a mod_perl project comes along, it's easy to scale up using Titanium with additional plugins to provide more features, with the same familiar framework of CGI::Application underneath. On the other hand, there's the choice to develop by default with a framework designed for a persistent environment, like Catalyst. I'm not aware of any of these heavier frameworks that have an option to scale down in a way that performs well in plain CGI while still retaining the same feel and features. Such a path is a commitment either to deploy even small, low-traffic applications in a persistent environment, because the heavy framework requires it for decent performance, or to learn a second framework just for small apps. I'd rather use one framework that performs well in both CGI and persistent environments.
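For reference, a CGI::Application module can be as small as this sketch (the module and run-mode names are my own illustration); the same code runs unchanged under vanilla CGI, and later under mod_perl or FastCGI:

```perl
# MyApp.pm -- a minimal CGI::Application subclass.
package MyApp;
use strict;
use warnings;
use base 'CGI::Application';

sub setup {
    my $self = shift;
    $self->start_mode('hello');                 # default run mode
    $self->run_modes( hello => 'say_hello' );   # mode name => method
}

sub say_hello {
    my $self = shift;
    return "<h1>Hello from CGI::Application</h1>";
}

1;

# The instance script (e.g. app.cgi) is then just:
#   #!/usr/bin/perl
#   use MyApp;
#   MyApp->new->run;
```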
In summary: Simplicity. Vanilla CGI is the simplest to code for and deploy. If performance is good enough, why not use the simpler option?
Update -- it was pointed out that my examples of "low traffic" were all very low traffic. A more interesting question is what the upper limit for traffic is before vanilla CGI performance degrades. My recent benchmarks inform this. A "Hello World" CGI::Application project benchmarks at 0.20s per request. So, being very generous, it's reasonable to assume that a complete CGI script written with this framework could run and shut down in under a second. That gives a rough upper bound of 1 CGI request per second, or roughly 86,400 requests per day. Busy websites regularly exceed this request rate, but I think it illustrates that vanilla CGI is a capable performer for many uses.
- Ian Bicking's post What PHP Deployment Gets Right pins some of PHP's success on its CGI-like model.