Because everything can be filed under ‘Miscellaneous’
Always a glutton for punishment, I recently took on a project to redevelop a website that involved changing pretty much every layer of the stack – Apache became nginx, mod_perl became FastCGI, and the custom code was moved into a Catalyst project. Having never seriously worked with any of these before, my first deployment (today!) was an interesting affair. (more…)
It seems like everyone is doing it – I know of at least two other codebases for a blog controller – which makes me wonder: is there a repository of drop-in components for the Catalyst framework, or are we all doomed to re-invent the wheel? I did ask on #catalyst about it, but having not thought it all through properly I asked if there was a CMS system built on Catalyst (which is something completely different) (more…)
I’ve spent the last week away, where I’ve had absolutely no Internet access at all. No WWW, no RSS, no email – not even any Twitter (although I did tweet a couple of times from my phone). It’s surprising (or not) how much “connected” time I’ve got used to – it was like a week out of time, not even knowing the news (the only TV I get to watch these days is CBeebies…)
I was just catching up on my email/RSS backlog when a thread (you’ll probably have to join the group to read it) on the Perl Mongers LinkedIn group about Perl caught my eye – especially as the participants were talking about Perl CGI/FastCGI/mod_perl.
Bottom of the heap came Perl/CGI – somewhat unfairly, IMHO. For me, there is a time and a place for Perl/CGI – although it needs the entire Perl process set up and the script interpreted for each call, it’s perfect for the occasional hack and/or stuff that’s still in active (prototype) development stage. Not having to restart/reload servers to make changes visible can be a big plus when you’re sat there with the client/end-user – you get the feedback, hack the code, hit refresh. Of course, once you’re done (and especially if the script is going to be public facing and get some hits) you should probably redevelop it into something more “solid”, but for the one-off script it’s the quickest and easiest path.
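To make that trade-off concrete, here’s what a bare-bones CGI script looks like – just a sketch, with a made-up greeting, but it shows why the edit/refresh loop is so quick: nothing persists, so there’s nothing to restart.

```perl
#!/usr/bin/perl
# A bare-bones CGI script: the interpreter starts up, compiles this
# file, prints one response, and exits - all of that per request.
use strict;
use warnings;

# Build the whole response in one place, so it's easy to eyeball.
sub response {
    return "Content-Type: text/plain\r\n\r\n"
         . "Hello from plain CGI - edit, save, hit refresh.\n";
}

print response();
```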
FastCGI seemed to be well received as a solution. Now, I’ve only just started looking into it, and I didn’t get much done last week, but my (current) understanding is that the “script” is interpreted once and runs as a daemon, accepting requests at a handler subroutine. Because the interpreter is persistent, there’s no setup or interpretation delay for each call, which makes it quicker than Perl/CGI. The number of child handlers (and so resource usage) can be controlled by the process itself, which makes resource allocation easier. And because the Perl FastCGI “server” runs separately from the HTTP server process (communicating over sockets), potentially as a completely different user, security (and, again, resource allocation) is easier to manage.

You can use TCP/IP sockets for the FastCGI “server”, which lets you split the stack even further into “application servers” (running the FastCGI app) and “front-end HTTP servers” (which just proxy to the application servers). With a mix of static and dynamic content, you can easily see where you need to beef up your servers – static stuff will hit the HTTP servers, dynamic stuff will hit the application servers. Throw in checks by the front-ends to make sure they’re getting the data (potentially trying each of the application servers in turn until one can actually serve the request) and you’ve got the recipe for a site with pretty good availability. The only fly in the ointment is that the FastCGI “server” needs to be reloaded each time the code changes – not so bad if you have the front-ends doing availability checks.
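As a sketch of that daemon model (assuming the FCGI module from CPAN, and a made-up socket path), the whole thing is just an accept loop – note the counter surviving between requests, which is exactly what plain CGI can’t do:

```perl
#!/usr/bin/perl
# Sketch of a FastCGI responder using the FCGI module from CPAN.
# The interpreter starts once; each request is one trip round the loop.
use strict;
use warnings;
use FCGI;

# Listen on a UNIX socket for the front-end HTTP server to connect to.
# (Passing a "host:port" string here gives you a TCP socket instead,
# which is what lets you move the app onto separate application servers.)
my $socket  = FCGI::OpenSocket('/tmp/myapp.sock', 5);
my $request = FCGI::Request(\*STDIN, \*STDOUT, \*STDERR, \%ENV, $socket);

my $hits = 0;    # persists across requests - handy, but watch for leaks
while ($request->Accept() >= 0) {
    $hits++;
    print "Content-Type: text/plain\r\n\r\n";
    print "Request number $hits handled by this worker.\n";
}
```

Because this is a long-lived daemon, a code change means restarting it by hand (or via your init scripts) – which is the reload caveat mentioned above.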
I’ve left mod_perl to last, for a couple of reasons. I’ve been using mod_perl on extremely busy sites for several years now; I don’t consider myself to be any kind of expert with it, but it’s served me (extremely) well. It has its pros and cons, and so far the balance has tipped in mod_perl’s favour – I guess I’m just used to the caveats. mod_perl, like FastCGI, only interprets the Perl code once per worker (thread/fork), so you get the same speed increase that FastCGI affords. The mod_perl code, though, as the name implies, is kept within each Apache worker, so the memory usage of each thread/fork does go up, and everything runs as the same user as the Apache HTTP daemon. mod_perl code can also be tightly coupled to the Apache HTTP daemon, making it harder to migrate in the future, but it does allow for some pretty low-level (in terms of protocol) access. Changing the code does require a reload of the Apache HTTP daemon (although a “graceful” one works fine, and is not so intrusive to visitors), but you can pre-load a module that automatically checks for changes and issues the reload for you (useful for “staging” servers between development and live).
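For comparison, a minimal mod_perl 2 response handler might look like this (the package name is my own invention) – it runs inside the Apache workers themselves, which is where both the tight coupling and the shared-user caveat come from:

```perl
package MyApp::Handler;
# Sketch of a mod_perl 2 response handler. This code lives inside each
# Apache worker, so package-level state persists between requests and
# everything runs as the Apache user.
use strict;
use warnings;
use Apache2::RequestRec ();              # $r->content_type
use Apache2::RequestIO  ();              # $r->print
use Apache2::Const -compile => qw(OK);

sub handler {
    my $r = shift;                       # the Apache request object
    $r->content_type('text/plain');
    $r->print("Served from inside Apache itself.\n");
    return Apache2::Const::OK;
}

1;
```

Wiring it up is then a matter of a `PerlResponseHandler MyApp::Handler` line inside a `<Location>` block with `SetHandler perl-script` in httpd.conf – and changing the module means a (graceful) Apache restart, as mentioned above.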
Now, the real fun comes/came when someone said:
If you evaluate mod_perl, you’d be evaluating something from an ancient world that is really an inferior solution. Most large scale Perl systems are using FastCGI.
“something from an ancient world” – well, mod_perl is (AFAIK) still being developed, and I still see it appearing in HTTP responses, so I wouldn’t consider it ancient.
“inferior solution” – guessing this is compared to FastCGI. There are some obvious benefits of FastCGI over mod_perl, there’s no doubt, but mod_perl’s tight integration with the Apache HTTP daemon allows it to do stuff that FastCGI can’t. So, claiming one solution is inferior to the other isn’t really fair.
And the “Most large scale Perl systems are using FastCGI” claim just screams for a “[citation needed]”. No evidence was provided for it, and it’s a pretty bold one to make – even qualifying it with an “IMHO” would have made it better.
Other people in the thread said things like:
I’d argue that mod_perl is not so much antiquated as just “out of vogue”.
Now that amused me quite a bit. Perl itself is a bit “out of vogue”, hence stuff like the Planet Perl Iron Man challenge to raise its profile a bit, so for mod_perl to be described as “out of vogue” (even amongst the Perl community) is a bit of a kick for it. 🙂
The original poster “hit back” with:
mod_perl certainly isn’t dead, and it is a very viable solution if you need an API into the webserver itself (which ties you to apache). If you don’t need that, mod_perl is a really poor solution.
It’s fallen out of grace because of its own demerits, and hasn’t kept up with modern technologies.
More bold claims. I haven’t experienced any of that myself (apart from the tie to the Apache HTTP daemon), and I fail to see how mod_perl hasn’t been able to leverage CPAN to keep up with “modern technologies”. Maybe I’m missing something more fundamental in that claim (like the internal coding of mod_perl)…
Given that Catalyst already supports running under FastCGI, and that I’m looking to refactor some code into something more manageable, my “move” to FastCGI should be quite interesting (not least of all because I’ll have performance metrics from the mod_perl versions to compare against). Since part of the (greater) system also has PHP code, it’ll be good to have a single “acceleration” platform for both (rather than running both mod_perl and APC).
So, where do I stand? I’m using mod_perl, I’ll probably carry on using mod_perl for some time, but I can also see that the FastCGI route also has some advantages that I’m interested in exploring. I’m also going to keep hacking Perl/CGI scripts as well, probably just out of spite. 😀
So, it’s 2010 (however you like to pronounce it), it’s out with the old and in with the new! I must confess to having lived in a bit of a bubble for a while, not venturing out of my comfort zone, and so things got a little stale. No longer. (more…)
(Found this in my home directory – it came up on an IRC channel I was on, and I was bored)
rar, tar, zip, arc, dmg
ar, shar, cab, arj, rle
While I was waiting for a compile/unit test to finish, I noticed that one of the failed hard disks (a Maxtor MaxLine Plus II 250Gb ATA/133 HDD) I was using as a paperweight had two stickers on it. Having nothing better to do, I pulled the top one off, just to see what the changes were. (more…)
Just coming down from putting Alfie to bed, and there’s blue lights flashing outside the house. I live on a road that’s regularly used by the police/ambulance/fire service, so it’s not unusual.
I’m taking the bin bags round to be collected in the morning, and there’s a police car stopped opposite the house. There’s something in the road. Just then, Becki comes flying out of the door and shouts at the policeman.
Becki had seen the flashing lights and looked out of the window. She’d also seen the thing in the road, and she’d recognised what I’d recognised – Izzy’s collar.
Someone had hit Izzy with their car, hadn’t even bothered to stop. Worse, no-one else had bothered to stop either, just driving round her body. The police were just moving it to the side of the road (what else could they do?) when we’d come on the scene.
Judging from the fact that she was still in the middle of the road, she must have been killed outright, so that’s a small mercy. It still didn’t make it any easier to tell Alfie that his favourite cat, the one that would always come over to see him and would play with him in the garden, was never coming back.
We’ve just adopted two new cats – got them today, and they’re settling in. (more…)
Google’s “I’m feeling lucky” button on their search was amusing when it first came out – you don’t get the search results, but go straight to the top hit. Useful for when Google searches actually came up with useful stuff instead of adverts (Back In The Day) but not so much now. It looks like this feature is trying to spread, but in strange ways… (more…)