I've been trading in my habit of writing one-off Perl scripts for writing them in C#. I just clone a template file containing enough to get a "while(<>) { ... }" equivalent and a vanilla regular expression going -- ninety percent of what my one-offs need -- and I'm off and coding.
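For anyone who hasn't written that kind of one-off, the Perl idiom being replicated is roughly the following sketch (the input format and the pattern here are invented purely for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read lines from the files named on the command line (or from STDIN),
    # match each one against a throwaway pattern, print what we care about.
    while (my $line = <>) {
        chomp $line;
        # Hypothetical log format: "timestamp level message"
        next unless $line =~ /^(\S+)\s+(\w+)\s+(.*)$/;
        my ($ts, $level, $msg) = ($1, $2, $3);
        print "$ts\t$msg\n" if $level eq 'ERROR';
    }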
The result is something that may or may not be faster, or clearer... but it's a whole lot easier to debug when something goes wrong.
I learned Perl about a decade ago, and have never felt that plumbing its depths (references were about as sophisticated as I got) would pay off. (I won't get into what I think is broken about Perl, because that discussion definitely won't pay off :-) )
Speaking of broken, that second link you give doesn't contain a meta tag specifying the encoding, and the webserver probably doesn't send it either, because I had to set my browser to UTF-8 manually to see the correct accents.
Stability, maturity, great backwards compatibility, whipuptitude, a sea of well-cared-for mature modules on CPAN, the ease and convenience of CPAN itself, already installed/available everywhere, Unicode support, the creativity and competence of the community, exceptionally good documentation, "the spirit", a MOP via Moose if I want it, rarely gets in my way, scales very well in terms of both "thinking" and "project" size (everything from a tiny admin script up to a full-blown financial-district application is possible), speed, amazing and interesting features in Perl 6....
And no, you don't write the same Perl as in 1996 anymore.
Not especially large ones, but yes, using it all the time. Want to exchange information between Active Directory and a relational database? Between Net::LDAP and DBI, Perl makes it easy. Need to rewrite quantities of old HTML to clean up unsupported tags? HTML::TreeBuilder is your friend.
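A minimal sketch of that first kind of glue, with made-up hostnames, credentials, base DN, and table name standing in for the real ones, might look like this:

    use strict;
    use warnings;
    use Net::LDAP;
    use DBI;

    # Pull accounts out of Active Directory...
    my $ldap = Net::LDAP->new('ad.example.com') or die "can't connect: $@";
    my $mesg = $ldap->bind('syncuser@example.com', password => 'secret');
    die $mesg->error if $mesg->code;

    my $search = $ldap->search(
        base   => 'ou=People,dc=example,dc=com',
        filter => '(objectClass=user)',
        attrs  => [ 'sAMAccountName', 'mail' ],
    );

    # ...and push them into a relational database.
    my $dbh = DBI->connect('dbi:mysql:database=staff', 'dbuser', 'dbpass',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare('INSERT INTO accounts (login, email) VALUES (?, ?)');

    for my $entry ($search->entries) {
        $sth->execute($entry->get_value('sAMAccountName'),
                      $entry->get_value('mail'));
    }

    $ldap->unbind;
    $dbh->disconnect;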
Yes, it is still a great glue for large projects. Right now I'm building a custom public health database with data from many different sources - some of them PDFs and other files with really irregular formatting that require single-use scripts with complex regular expressions. Perl was my first language so I can always purr through those problems, and I'm really comfortable (happy even) debugging a Perl script.
I'm also maintaining a legacy bioinformatics project with about 30,000 lines of Perl in it and still extending it. I won't defend Perl as the best choice for that project other than to say I was around some very good Perl hackers / sysadmin types, so I picked up the one tool with a deep well of knowledge nearby. It did, however, make me a much better programmer, especially since I ended up implementing a lot of functions manually that I probably should have just found on CPAN or used as a built-in method in another language.
Perl is also really great when you are manually splitting data and jobs across a few thousand nodes and locally tweaking the run parameters for each job. Now that I've learned a few more languages, I know that sort of thing might be better done with Hadoop, but on a large GPFS system I rarely had problems.
I will say that for memory considerations (the main ceiling I bumped into on the old cluster), object-oriented Perl sucked and was very slow. If you wanted to roll your own pseudo-objects with well-designed nested hashes, though, you could really wail.
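For what it's worth, the nested-hash style being described looks roughly like this (node names and fields are invented for the example); you keep the structure of objects without the bless and method-dispatch overhead:

    use strict;
    use warnings;

    # One big hash-of-hashes standing in for a class: cheap to build,
    # cheap to copy around, and no accessor calls in the hot path.
    my %jobs = (
        node0042 => {
            params  => { threads => 4, mem_mb => 2048 },
            samples => [ 'S001', 'S002' ],
            status  => 'queued',
        },
    );

    # Locally tweak the run parameters for one node.
    $jobs{node0042}{params}{mem_mb} = 4096;
    $jobs{node0042}{status}         = 'running';

    # Walk every "object" directly, no methods involved.
    for my $node (sort keys %jobs) {
        printf "%s: %d samples, %d MB, %s\n",
            $node,
            scalar @{ $jobs{$node}{samples} },
            $jobs{$node}{params}{mem_mb},
            $jobs{$node}{status};
    }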
I just started a new project a week back with the objective of applying real-time effects/filters to an audio stream.
My main choice of programming language is Perl: firstly because I have more than 10 years of experience using it, and secondly because it's a fantastic glue language. And of course CPAN. The modules I'm going to be using are indispensable -- CGI::Application, Audio::Ecasound, POE, Net::LibLO, MIDI::ALSA and more. This is my first project where I'm going to be doing audio work, and Perl has made the transition so easy!
I've been using Perl for more than 15 years in my company, where we've built projects using only Perl to create chat rooms, an online forum, a blog tool, webmail, a web hosting control panel, email marketing, and other web tools that we offer to our clients.
I also use Perl for linguistics projects (text analysis, log processing, artificial intelligence, natural language processing, lots of regular expressions). Many of our projects are more than 10,000 lines of Perl code.
I also have some personal projects (like my url shortener website www.bit.do) that were created with Perl.
My work tools are mainly Perl, the Apache and Nginx web servers, and the MySQL database.
Just recently I spent a day writing a Perl chat server (and a jQuery web client to go with it) for a large customer of ours -- sort of a beta gimmick for a good customer. Now the thing has become "mission-critical" (esp. on Fridays...). There are something like 500 intranet users connected at any given time! I pulled the whole stunt off thanks to the Mojolicious framework, from this article here:
I just added chat rooms and intranet user authentication, then installed EV (for a Node.js-grade event loop) from CPAN, and the thing was deployed and ready in production. Mojolicious is just amazing.
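For the curious, the core of a Mojolicious WebSocket chat server really is tiny. A stripped-down sketch (the route and message handling are simplified here, not the poster's actual code) looks like:

    #!/usr/bin/env perl
    use Mojolicious::Lite;

    # Keep one transaction per connected browser so we can broadcast.
    my %clients;

    websocket '/chat' => sub {
        my $c  = shift;
        my $id = sprintf '%s', $c->tx;   # the transaction's address as a key
        $clients{$id} = $c->tx;

        $c->on(message => sub {
            my ($c, $msg) = @_;
            # Relay every incoming message to everyone who is connected.
            $_->send($msg) for values %clients;
        });

        $c->on(finish => sub { delete $clients{$id} });
    };

    app->start;

Run it with morbo during development (or hypnotoad in production) and point the JavaScript client's WebSocket at ws://yourhost:3000/chat; installing EV simply gives Mojolicious a faster underlying event loop without changing code like this.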
Congratulations to all involved! They've really got things back on track in the last several years. Things stagnated somewhat during the 5.6-5.8 era, but the 5.10 releases brought large positive changes, both to the codebase and to the development process itself.
Perl is still the go-to language for tasks within its purview (as described by other commenters here).
Just thought I'd mention this in light of yesterday's discussion about simplicity/complexity: http://news.ycombinator.com/item?id=3995185