What I really don't get in this write-up is the insinuation that a focus on (purely) functional programming is a "recent trend".
There is a newer area where functional programming does shine, though: large-scale analytics. When you think about solving a specific problem you can usually go about it in a few different ways, but if your solution later needs to be parallelized (e.g. run on a Hadoop cluster), the functional approach will almost always be easier to adapt.
For instance, imagine you have a small e-commerce website and the marketing team wants to know the most common sequences of pages visited by users. You could write a quick-and-dirty Python script that parses the logs and builds a hash table counting every observed sequence, then use those counts to rank sequences by time, browser, location, etc. Or you could play the fancy card and use something like Petri nets in a map-reduce-ish way. Both approaches work.
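Here's a minimal sketch of the quick-and-dirty version, just to make it concrete. It assumes a hypothetical log format of one "timestamp user_id url" entry per line, lines in chronological order, and a fixed window of 3 pages; a real script would also deal with sessions, browsers, locations, and so on:

```python
from collections import Counter, defaultdict

def count_sequences(log_lines, seq_len=3):
    # hypothetical log format: "timestamp user_id url" per line,
    # already sorted by timestamp
    visits = defaultdict(list)  # user_id -> ordered list of pages
    for line in log_lines:
        timestamp, user_id, url = line.split()
        visits[user_id].append(url)

    counts = Counter()
    for pages in visits.values():
        # slide a window over each user's page history
        for i in range(len(pages) - seq_len + 1):
            counts[tuple(pages[i:i + seq_len])] += 1
    return counts

# counts.most_common(10) then gives you the top sequences
```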
But then your small website becomes a big success, and grows and grows and grows, and one day your script runs out of steam. So you figure, let's run that bitch on a big Hadoop cluster. Well, guess what: a script that was written in a map-reduce-friendly style in the first place will be a lot easier to port.
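For comparison, here's what the map-reduce-friendly version of the same job might look like, sketched with mrjob so the exact same code runs locally or on a Hadoop cluster. Same hypothetical log format as above; the point is that every step is a pure function over (key, values) pairs, which is exactly what makes it parallelizable:

```python
from mrjob.job import MRJob
from mrjob.step import MRStep

SEQ_LEN = 3

class TopSequences(MRJob):

    def steps(self):
        return [
            MRStep(mapper=self.mapper_by_user,
                   reducer=self.reducer_emit_sequences),
            MRStep(reducer=self.reducer_count),
        ]

    def mapper_by_user(self, _, line):
        # key on user so all of one user's page views land together
        timestamp, user_id, url = line.split()
        yield user_id, (timestamp, url)

    def reducer_emit_sequences(self, user_id, visits):
        # order one user's visits by time, then slide a window over them
        pages = [url for _, url in sorted(visits)]
        for i in range(len(pages) - SEQ_LEN + 1):
            yield tuple(pages[i:i + SEQ_LEN]), 1

    def reducer_count(self, sequence, ones):
        yield sequence, sum(ones)

if __name__ == '__main__':
    TopSequences.run()
```

Notice there's no shared hash table anywhere: state only flows through keys and values, so the framework is free to shard the work across as many machines as you've got.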
I'm not saying every single situation warrants this kind of thinking. But it qualifies as a class of problem that is fairly new to mainstream programmers.