If the problem is small, use a small tool.
If the problem is big... use the 'appropriate size tool'.
When I was a systems geek on mainframes, it was the dawn of the PC (pre-Mac) era. There were, and are, problems that are better suited to big databases with lots of computing power. Sometimes the work can't be parallelized. At the time, there were mainframes with 6 MByte/sec I/O channels (24 of them) and eight 4-core processors run by a single operating system (or several, user's choice). Yes, they were tuned to the workload of the company. -- Getting rid of the 24 mainframes in 12 data centers around the world cost a boatload of money. It cost far more to add the 'file servers', desktop PCs, workstations, and additional networking (needed in any case), plus UPSes wherever 'data was critical', along with the ranks of 'trained administrators' to install, maintain, and support the PCs even after the users were trained. ... The old IT department was discarded, except for the most critical functions - printing paychecks.
The stories of people not backing up their PC because it was 'so reliable compared to the mainframe', losing years of effort, and still blaming IT even after being warned verbally and in writing, were legion.
Mainframes are great. They are not all things to all people. They should be just ANOTHER tool in society's tool bag to help get the job done.
(Since the days of Beowulf clusters, the PC world has been trying to figure out how to 'create a cheap mainframe'. Real mainframes are not just CPU power; they are I/O power, shared resources that can be dedicated in large or small portions to the task at hand. They never solve problems by themselves, but no computer does. It still takes systems designed, engineered, programmed, distributed, and maintained over the life cycle of the application system to make ANY computer useful, no matter how large or small it is.)