RISC was not supposed to be a religion, which is what it seems to have turned into. No one should even be arguing the point today, because modern chips are so different from when the term 'RISC' was new. The whole premise behind RISC is used extensively in modern CISC machines. The problem with trying to keep the RISC vs CISC debate alive is that it harms the education of students.
At its core, RISC is about eliminating complex infrastructure where you can and reusing those resources for things that really can improve performance. Remember that when RISC was first being used, the primary CISC machine of the day was the VAX (x86 still being a toy). The VAX operated primarily as a micro-architecture, with instructions implemented in microcode. It had some amazingly complicated instructions, including one for helping compute polynomials. RISC researchers wanted to reuse those transistors for other purposes: more registers, more cache, pipelining, etc.
In fact, RISC was already well underway before 'RISC' was coined as a term. Many of the supercomputers and super-minis of the day already used similar techniques. One tradeoff to overcome was ease of programming in assembler versus performance; another was memory space versus performance. Much of the complexity in CISC machines was also there to support the high-level architecture: paging systems, memory protection, IO subsystems. That is one major reason why the VAX was designed the way it was. Compare that to the microchip CPUs of the time: RAM was expensive, so they too were primarily CISC, in order to squeeze the most out of each instruction.
At the time, the machinery necessary to support instruction decoding was a significant part of a typical CPU design, so shrinking it gave you a big win. Today this is no longer true: the decoding and micro-architecture on Intel chips is essentially a trivial fraction of the die, and the on-chip caches alone contain more static RAM than an early-80s-era CISC decoder had transistors.
However, it may not be true everywhere. We still have small embedded and low-power chips where this stuff does matter. The ARM7TDMI is a popular core that relies on RISC principles to keep everything simple and small: there is no space for caches or instruction queues, so a simple instruction decoder is a big win for keeping it simple and low power. Most of the small low-power chips used in embedded work today (PIC, AVR, etc.) are clearly RISC-derived for that reason. The article here is talking about big, beefy desktop systems, which are just a fraction of the CPUs in use.