My guess is that they had purchased a 3G cell-site simulator, which is not a cheap piece of kit, and kept using the one they had. 4G LTE cell-site simulators are considerably more expensive. Even government-run "stingray" cell-site simulators often force phones to drop back to 3G to make life easier for the device.
What are the chances that the cell-site simulator the group used doesn't support the power-saving features that real cell towers provide? I suspect the chances are pretty good. In the name of "being fair," I think the magazine wound up producing results that bear no relation to the real world, unless you're under constant surveillance by an underfunded government agency that can't afford the latest and greatest toys from Harris RF.
If Which? has a simulator that accurately reproduces the behavior of each cell company's own towers and software, I'd love to see them document that. Until then, I'm going to assume it emulates a generic network, or possibly a merely "good enough" network that doesn't optimize power the way a real network does. We know that's true of many "stingray" devices, which command phones to transmit at full power regardless of conditions in order to make them easier to locate.
Otherwise, it's like saying "to test the battery life of these laptops, we loaded Jimbo's BIOS and our own operating system on them." The results would not reflect real-world usage, where you'd have a BIOS tailored to the hardware and an OS that supports the hardware's power-saving features. It might give you a rough relative ranking of battery capacity, but it doesn't tell you how the laptop you actually buy and use will perform.