You're confusing low-end with outdated. An ARM Cortex-M3 or M4 board would be a low-end board suitable for tasks such as motor control, while being reasonably modern, and cheaper than the Raspberry Pi. An ARM Cortex-A5 or higher would be modern and suitable for running Linux. ARM11 isn't low-end, it's high-end and outdated.
Raspberry Pi suffers from exactly the same problem as the Arduino: both are based on an ancient, woefully outdated platform. Just because performance is "good enough" for whatever your idea of "good enough" is, doesn't mean it makes any sense whatsoever to stick to cores that are 10 years old or older. Moving up to modern designs gives you more bang for the same buck, or less buck for the same bang. In the silicon industry it just makes no sense whatsoever to lag behind 3 generations for something like this. Newer designs are built on newer process nodes, scale to higher frequencies, and cost less to manufacture for the same performance. Being at the bleeding edge of silicon is expensive, but drop down a generation or so (relative to whatever field you're interested in) and that's the price/performance sweet spot. Using older stuff just doesn't make sense.
This keeps happening over and over and over again. When I started embedded programming, back when the PIC16C84 was released (the first microcontroller to feature EEPROM program memory, soon followed by the PIC16F84 Flash version), it stirred up a hobbyist revolution. No longer did you need expensive EPROM burners, UV erasers, and expensive UV-windowed chips with an erase cycle measured in minutes! And yet 5 years later people were still using the same damn PIC16F84, with its sole timer and just about no other features, when you could buy a PIC16F88 for 2/3 the price and get three timers, built-in analog-to-digital conversion, serial port/UART, SPI/SSP, PWM, analog comparator, built-in 8MHz oscillator, more RAM and Flash, ... Why? Because the PIC16F84 was popular and people were scared to use anything else, even though the newer chip was almost a drop-in replacement.
Then the Arduino happened, and even more people joined what became called the maker movement. And we longtime PIC users rolled our eyes because we'd been doing it for years and we didn't need no steenking breakout boards for a trivial 8-bit chip, but hey, C compilers for PICs sucked, and AVR was a better architecture anyway, and so Arduino deservedly became popular. But then the silliness started to set in again: ARM came out with the Cortex-M3 and Cortex-M0, and you could buy a 32-bit chip running at 4x the clock rate for the same price as the AVR in the Arduino, and yet even today people keep using AVR-based Arduinos when the microcontroller world has moved on. People are even sticking FPGA shields on an Arduino, which is like sticking a GTX970 on a Pentium MMX. You could implement the entire AVR inside that FPGA and run it faster than the real one sitting underneath. Why this madness? Because Arduino is popular and people are scared to move on.
And now with Raspberry Pi it's the same thing all over again. When the Pi came out it almost had a good excuse, because, even though its CPU was obsolete, and Broadcom's idea of making a powerful GPU chip and sticking an old CPU "on the side" was dumb, let's face it, nobody was building Linux-capable SBCs at that price point. But that's no longer the case: you can buy much more capable boards for the same $35 today. Why on earth would they release an updated model with an updated chip in 2015 that still uses the same damn architecture that is 12 years out of date? It just makes no sense; the only reason I can come up with is internal politics at Broadcom (trying to sell off outdated chips/designs for cheap, resistance from their GPU division to having a more powerful CPU in there, or something like that).