Really? I hear this Android fragmentation concern all the time, and I wonder if those espousing this opinion were actually around for the last 30 years. For the entire history of PC computing, software makers routinely came out with brand-new software that required the very latest cutting-edge hardware. At first you had color games when most people had only B&W displays. Then software required more than the 640K of memory most computers had. Then 16-color graphics arrived when people only had 4-color monitors. Then new video cards came out constantly, and you either had the latest or you were getting 5 frames per second. Computers went from 16-bit to 32-bit, and this impacted all software. Operating systems began allowing you to run more than one app at a time, as long as your hardware was sufficient to support it. I could go on all day.
But the main takeaway is that it has worked out pretty well for PCs.
The good news is that an entire smartphone now costs between $0 and $200 USD, which is much less than a single video card cost ten years ago, when I was swapping mine out annually.
If you want your software to run across a variety of hardware, it's going to take work. That's just life in software development. But let's stop assuming that everything written for Android absolutely must work on all phones. The Android Market lets you control which devices your app is available to, which by itself gives you the ability to avoid incompatibility issues in ways we could never have dreamed of with PCs.
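To make that last point concrete: the Market does its filtering based on declarations in your app's AndroidManifest.xml. A sketch of what that might look like (the specific API level, feature, and screen-size choices here are illustrative assumptions, not requirements):

```xml
<!-- AndroidManifest.xml excerpt: elements the Market uses to filter listings -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.myapp">

    <!-- Hide the app from devices running an older platform version -->
    <uses-sdk android:minSdkVersion="8" />

    <!-- Devices without a camera will not see the app at all -->
    <uses-feature android:name="android.hardware.camera"
        android:required="true" />

    <!-- Opt out of small screens you haven't tested on -->
    <supports-screens android:smallScreens="false"
        android:normalScreens="true"
        android:largeScreens="true" />

</manifest>
```

Devices that don't satisfy these declarations simply never see the listing, so "fragmentation" becomes a targeting decision you make up front rather than a support problem you discover later.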