There are currently two solid alternatives to traditional AV. Unfortunately, one is not suitable outside of a well-managed (i.e., corporate) environment and the other probably would not work in a full-featured computer environment.
1. Whitelisting: Application whitelisting is really, really effective. There are ways to circumvent it, but that's true of just about any technical security control. The problem with it is twofold. First, someone needs to decide exactly *what* goes on that whitelist, and the average home user isn't really up to the task. Bit9 (the leader in the space) has gotten around this to some degree with a cloud-based archive of "known good" files and processes, but your standard home user will still run into a lot of things they don't recognize when they install the product. And what if one of those things is actually an existing infection? Then they'll probably add it to the whitelist...or, erring on the side of caution, end up breaking valid software on their systems. The odds of them hitting it exactly right are very small. Second, even if they get the initial list right, they have to maintain it...so if they're taken in by that "YOU NEED TO UPDATE YOUR VIDEO CODEC LOL" popup window, they'll invariably end up authorizing whatever file gets downloaded ("'Trojan_video.exe'...sounds legit to me!") and infecting their system anyway. (A bare-bones sketch of the core check appears after item 2 below.)
2. The "Walled Garden" Model: In a lot of ways, this is like whitelisting built into the underlying OS, with the OS manufacturer being the custodian of the whitelist. This is how iOS works, so it's actually a proven model. There's only been one discovered instance of malware that's slipped into the App Store, and that was easily eradicated with the press of a button back at the Apple mothership. But on the other hand, there are ancillary effects to forcing all devs to go through a single clearinghouse for software. Apple's cut of the profits, and their cut of any revenue passing through any app sold through the App Store, are obvious issues, but the antitrust risk of a PC OS with only one place to go for software is a latent...and larger risk, going forward. One court decision can break the model entirely; if Apple doesn't collect at least some money from developers, then there's no money to support the App Store and the activities around it. But if there's no central authority, then there goes the chain of trust that's necessary to maintain the safety of the OS. And there's complexity in a PC-based OS environment that you don't find in a tablet or smartphone; in the tablet/phone model, each application is an island, separate onto itself for the most part. You don't have browser plugins, underlying execution environments or interpreters (Air, Java, .NET, Python, Perl, etc.).
Either way, the "blacklist" approach doesn't work. It's all well and good to point out that other things (firewalls, IPS, etc.) need to be in place, and that's true...but malware is its own threat, and can't be fully addressed by solutions that only focus on the attack. Applications will have vulnerabilities; railing against this hasn't accomplished anything in two decades. People will make mistakes, or be social-engineered into doing things they shouldn't, and sometimes those mistakes will affect people besides the mistake-maker. Supply chains will become infected (remember the cameras, USB drives, etc. that have shipped with malware?). So there needs to be a way to address malware itself.
There are two other approaches that hold promise, but they remain pretty much theoretical: there's no existing implementation of either at any scale, or as a deployable off-the-shelf technology today.
3. The Managed Immunological Response: Assume that malware will exist and will somehow get onto systems. Most complex organisms carry pathogens within themselves that are potentially harmful...and in many cases even contain them in a symbiotic relationship. Eradicate E. coli from a human's lower GI tract and they'll develop problems, for example, but E. coli anywhere else in the body causes serious health issues. Catch a cold, and you'll be sick for a bit...but your body will get over it. This is what some researchers are aiming toward, and the approach shows a lot of promise in theory. But it requires that the OS operate in a functionally different way, a way that does not currently exist. So...yeah, that's a ways off, if it ever happens.
4. The Sandboxed World: This is where applications are walled off from one another...another feature of the iOS model. And as with the Walled Garden, the challenges grow considerably when you move to the PC world. If it's hard to exchange data between your email client and your word processor, you're going to have a hard time getting things done. This is already something of a nuisance in the tablet/phone world. But if you open up access to the file system, you create an avenue for bad things and punch holes in the sandbox walls (see the sketch below). So I don't know whether that tension can be resolved in a way that would suit PC users, or whether, in a lesser implementation, it could support something akin to the Managed Immunological Response model.
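To make the trade-off concrete, here's a minimal sketch (emphatically not iOS's actual mechanism) of a brokered file-access model: each app can only touch its own container, and any cross-app data exchange requires explicitly punching a hole in the policy. The app names, the `POLICY` table, and the `broker_open` helper are all hypothetical.

```python
from pathlib import Path

# Hypothetical per-application policy: each app may only touch its own
# container directory, plus any explicit "holes" the user has granted.
POLICY = {
    "mail_client":    {"container": Path("/sandbox/mail"), "holes": []},
    "word_processor": {"container": Path("/sandbox/word"), "holes": []},
}

def broker_open(app: str, requested: Path, mode: str = "rb"):
    """Open a file on behalf of an app, but only if its policy allows it."""
    policy = POLICY[app]
    allowed_roots = [policy["container"], *policy["holes"]]
    target = requested.resolve()
    if not any(target.is_relative_to(root) for root in allowed_roots):
        raise PermissionError(f"{app} may not access {target}")
    return open(target, mode)

# To let the word processor read an email attachment, you have to punch a
# hole in its sandbox -- and every hole is a new avenue for bad things.
POLICY["word_processor"]["holes"].append(Path("/sandbox/mail/attachments"))
```

Every entry added to "holes" is exactly the kind of opening that erodes the sandbox walls, which is why the model gets messy on a full-featured PC.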