Genuine question - if people honestly don't care, then is it really a problem?
The problem is that users are presented with a tradeoff: either they enable cookies and let sites track them, or they disable cookies and break all the sites they use. Offered that choice, most people will rationally pick the former, since working sites today beat an abstract privacy threat. The goal is to give them a third option: sites that work properly without the privacy or security problems.
Web standards try to give applications as much power as possible without making privacy or security any worse than it already is, so you don't have to trade away here-and-now features to fend off abstract threats. Other application platforms, like conventional native binaries, don't even try: if you run the program, you have to trust it completely.
Android is an example of a platform that attempts this and gets it wrong. You can decide which privileges to grant an app before you install it, but popular apps routinely ask for unreasonable sets of privileges, so in practice most people ignore the risks and install them anyway. On the web, applications can do the large majority of useful things Android apps can do (if you count cutting-edge standards that aren't widely supported yet), but few of the harmful things. That puts users in a much better position: they give up few features, and they're at much lower risk. A concrete sketch of the difference is below.
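To make the contrast concrete, here's a minimal sketch in browser-side TypeScript (the button id "find-me" is hypothetical; assume a page served over HTTPS). Instead of a blanket grant at install time, the browser shows its own permission prompt at the moment the page actually tries to use geolocation, and the page has to cope with a refusal:

    // Ask for the user's location only when they actually trigger the feature.
    // The browser displays its own permission prompt; the page cannot bypass it.
    document.querySelector('#find-me')?.addEventListener('click', () => {
      navigator.geolocation.getCurrentPosition(
        (pos) => {
          console.log(`Lat ${pos.coords.latitude}, lon ${pos.coords.longitude}`);
        },
        (err) => {
          // The user said no (or no fix is available): the rest of the app
          // keeps working, just without this one feature.
          console.warn('Location unavailable:', err.message);
        }
      );
    });

The point isn't this particular API; it's that the decision is made per capability, at the moment of use, by the browser rather than the application, so refusing costs the user one feature instead of the whole app.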
So, yes, it is a problem, it is fixable, and the web is the only realistic path to fixing it. Others, such as Bitfrost, have tried, but only the web has enough momentum to build a real application base around the idea of totally untrusted applications that are still genuinely useful.