Actual 1080p isn't even here yet for a lot of media. Most games and TV broadcasts still top out at 720p, and quite a few movies are at that resolution as well. It's no surprise that no major content provider is considering 4K at this point.
One of the more interesting changes is the license switch from LGPL to zlib.
I suspect this was done due to the rise of SFML (Simple and Fast Multimedia Library).
I'm seeing huge inconsistency here on Slashdot in how data 'theft' or 'stealing' is distinguished from 'pirating'. I read the article and didn't see any reference to the original data being deleted. Was it just copied, i.e. "pirated", or was it actually taken off the machine, with the original data removed?
Seriously, why would anyone play games on a console anymore? You can do so much more with your games on a PC.
That wasn't directed at me, but here are my reasons:
1) I don't want to have to dual-boot just to play AAA games. A console saves me a Windows license, the hard drive space, the annoyance of multi-booting, etc.
2) Being able to just sit down on the couch with a beer, pop in a game, and play it lounging back without having a keyboard/mouse spread is relaxing. I do love playing PC games as well, but sometimes I just want to be sprawled. If I played competitively, I'd probably only play on PC.
3) I'm growing older and don't have the time or energy to upgrade my hardware piecemeal. I'd rather buy a new console for a few hundred dollars every 5-7 years than new components every six months to a year.
It also helps that my gaming isn't mostly twitch shooters. If it were, that would probably change how and where I played.
Let's face it: there's no shortage of places that hold some or all of your personal information these days; Steam is just one of many.
People or companies doing stupid or restrictive things en masse does not somehow make it right.
Purchasing a single-player game and having to tether it to a registration system is idiotic, for the reason given in the main article here. This continuing push to centralize all data in private hubs is starting to show its flaws.
We’ve all lived the nightmare. A new developer shows up at work, and you try to be welcoming, but he [1] can’t seem to get up to speed; the questions he asks reveal basic ignorance; and his work, when it finally emerges, is so kludgey that it ultimately must be rewritten from scratch by more competent people. And yet his interviewers—and/or the HR department, if your company has been infested by that bureaucratic parasite—swear that they only hire above-average/A-level/top-1% people.
[1] Yes, I am being deliberately sexist here, because in my experience those women who write code are consistently good at it.
I know it's socially cool to be anti-male, but come on.
I kind of regret using a Blu-ray player at times, because I'm nagged to be online to "Experience all the content" any time I want to watch a friggen movie.
Purely conjecture, but I believe it's less to do with "checking off a feature" and more to do with the following:
- Saving time and money on content generation, since people who play multiplayer will use the same maps over and over.
- A form of DRM / piracy protection: if there's 'server validation', there's an indirect 'purchase validation'.
Personally, I don't buy a game for multiplayer unless it's split screen, and those are few and far between. I'd play an older game like GoldenEye 64 with three buds long before playing any shooter over Xbox Live.