Apparently I'm the only one on earth who doesn't like multi-player and has zero interest in playing a game with a bunch of random strangers on the Internet
Oh, you are definitely NOT the only one. The vast majority of gamers play single-player games, and only some of them dabble in multiplayer. It's just that the MP communities are the loudest voices: they're the ones who have to come together to play, so naturally they form a community where a single-player gamer is unlikely to. Games like Titanfall or Destiny had MASSIVE marketing budgets, so they seem more indicative of the market than they are; in both cases they haven't been hugely popular. That's not to say they are bad, far from it. If you have an interest in that type of gameplay you will probably like them, but the share of the market they represent seems to be massively oversold. Personally I attribute Titanfall's lack of success to its pure-multiplayer focus. Last I checked, Call of Duty and Battlefield have more players who don't play online than ones who do.
What kind of lunatic plays his Game Boy games on an emulated adapter for a different console entirely instead of just using a Game Boy emulator?!
Someone who wants to see the Super Game Boy enhanced features some games had.
All I ever get is hot, uncomfortable, sore and annoyed.
I think that's true of everyone; it's just that the morning-gym types get pleasure from rubbing others' noses in it. The "I cycled to work this morning" and "I did X in the gym this morning" folk seem to be less productive all day and tend to be more prone to moodiness. The staff I work with were better before their health kick. I'd prefer they waited until after work; it would make my job more pleasant.
Because we're still living in the '50s where every household has only one tv.
So many parents force the consoles to be in the main room, because "it's for the family" or "we don't allow the children to have TVs of their own". I consider it a form of child abuse, but it's common.
Can these Samsung Smart TVs be made to ignore all the convergence stuff and just be a monitor?
Last I checked, you needed a network connection for this stuff. So all you need to do is... not plug in the network cable. Or configure the wifi.
So just use it as a TV and you're golden.
You know what's funny? I have a Samsung 40" series 6 (I can't remember the model number), a really early smart TV that's not worth the effort to use as one, so I didn't bother to hook up the ethernet when I moved. At least once a month it would lock up: picture and sound would keep going, but the UI would either stop responding, or respond but not actually do anything, requiring a hard power-off to fix. Since hooking the ethernet back up, it hasn't done that in nearly a year. Tin foil hats at the ready...
How is it different from every iPhone, iPad, Android phone or tablet or laptops with webcams, recording even your location, video and audio?
Because they don't. Your Android phone activates the camera as requested for activities that use it; it doesn't run it 24/7, because of the battery-sucking nature of doing so. As I understand it, the Kinect is ALWAYS listening and ALWAYS recording, because it sits waiting for you to speak the command words or make the right gesture. Sure, you'd expect that it's just recording to a circular buffer that gets thrown in the bit bucket when nothing on the whitelist is detected, but years of experience have taught me it won't be long before some hacker gets into the internals and finds a database that has recorded everything everyone in the room has said for the last few days. Emails will fly around Microsoft's HQ, they'll spin it as merely "anonymous usage stats" or "essential algorithmic learning", but we'll know that yet again a company was caught doing something no sensible person would believe they would do AGAIN.
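For what it's worth, the circular-buffer idea described above is simple to sketch. This is a toy illustration in Python, not anything Kinect actually does; the chunk size, buffer length, and function names are all made up:

```python
from collections import deque

# Hypothetical always-on listener: keep only the last N audio chunks in a
# fixed-size ring buffer; old audio falls off the back automatically, so
# nothing older than the buffer window ever persists.
BUFFER_CHUNKS = 100  # e.g. ~10 seconds at 100 ms per chunk (made-up numbers)

ring = deque(maxlen=BUFFER_CHUNKS)

def on_audio_chunk(chunk, wake_word_detected):
    """Append the newest chunk; discard everything unless a wake word fires."""
    ring.append(chunk)
    if wake_word_detected:
        utterance = list(ring)  # hand the recent audio to the recognizer
        ring.clear()            # then drop it -- nothing is written to disk
        return utterance
    return None
```

The point of the sketch is the `maxlen` bound: a well-behaved listener structurally cannot retain "the last few days" of audio, which is exactly why finding a persistent database would be damning.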
In the case of Apple, fans actually boast about the huge profit margins on each phone, and the fact that you can't do anything Apple doesn't allow is viewed as Apple protecting you.
True. However, if you can't do it, the carriers can't do it to you either; Apple's control had a lot to do with keeping out carrier bloatware. We can fix the abused Android phones, but consumers would rather the phone be usable out of the box.
This is very, very slowly getting through to the managers, though.
I had a boss not too long ago who simply assumed that everyone who ever bought a product wanted to get our newsletter. I warned him that we might end up on blacklists; he chose to belittle me as a scaredy-cat and ignore me.
Last I heard, he's fighting a losing uphill battle to get off the various spam blacklists because NONE of his emails reach their recipients anymore, and he's noticed that it doesn't build trust in a company when you have to phone a prospective business partner with a commercial spam filter to tell him to dig through his spam folder for your mail.
Unfortunately most businesses seem to realise this is going to be a problem, and rather than not sending spam in the first place, they just ensure it comes from different mail servers and a different domain from their normal operations.
If you are a business you HAVE to. From the start I made my mailing list completely opt-in. That doesn't stop AOL users from using the spam button instead of the prominent link at the top that gracefully removes them from the list. You can't have customers not receiving order confirmations or order updates, or have your business email blackholed, because some webmail users decide they don't want your mail anymore.
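One concrete way to make that graceful removal visible to mail clients is the RFC 2369 `List-Unsubscribe` header, which lets webmail providers offer an unsubscribe action alongside (or instead of) the spam button. A minimal Python sketch; the addresses and URL are made up for illustration:

```python
from email.message import EmailMessage

# Sketch of an opt-in list mail that advertises a graceful unsubscribe path.
msg = EmailMessage()
msg["From"] = "news@example.com"
msg["To"] = "customer@example.net"
msg["Subject"] = "Monthly newsletter"
# RFC 2369: mail clients can surface these as a one-click unsubscribe,
# reducing the temptation to hit "report spam" instead.
msg["List-Unsubscribe"] = (
    "<mailto:unsubscribe@example.com>, "
    "<https://example.com/unsubscribe?id=123>"
)
msg.set_content("Hello!\n\nTo stop receiving these, use the link at the top.")
```

Whether a given provider honours the header varies, but it costs one line per message and gives the spam-button crowd an easier exit.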
Almost no one can hear a difference between lossless and any of the codecs at high bit rates (256 kbps+).
I wonder how many of the "I can hear the difference" crowd are comparing old MP3s to lossless rips. I can hear the difference between my old MP3s and modern LAME-encoded versions of the same source. Can I tell the difference between modern high-bitrate LAME MP3s and FLAC? Only when I know ahead of time which is which!
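The honest way to settle "which is which" is a blind ABX test: you hear A, B, and a hidden X that is a copy of one of them, and must say which one X is. Over enough trials, guessing averages 50%. A toy sketch of the trial and scoring logic; the file names are purely illustrative:

```python
import random

# One blind trial: X is secretly a copy of A or B, chosen at random.
def abx_trial(rng, a="track_flac.wav", b="track_mp3.wav"):
    x_is_a = rng.random() < 0.5
    x = a if x_is_a else b
    return a, b, x, x_is_a

def run_abx(guesses, trials=16, seed=42):
    """Score a listener's guesses ('a' or 'b') against the hidden X picks."""
    rng = random.Random(seed)
    correct = 0
    for g in guesses[:trials]:
        _, _, _, x_is_a = abx_trial(rng)
        if (g == "a") == x_is_a:
            correct += 1
    return correct
```

A listener who genuinely hears the difference should score well above 8/16; someone relying on knowing the labels ahead of time hovers around chance.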
Warning: pure speculation follows, based on a very brief time working in the games industry.
The PS2 was notoriously difficult to utilise compared to the PS1 and the Dreamcast, but over time it managed to hold its own against the more powerful GameCube and Xbox. At the risk of hugely oversimplifying, what let the PS1 hold on so long was its dedicated vector processor, which meant the competition's (N64, Saturn) faster processors mattered much less. (The N64 used the main CPU for just about everything, which made its 90-odd MHz MIPS much less impressive.) The PS2 architecture was an evolution of the PS1, adding more dedicated vector units rather than going the T&L GPU route that was just about to hit the big time.
The PS3 swapped the vector processors for the Cell, an obvious next step for that lineage. However, all ATI and nVidia GPUs have their own vector-processing capabilities, and I'd imagine the cost of developing a custom PS3 GPU that gave proper emphasis to the Cell was HIGH, when a CHEAPER GPU must have been the intention. So the Cell became half redundant. And with all the compromises made to get costs down, the PS3 wound up with too much power in one narrow field, limited memory bandwidth, no unified memory, and a weaker GPU than the 360.
The 360 used a plain architecture that could be leveraged relatively easily from the get-go, but had a lower ceiling for hidden magic. The PS3 was designed with the potential to blow it out of the water, but in reality no one has found any hidden stores of power. Much like the Itanium, it was only better in theory; in practice it was a struggle even to match the competition. The end result is that developers have to work harder just to match the 360, excepting a small number of rendering effects that are easier on the PS3.
Contrast with the Mac's F9, F10, F11 and F12 keys. If your program just happens to use one of those keys, you're shit out of luck (as is the case when trying to debug something in Visual Studio in a virtual machine, for example).
You can use Cmd-F9/10/11/12 to avoid the Exposé stuff. OS X sees that as a different combination, so it doesn't fire Exposé, but VMware passes the F-key unmodified to the VM. That seems like an oversight, but it has got me out of a number of jams. If you're not using VMware, YMMV.