There is a different effect for TVs: if you sit close enough that the eyes' focus is not set to infinity, then more of the TV will be in focus when it's curved. This effect applies only to TVs, which sit closer than the far point of the eye, and not to cinema screens, which are further away. The optimal curvature depends on the distance from the TV.
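A rough geometric sketch of why this helps, assuming the ideal case where the screen's curvature radius equals the viewing distance (the screen width and distance below are made-up illustration numbers):

```python
import math

# Distance from the eye to the edge of a flat vs. a curved screen.
# Assumes a 1.2 m wide screen viewed from 2 m, centred on the eye;
# the curved screen's curvature radius equals the viewing distance,
# so every point on it is exactly 2 m from the eye.
viewing_distance = 2.0   # metres (assumed)
half_width = 0.6         # metres (assumed)

flat_edge = math.hypot(viewing_distance, half_width)
print(f"flat screen edge:   {flat_edge:.3f} m")        # ~2.088 m, ~9 cm further
print(f"curved screen edge: {viewing_distance:.3f} m")  # constant 2.000 m
```

At close range that ~9 cm difference can exceed the eye's depth of field, which is why the edges of a big flat screen can look slightly soft when the centre is in focus.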
The principle is the same as VNC's, but the leap in technical sophistication is huge.
There will probably be some degradation of quality. From bandwidth concerns alone, there's no way they could stream uncompressed 1080p@60Hz; that would require roughly 3 Gbit/s. By using something like 50 Mbps they could get better quality than the ~8 Mbps we see on high-quality TV streams, and could spare some CPU power by encoding less efficiently (also: decoding video requires power on the client).
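The arithmetic behind the ~3 Gbit/s figure, assuming 24-bit colour and no chroma subsampling:

```python
# Back-of-the-envelope check of the uncompressed-bandwidth claim.
# Assumes 1920x1080 pixels, 24 bits per pixel, 60 frames per second.
width, height = 1920, 1080
bits_per_pixel = 24
fps = 60

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 1080p@60Hz: {raw_bps / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

# Compression ratios implied by the stream rates mentioned above:
print(f"50 Mbps stream: {raw_bps / 50e6:.0f}:1")  # ~60:1
print(f" 8 Mbps stream: {raw_bps / 8e6:.0f}:1")   # ~373:1
```

A ~60:1 ratio is easy for a modern codec; ~373:1 is where the visible artifacts on TV streams come from.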
In principle I'd expect the clients to have problems displaying the video (though this seems to be solved if they're actually releasing it). Many low-end systems can't decode HD streams in real time on the CPU alone and rely on hardware acceleration. There's a lot that can go wrong when displaying high-quality video streams on Linux: tearing, stuttering, A/V sync, etc.
It's a neat idea, but when I move (which is soon), I'll still prefer to pull a long-ish DVI (or DisplayPort, if I can get a 4K monitor) and USB cable to keep my gaming rig in a different room.
Kind of ironic how badly the IP video connection sucks, for someone advocating full reliance on the network.
Peterson has a point: some admins refuse to even consider the cloud as an option. The "cloud hugger" metaphor is wrong, though. The cloud is not a more efficient, more performant, cleaner version of the local server (sure, horses had advantages over cars too, but none related to the main purpose, transportation). The cloud is a different thing altogether, like an airplane vs. a car. A good admin needs to decide, case by case, whether outsourcing operations makes sense, factoring in the costs (and hope management trusts that decision). It's easy to take too much pride in one's craft and insist on perfect solutions when the business may only need a fairly good one.
At the infrastructure level, using "cloud" tools (i.e. virtualisation and management tooling) is reasonably safe. These are reasonably portable across the remote/on-premises boundary, though porting requires some effort.
At the application level, if the plan is to use cloud tools exclusively, it's easy to end up with inconvenient workflows or stuck with one provider. Interoperability between applications is sparse. Many cloud applications provide APIs, sure, but if you need a server to call those APIs and synchronise data across providers, and the business becomes reliant on those scripts, have you really gained anything..?
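A sketch of what those scripts tend to look like; every provider, endpoint shape, and field name here is a hypothetical placeholder, not a real API:

```python
# Glue code of the kind the comment warns about: a scheduled job that
# pulls records from provider A and pushes them to provider B.
# Record shapes and field names are invented for illustration.

def to_provider_b(record: dict) -> dict:
    """Translate a (hypothetical) provider-A record into provider-B's shape."""
    return {
        "external_id": record["id"],
        "name": record["customer_name"],
        "email": record["email"].lower(),
    }

def sync(fetch_a, push_b) -> int:
    """Fetch from A, translate, push to B; returns the record count.

    fetch_a and push_b would wrap the two providers' HTTP APIs in real
    life; that wrapping is the part the business quietly depends on."""
    records = [to_provider_b(r) for r in fetch_a()]
    push_b(records)
    return len(records)

# With stub transports the job itself is trivial; the hidden cost is
# operating, monitoring, and fixing it forever.
demo = sync(lambda: [{"id": 7, "customer_name": "Ada", "email": "A@X.COM"}],
            lambda recs: None)
print(demo)  # 1
```

Once something like this runs on a schedule, you are effectively operating a small integration server anyway, so the ops work hasn't actually gone away.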
Episode I was visually well done and a decent stand-alone adventure film. The original trilogy somewhat fails to engage me emotionally or intellectually, and Episodes II and III were just boring. Sorry for the flamebait; this is my opinion, though maybe slightly exaggerated. I probably won't bother replying ;)
Absolutely agree about quality. [I'd basically gone legal, but due to my financial situation I've been on a bit of a torrenting spree recently. I always get the straight Blu-ray rips if available. Storage is cheap and easy to manage...]
Just to add to the point about streaming: not many people have tens-of-megabit connections, and it would also be quite expensive for the streamers to serve that quality. If you can fit maybe 20 streams on a gigabit NIC, imagine the number of servers they'd need. It's not even clear the economies of scale would work out for them on the technology side: it depends on where the future improvements land (bandwidth, storage, etc.)
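Back-of-the-envelope, using the 50 Mbps figure from above (the viewer count is a made-up illustration):

```python
# Rough server-count estimate for streaming at higher bitrates.
# Assumes 50 Mbit/s per stream and one 1 Gbit/s NIC per server;
# the audience size is invented, and real CDNs use faster NICs
# plus edge caching, so treat this as an upper-bound sketch.
nic_mbps = 1000
stream_mbps = 50
concurrent_viewers = 1_000_000  # hypothetical peak audience

streams_per_server = nic_mbps // stream_mbps
servers_needed = -(-concurrent_viewers // streams_per_server)  # ceiling division
print(streams_per_server)  # 20 streams per gigabit NIC
print(servers_needed)      # 50000 servers for a million viewers
```

Even if the real numbers are off by an order of magnitude, the point stands: per-viewer bandwidth cost scales linearly with bitrate, and someone has to pay for it.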
So that's what is taking so long when starting Dota. I was wondering what part of loading a game could max out a thread on the CPU.
As an example, the time from starting Dota 2 until the time actually being within the game is reduced by about 20 seconds on an Intel system.
A WTF comment if I ever saw one. One would prefer at least two numbers to judge how big the improvement is; even a percentage would be better. On my Intel system Dota 2 now takes about 15 seconds to load. And what's with the pointless Intel name-drop anyway?
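For what it's worth, combining the quoted ~20 s saving with my ~15 s load time gives an implied percentage, assuming (a stretch) that the saving applies equally on my machine:

```python
# Implied improvement: ~20 s saved, ~15 s remaining after the change,
# so the pre-change load would have been ~35 s. Both inputs are rough.
saved_s = 20
now_s = 15
before_s = now_s + saved_s

reduction_pct = 100 * saved_s / before_s
print(f"~{reduction_pct:.0f}% faster load")  # ~57%
```

Which is exactly the kind of number the announcement could have given directly.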
Caching seems like a better solution to me, but multithreaded compilation is also good. Well done, Valve.
Percentage of online communication... by number of bits, torrents dominate, plus some HTTP downloads, and those are not encrypted. By my attention, text-based communication dominates, and there I'm probably up around 50%.
I agree that hybrid storage is great, but it can "easily" be done in software (there are a couple of projects for Linux, like bcache, as well as ZFS, and there's an Intel driver for Windows). Then you can pick the sizes of the SSD and HDD at will, and optionally make RAIDs of the HDDs and SSDs to mitigate the increased failure probability.
When multiple drives aren't an option, as in laptops, the problem with hybrids is that you lose the non-performance advantages of an SSD: low power usage and durability. The controllers could improve on this by spinning down the hard drive and doing more write-back caching, but current hybrids lose on these points. (My laptop has a 256GB SSD, which I find about a factor of 2 too small. I can't sync my
How is this an advantage?
Assume for a moment that it's harmful if the data (IP addresses, timestamps, unique IDs, etc.) gets shared with the world. The data was previously inaccessible due to technology; now it's limited only by the policies of the holding company. Some people don't trust those policies (or the company's security) as much as they trusted the old model.
So is it harmful? The timestamp/IP combos place you at a given place (most likely home) for a period of time. Dozens of other companies have the same data, so unless you're being super cautious, it's not worth worrying about Steam in particular. If such data were searchable online, though, it could cause problems with burglaries and with lying about one's whereabouts. The fact that they record "playing a game", not just being on Steam, doesn't help with accuracy: some people just leave their game open while doing other things. The chat histories are similar to what any other chat provider keeps.
Steam *could* do some nefarious things with the gaming-specific data; for example, they could sell it to employers and others who might be interested. The information isn't enough to ruin a life, though, and if they used it for anything more evil than advertising they would lose all consumer trust.
By the way, I assumed they were using "trolling" in the new, incorrect sense of being a mean asshole (in the chat, etc.). It's not really clear from the article what they actually mean by it.
Ensure that people need each other. If people can treat others like an expendable commodity, they will treat each other as such.
Works both ways. In games like LoL, players really do need each other: it's 5v5 PvP, frequently with random people. That can mean a lot of more and less justified grief between teammates. However, if it weren't as easy for one player to screw it up for the whole team, maybe it wouldn't be as fun...
Good idea, but I hope they keep all the existing systems in place and make this optional. Graphics drivers are massively complex and are probably a significant source of oopses. If displaying a QR code means the kernel needs to interact more with the drivers, and (oh god, I hope not) change the resolution to display it, then I expect more failures. People can photograph crash messages on an 80x25 character console anyway, so let's not break that.