Latency? (Score: 2)
How will this not be nausea-inducing if the Wi-Fi connection alone adds a 20-30ms delay, plus whatever additional delay comes from the protocol, encoding/decoding, etc.?
Why does Sony even exist in the console market? It's high cost / high risk. Their core business is still insurance.
Nearly 1/3 of Sony's total revenue comes from the Game & Network Services division as of 2022. About 1/6 of their game sales revenue from the same period came from first-party exclusives.
They've never been able to compete on FPSes. [...] Sony doesn't have anything nearly that memorable (Sorry Ratchet & Clank fans, I love it too, but the movie was the blandest and dullest thing I've ever seen...). Sony's gonna get boxed in and shut down.
Right, it's not like they have a critically acclaimed, hit prestige TV show based on one of their first-party properties, or an exclusive super hero franchise that's moved over 33 million units with a highly anticipated sequel on the way this fall, or an instantly recognizable character who's been burning up the sales charts since his PS2 debut and has an Amazon Prime TV show in the works.
There's more, of course, but I think that's already enough to show that you have no idea what you're talking about.
Yep. Microsoft and Activision are both wretched hives of scum and villainy, but if you didn't buy a load of ATVI in the $60s or low $70s in anticipation of this merger going through at $95, *you're* the one losing out.
Oh great, you've just given us the plot for Terminator 7: the real Arnold Schwarzenegger has to be sent back in time to kill James Cameron before he can make the Terminator movies in the first place, to prevent the rise of an actual Skynet. (Echoes of Wes Craven's New Nightmare...)
Machine learning is, by its very nature, unreliable and ethically ignorant. It is the apotheosis of increasing laziness and impatience in software development circles, where people have literally thrown their hands up, tossed all the data at the machine, and said they don't care HOW a problem gets solved, as long as the machine can do SOMETHING to the data to make it pass a very limited set of tests. As long as it can step over that extremely low bar, it will be considered good enough. When edge cases are inevitably found, they'll just get added to the test case set, the model will be retrained, and entirely new ways for the ML algorithm to fuck up will be created, continuing a never-ending game of whack-a-mole that can literally never perfect itself over any decently sized problem space.
In practice, this means that ML should never be the determining factor when the potential consequences of an incorrect decision are high, because incorrect decisions are GUARANTEED, and its decision making process is almost completely opaque. It shouldn't drive a car or fly a plane, control a military device, perform surgery, design potentially dangerous chemicals or viruses, or try to teach people important things. It shouldn't process crime scene evidence, decide court cases, or filter which resumes are considered for a job. It's perfectly fine for low-stakes applications where no one gets hurt when it fucks up, but we all know it will get used for everything else anyway.
Imagine what happens once you have dozens of layers of ML bullshit making bad decisions with life-altering or life-ending consequences. Depending on how far we allow the rabbit hole to go down, that could very well lead to an apocalyptic result, and it could come from almost any angle. An auto-deployed water purification chemical with unintended side effects. Incorrect treatment for a new pandemic pathogen. Autonomous military devices going rogue. All things are possible with the institutionalization of artificial stupidity by people who don't understand its limitations.
Of course we should start regulating the shit out of this right now. And, of course, we will obviously NOT do that until the first hugely damaging ML fuckup happens.
Despite all wish-fulfillment reporting to the contrary, this deal is going through. The regulatory bodies have already left large holes for Microsoft to drive through to obtain approval, and Nintendo's willingness to sign a 10-year deal to keep Call Of Duty on their platforms is the nail in the coffin of Sony's counterargument.
Is this deal bad for gamers? Yes. Is it monopolistic? No. Is it anti-competitive? Sort of, but not outrageously so.
If you want to drown your sorrows, buy Activision stock in the $70s and enjoy your $95/share acquisition price when the deal inevitably closes. Then spend your $15+/share profit on games from other companies if it makes you feel better.
"You know what, Stuart? I like you. You're not like the other people, here, in the trailer park."
Functional languages are great for highly parallelized computation because well-behaved functions don't have side effects, and that eliminates entire classes of race condition and locking issues. Functional languages are also good for abstracting higher-order patterns, since you can manipulate data and other functions without having to care about their lower-level details. Specific functional languages like Scheme, where code and data share the same form, are great for creating custom syntactic structures and writing self-modifying code.
If you aren't doing any of those types of things, then functional programming may seem pointless to you, and that's OK. But every developer should be aware of the tools available to them in case the day comes when they might benefit from using them.
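To make the first two points concrete, here's a minimal sketch in Python (function names are mine, purely illustrative): pure functions whose outputs depend only on their inputs, so calls can be reordered or parallelized without locks, plus a higher-order `compose` that builds new functions without caring about their internals.

```python
from functools import reduce

def double(n):
    # Pure: no shared state is read or written, so this call can run
    # on any worker, in any order, with no race conditions.
    return 2 * n

def compose(f, g):
    # Higher-order: manipulates functions as values without needing
    # to know anything about their lower-level details.
    return lambda x: f(g(x))

inc_then_double = compose(double, lambda x: x + 1)
print(inc_then_double(3))  # prints 8

# Pure functions also combine cleanly with map/reduce pipelines.
print(reduce(lambda a, b: a + b, map(double, [1, 2, 3])))  # prints 12
```

Nothing here is exotic; the point is that once side effects are gone, the runtime (or a thread pool) is free to evaluate the mapped calls in any order it likes.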
When there's a client involved and significant money, "cool" and "new" are only good if they actually help sell and maintain the software. The client doesn't care about [...] lambda expressions.
Lambda expressions are in fact cool, but anyone who thinks they're new must have been in a coma since the 1930s.
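Indeed, Alonzo Church's lambda calculus dates to the 1930s, and its notation survives nearly verbatim in modern languages. A quick Python illustration (the curried style below is the classic lambda-calculus form):

```python
# A curried two-argument function written lambda-calculus style:
# a function that returns a function.
add = lambda x: lambda y: x + y

print(add(2)(3))  # prints 5
```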
She sells cshs by the cshore.