Universal believes that Spotify is directly hurting sales at stores like iTunes.
Universal's belief is almost certainly correct to some extent, but is that a bad thing? True fans, I think, would find other ways of supporting the artists they love, and I'd guess the ones who do nothing but stream wouldn't have spent money on music in the first place.
Streaming over the internet is okay, but it's SO dependent on your connection quality (and your bandwidth limits). It can work, though, obviously.
Maybe it'll work in the future, but it's a pretty poor experience right now.
I have the original NVIDIA Shield, the one that looks like a 360 controller with a screen strapped to the top. Late last year they announced a free trial for their GRID cloud gaming service. One caveat was that their servers were all in San Jose, and it warns you if you're too far away. I tried it from my home in Illinois, and it was predictably horrible with a ~70ms ping. I tried it again from California and it was only slightly less horrible with a ~20ms ping.
Driving games become drunk-driving games. Another driver comes in and hits you? Good luck recovering. Forget that there's a turn at some point in the track? You'll never react to it in time. Things that require constant micro-adjustments like drifting are virtually impossible.
Fighting games become button-mashers because you can't react fast enough to block or counter-attack.
Seriously, these were launch titles! I assume 99% of testing happened with local-network latency. If I were the guy at NVIDIA who okayed go-live, I'd be deeply embarrassed.
The only thing I'd use it for right now might be turn-based strategy games, or other things where latency really has no effect on gameplay.
I'm sure many devs have had jobs where they're working on some sort of killer automation. Something that makes them look out into a sea of office workers thinking "by end of year, we'll only need half of you..."
They're jobs that technology has long since claimed, yet they still exist. Nothing's perfect. It'll be a slow road.
This makes sense for a couple reasons.
First, abusing goto really serves no one. It doesn't make code quicker to write. It certainly doesn't make it easier to understand. There is no benefit to it.
Second, I'd argue that very few people want to write new code in C these days. Those who do have specific reasons for it and are probably a bit more experienced or passionate and thus aren't the kinds of people who'd readily abuse things. The ones who would are going to be mostly attracted to easier high-level languages that don't allow the abuse in the first place.
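For contrast, the one use of goto that most C programmers do accept is the forward-jumping error-cleanup path: jump only downward, to a single exit ladder that unwinds resources in reverse order. A minimal sketch (the function name and buffer size are my own, not from any particular codebase):

```c
#include <stdio.h>
#include <stdlib.h>

/* The widely accepted, non-abusive use of goto in C: forward jumps
 * to a single cleanup ladder, so every error path releases exactly
 * the resources acquired so far. Returns 0 on success, -1 on error. */
int process_file(const char *path)
{
    int ret = -1;
    char *buf = NULL;

    FILE *f = fopen(path, "rb");
    if (!f)
        goto out;                 /* nothing acquired yet */

    buf = malloc(4096);
    if (!buf)
        goto close_file;          /* only the FILE needs closing */

    if (fread(buf, 1, 4096, f) == 0)
        goto free_buf;            /* both resources need releasing */

    ret = 0;                      /* success */

free_buf:
    free(buf);
close_file:
    fclose(f);
out:
    return ret;
}
```

The abuse the comment describes is the opposite pattern: backward jumps and goto-driven control flow that a loop or function call would express more clearly.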
Do you happen to have any reference numbers or links so I can argue with the dealer mechanics about getting the update?
The easiest way to get the ECU update is the Idle dip TSB, which you're likely also experiencing. This'll update you to version B01, which includes all prior fixes. Print it out and bring it with you.
It does have some advantages. I got the Scion FR-S the day it came out. The original firmware had a number of small issues and one very serious one.
At a specific load and intake volume, the car wouldn't push enough fuel. The mixture ran dangerously lean, and it was found that those who stayed at that point for too long would suffer a catastrophic failure: the direct injector seals would melt, necessitating a full block replacement.
An ECU update came out a while later that fixed it, but nobody was notified. Cars coming in for service don't get it automatically -- the techs aren't even told about it. 99% of those original cars remain unupdated. Anyone who chooses some "spirited" driving on a hot day is at risk.
An OTA update would solve issues like this really smoothly for a lot of people. I'm all for it.
The latest generation of CPUs has instructions to support transactional memory.
Near-future CPUs will have a SIMD instruction set taken right out of GPUs that lets you conditionally execute without branching.
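As a rough illustration of what "conditionally execute without branching" means, here's a scalar C sketch (function name is my own) of the mask-and-blend trick that GPU-style predication and masked SIMD instructions are built on: compute an all-ones/all-zeros mask from the condition, then blend both results with bitwise ops instead of a jump.

```c
#include <stdint.h>

/* Branchless select: returns the larger of a and b with no
 * conditional jump. A comparison in C yields 0 or 1; negating it
 * gives an all-zeros or all-ones mask, which blends the two values.
 * Real SIMD hardware does the same thing across a whole vector
 * register, one mask bit (or lane) per element. */
static int32_t select_max(int32_t a, int32_t b)
{
    int32_t mask = -(int32_t)(a > b);   /* 0x00000000 or 0xFFFFFFFF */
    return (a & mask) | (b & ~mask);    /* pick a or b, no branch */
}
```

The point is that both "sides" are evaluated and the mask decides which result survives, which is exactly why divergent branches are cheap to express (though not free to execute) on GPUs.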
Makes me wonder if other astronomers or scientists who discovered celestial objects will have their ashes sent up in homage...
It's a romantic notion, but strikes me as not really in the spirit of science. If I knew someone was going to explore this awesome thing I discovered, I would much rather have them use every bit of available weight to further that discovery.
Extracting a character - trivial. Length of string - trivial.
I don't think it's quite as simple as you think. UTF-8 is a variable-length encoding, but UTF-32 is too when you consider grapheme clusters.
When you extract characters and determine length, are you only talking about code points (not very useful), or are you taking into consideration combining characters to account for the actual visible glyphs most people would consider to be a character?
The overwhelming majority of apps are only doing trivial operations -- string concatenation and shuffling bits to some API to display text. For these apps, choice of encoding really does not matter. NetHack is very likely in this category.
Anything more and you'll have to deal with variable-length data for both UTF-8 and UTF-32. So it doesn't really matter. Choose whichever uses less storage space.
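To make the "trivial operations stay trivial" point concrete: even though UTF-8 is variable-length, counting code points only requires skipping continuation bytes, which all match the bit pattern 10xxxxxx. A minimal sketch (my own function, assuming valid UTF-8 input):

```c
#include <stddef.h>

/* Count Unicode code points in a NUL-terminated UTF-8 string by
 * counting only non-continuation bytes (continuation bytes look like
 * 10xxxxxx, i.e. (byte & 0xC0) == 0x80). Note this counts code
 * points, not grapheme clusters: "e" followed by a combining accent
 * still counts as two. Assumes well-formed UTF-8. */
size_t utf8_codepoints(const char *s)
{
    size_t n = 0;
    for (; *s; s++)
        if (((unsigned char)*s & 0xC0) != 0x80)
            n++;
    return n;
}
```

The combining-accent caveat in the comment is the grandparent's point: this count is easy to get, but it still isn't "number of things a user sees", in UTF-8 or UTF-32.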
As for which implementation of UTF to use, I'd go with UTF-8 as it seems to have the widest adoption, or UTF-32 because that will probably allow you the longest time before having to think about this again. I would avoid the middle ground.
UTF-8, originally defined up to 31 bits and now restricted to 21 bits, actually has room to trivially extend up to 43 bits. One could say it's more future-proof than UTF-32. Not that it really matters -- we're only using 17 planes right now, so I doubt we'll ever get past 21 bits. Maybe when we encounter intelligent alien life.
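The extensibility comes from the leading byte: its run of high 1-bits encodes the sequence length, so longer leading bytes can buy more payload bits without disturbing the shorter forms. A sketch of the standard RFC 3629 encoder (my own function name; it omits the surrogate-range check for brevity) makes the pattern visible:

```c
#include <stdint.h>
#include <stddef.h>

/* Encode one code point (up to U+10FFFF, the current 21-bit limit)
 * as UTF-8. Each arm shows the leading-byte pattern and payload
 * width; the same scheme extends mechanically to longer sequences.
 * Simplified sketch: does not reject surrogates U+D800..U+DFFF.
 * Returns bytes written, or 0 if the code point is out of range. */
size_t utf8_encode(uint32_t cp, unsigned char out[4])
{
    if (cp < 0x80) {                       /* 0xxxxxxx: 7 bits */
        out[0] = (unsigned char)cp;
        return 1;
    } else if (cp < 0x800) {               /* 110xxxxx 10xxxxxx: 11 bits */
        out[0] = (unsigned char)(0xC0 | (cp >> 6));
        out[1] = (unsigned char)(0x80 | (cp & 0x3F));
        return 2;
    } else if (cp < 0x10000) {             /* 1110xxxx + 2 cont.: 16 bits */
        out[0] = (unsigned char)(0xE0 | (cp >> 12));
        out[1] = (unsigned char)(0x80 | ((cp >> 6) & 0x3F));
        out[2] = (unsigned char)(0x80 | (cp & 0x3F));
        return 3;
    } else if (cp < 0x110000) {            /* 11110xxx + 3 cont.: 21 bits */
        out[0] = (unsigned char)(0xF0 | (cp >> 18));
        out[1] = (unsigned char)(0x80 | ((cp >> 12) & 0x3F));
        out[2] = (unsigned char)(0x80 | ((cp >> 6) & 0x3F));
        out[3] = (unsigned char)(0x80 | (cp & 0x3F));
        return 4;
    }
    return 0;
}
```

Each extra continuation byte adds 6 payload bits while the leading byte loses one, which is why the scheme keeps stretching if you let the leading byte grow.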