Comment Re:CDMA won the GSM vs CDMA standards war (Score 4, Informative) 84

I hardly know where to begin. You're confusing standards with modulation techniques. You're also confusing GSM the standard (1991, TDMA, voice with GPRS and then EDGE) with GSM the "class" (which includes UMTS, HS(D)PA(+), and LTE). The latter is a set of standards defined by the 3GPP, whose scope now includes the maintenance of the original GSM standards.

CDMA is a modulation technique (actually a "channel access method" - basically a way to share the medium, as opposed to an actual encoding). Other modulation techniques are AM, FM, QAM, COFDM, and OFDMA (OFDMA is one channel-access version of OFDM - 802.11g uses OFDM with CSMA/CA instead). There's a "class" of standards built on IS-95 (you may remember it as cdmaOne) that includes CDMA2000, 1xRTT, and EV-DO. These did pioneer the use of CDMA for cellphones, but everything uses CDMA nowadays, and GSM (the lineage) has used CDMA (W-CDMA) since UMTS.
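To make "a way to share the medium" concrete, here's a toy sketch of code-division multiple access - two users transmitting simultaneously on the same channel, each recoverable by correlating against its own orthogonal spreading code. This is a minimal illustration of the principle, not real W-CDMA (which uses much longer codes, scrambling, and power control):

```python
# Toy CDMA: two users share the channel at the same time by spreading
# their bits with orthogonal codes; each receiver despreads with its own
# code and the other user's signal cancels out.

CODE_A = [1, 1, 1, 1]     # orthogonal Walsh codes: dot product is zero
CODE_B = [1, -1, 1, -1]

def spread(bit, code):
    """Map bit {0,1} to a symbol {-1,+1} and multiply by the code."""
    symbol = 1 if bit else -1
    return [symbol * c for c in code]

def despread(signal, code):
    """Correlate the combined signal against one user's code."""
    corr = sum(s * c for s, c in zip(signal, code))
    return 1 if corr > 0 else 0

# Both users transmit at once; the shared medium simply adds the signals.
combined = [a + b for a, b in zip(spread(1, CODE_A), spread(0, CODE_B))]

assert despread(combined, CODE_A) == 1   # user A's bit comes back out
assert despread(combined, CODE_B) == 0   # so does user B's
```

The orthogonality is the whole trick: correlating against CODE_A zeroes out anything sent with CODE_B, which is why everyone can talk at once.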

The point is, in non-RF cellphone usage, the antonym to CDMA is not TDMA, but GSM. And GSM the lineage has very much won the standards war with LTE. Over 90% of devices in the world use GSM-lineage standards, including most Verizon and Sprint devices (which are right at home on LTE). Eventually the legacy IS-95 derived standards will be completely turned off and the US will have gotten over its weird not-world-standard fetish, at least for cellphones.

Comment Re:But Apple has made life better for you (Score 1) 311

Er, but they could've done that already. Lightning audio output already works, doesn't it?

Yes, it does; Lightning audio output has worked since iOS 5.1.1, and plenty of accessories already use it.

Anyways, Apple's built-in DACs are widely known for being better than almost anyone else's. I'm not an audiophile, and I never had to worry about whether the random headphones or stereo system or speakers I had on hand had a quality DAC, but now I do - and it'll cost more to boot (especially for a mediocre one, let alone one as good as the one Apple used to have).

Comment Re:Or the actual reason(s) (Score 5, Insightful) 761

I don't doubt that those are the actual reasons, but that's not really the point. All it means is that they're pushing off their (engineering) problems on their users. Apple has a long history of deprecating stuff that (at the time) people thought was premature - but in essentially all the other cases it turns out that the new thing really is better. Serial ports, the floppy drive, non-USB connectors, CD drives in laptops, even replaceable batteries - there are tangible benefits to switching to the new thing, and they usually relate to speed, capacity, or physical size.

The headphone jack is slightly thicker than a Lightning connector (the only remaining jack) - but they didn't make the phone thinner to take advantage of the reclaimed depth. And other than the connector itself, a Lightning headphone is worse in every way, because sound quality comes from the drivers at your ear, not from the phone. The newest, fanciest Lightning headphones in 5 years (assuming this decision sticks) will never be more than today's headphones plus a built-in Lightning dongle.

What does this decision get me as a user? Let's go through. Headphones are headphones; there are two channels of audio driven by a varying electrical signal. I don't really care what the cord to the device looks like, and considerations like "do these headphones work with other things? do other headphones work with this phone?" easily dominate that area. I guess this lets them push a little extra power, but there was already more than enough output to damage your ears. If there were wild battery life improvements... maybe? But someone on the other thread did the math, and a headphone jack's volume of battery is good for ~12 minutes. Meh. What about water resistance? Other phones manage an IP67 rating with a headphone jack just fine - I have no doubt that dropping it was easier for Apple's engineers, but Apple used to not push its problems on its users.

So what does that leave? They wouldn't be able to have a force-sensitive home button? Honestly I'd rather have the headphone jack. Or just get rid of the home button - it works just fine for Android - or at least make it oblate or rectangular rather than round.

I have had every non-S model iPhone since the 3G, so I'm "due" to buy this one. In addition I have apps that I rely on that only work on iOS. It should be a slam dunk. But... honestly? I knew someday I'd lose the reason to buy an iPhone, and this might be that day. Not just the headphone jack, but the whole package. It doesn't look like a bad phone as such, but the only thing I'm really interested in is the waterproofing. And I'm not careless enough with my phone that getting it ruined is a big risk. The headphone jack thing isn't a dealbreaker, mostly because I don't listen to music much on my phone, but it's damn close.

Honestly Apple is just out of ideas. I bought a new MBP last year and it was the first hardware purchase I made in my entire life that I wasn't excited about. Roughly as functional as the 5-year-old one it replaced, more in some ways and less in others, but the same price. I needed a new one because the older one wasn't really working but boy did they manage to turn something I used to enjoy into something kind of boring and depressing. I'm still annoyed about the large size of the smallest iPhone still available - I was in London a few months ago and had to use my (out of contract and unlocked) iPhone 5, and it was sooooo nice. I assumed I'd gotten used to the wider width, but nope - and I didn't miss the extra screen at all.

Comment Meh, too easy. (Score 1) 45

Now this is cool:

Guy hacked a 1987 arcade game by implementing another Z80 "processor" on an ATmega to share bus-mastering duties with the two already there, periodically poking the RAM to save/restore high scores and tweet them. He made a board that just plugs in between the CPU and the motherboard and gives total Ethernet-ready control. It's easily adaptable to other machines, too.

The rest of the guy's site is neat too, like his hard-disk-controller hack that roots a machine by faking the cached read of /etc/passwd, triggered by writing to a special file.

Comment Re:They don't make disasters like they used to (Score 1) 675

Many large chains will read that info to match you against their databases for marketing purposes just like they do for magstripes (there was never any reason to keep track of any card info).

Do you have a citation for this? I'm pretty sure it's specifically disallowed, which is why all the big stores have rewards programs (because that's the only way they can track you). I can't find any evidence one way or the other.

Comment Re:What's the big problem? (Score 2, Insightful) 675

This is an interesting point. The signature in the US isn't considered an authenticator, it's actually considered agreeing to a contract. If you look at your receipt it probably says "I agree to pay the above amount according to the terms of the cardholder agreement" or something. The idea is (in theory) they could take you to court and say "but you signed a contract saying you'd pay!". If they have someone other than the cardholder in court over that transaction, it's not because of a broken contract - it's fraud.

In Europe, it is considered to be an authenticator, which really slows things down. They do check the signature vs the one on the card. I guess chip-and-signature at least means that someone can't clone your card and use their signature, at least not trivially. They'd have to get your card and then match whatever was on the card, or erase the signature somehow.

Comment Re:Oh please. (Score 1) 675

That is a fair complaint. It's because the chip on the card actually has to know how much the bill will be before it generates a one-time authorization code for that specific amount. Presumably with the magstripe the terminal could let you enter everything, then only at the end talk to the network. Though come to think of it there's no reason you couldn't do that with the chip, just have all the "user interaction" stuff take place during scanning, then leave the card in until the total is rung up. I guess that's either specifically disallowed by the networks, or the manufacturers/stores just figure it would freak people out to leave their card in for a few minutes.

Comment Oh please. (Score 1) 675

It's really not that bad. It takes exactly the same amount of time, the only difference is it feels longer because you have to leave your card in while it authorizes. But there's no extra round-trips or computation or anything - the card gets challenged with the amount, and it generates a one-time code for that amount that gets sent instead of (or alongside?) the card number. For the annoyance of leaving your card in the reader, skimming becomes impossible. I've had my debit card skimmed, which was annoying enough because I was a college student with no money, but then the bank screwed it up and I had to escalate with them to fix it. No more skimming is A-OK with me.
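The "one-time code for that amount" mechanism is why the card has to stay in until the amount is known. Here's a heavily simplified sketch of the idea - real EMV uses 3DES/AES session keys, an application transaction counter, and issuer-defined data, so the key name and format below are purely illustrative:

```python
# Sketch of an EMV-style transaction cryptogram: a keyed MAC over the
# amount and a transaction counter, so the code is bound to this one
# purchase and can't be replayed. (Hypothetical key and format.)

import hashlib
import hmac

CARD_KEY = b"secret-shared-with-issuer"  # illustrative per-card secret

def transaction_code(amount_cents, counter):
    """One-time authorization code bound to amount and counter."""
    msg = f"{amount_cents}:{counter}".encode()
    return hmac.new(CARD_KEY, msg, hashlib.sha256).hexdigest()[:16]

# The terminal must present the final amount before the chip can sign it:
code = transaction_code(1999, counter=42)

# A skimmer who captures the code gets nothing reusable: any change to
# the amount or a replay of the counter produces a code the issuer rejects.
assert transaction_code(2999, counter=42) != code
assert transaction_code(1999, counter=43) != code
```

This is also why skimming stops working: the static card number was the whole secret with a magstripe, whereas here the secret never leaves the chip.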

It must be exhausting to be the author. Going around all day, finding - at best - minor inconveniences to be annoyed about. Not to mention that they clearly didn't go into the article with any kind of an open mind and just found stuff to complain about. No nuance at all. I can't find one valid complaint in the whole article that's not "the software isn't 100% yet" (...sure?) and "some merchants will need new equipment eventually" (it's called a cost of doing business?). And this gets the "utter disaster" label?

The only disaster is that they insisted on chip-and-signature instead of chip-and-PIN. Not only is it less internationally compatible, it's less secure - not that PINs are bulletproof, but chip-and-PIN means the restaurant can't take your card away; they have to bring a reader to the table. I'm still mad about that choice, but it's typical USA, right? Here's this international standard, we'll implement like 80% of it. At least chip-and-signature cards still work in most automated machines in Europe, so it's a small improvement, but I die of embarrassment a little every time they have to call the manager over to interpret this weird new "make them sign the receipt" display and find a pen. Unfortunately the author doesn't even focus on this, beyond a "but the FBI said to use chip-and-PIN and they didn't do it!" line.

Comment Re:"Business People" (Score 1) 192

This is the Keynesian beauty contest:

It is not a case of choosing those [faces] that, to the best of one's judgment, are really the prettiest, nor even those that average opinion genuinely thinks the prettiest. We have reached the third degree where we devote our intelligences to anticipating what average opinion expects the average opinion to be. And there are some, I believe, who practice the fourth, fifth and higher degrees.

Comment Re:Latency vs. Bandwidth (Score 4, Informative) 73

It depends what you mean by fastest. As you note we have a perfectly good word for "the time it takes for a bit to make it out the other end" - latency. Most people probably intuitively associate bandwidth with speed, though, because it's most directly relevant to what they do, which is try to transfer quantities of data. If it takes 1 minute to download a movie on one connection and 10 on another, but both are identical latency, most people will say the former is 10 times faster - because it is, for what they use it for. A gamer who has specific needs might prefer a lower-bandwidth but lower-latency (or jitter) connection, but probably wouldn't call it faster - they'd say it was lower latency because they know most people associate speed with bandwidth. Your dump truck wouldn't be called the fastest, but if the typical person had a mountain of soil they wanted moved and called up the earth-moving companies to give them a bid, the one with the biggest trucks would probably be able to bid the shortest time.
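The movie-download intuition falls out of simple arithmetic: total transfer time is roughly latency plus size divided by bandwidth, so bandwidth dominates bulk transfers and latency dominates tiny ones. A back-of-the-envelope sketch (the link numbers are made up for illustration):

```python
# Transfer time ~= latency + size / bandwidth.
# Illustrative numbers only, not measurements of any real link.

def transfer_seconds(size_bytes, bandwidth_bps, latency_s):
    """First-order model: one-way latency plus serialization time."""
    return latency_s + (size_bytes * 8) / bandwidth_bps

MOVIE = 4 * 10**9    # 4 GB download
PING = 100           # 100-byte game packet

fast = dict(bandwidth_bps=1_000_000_000, latency_s=0.080)  # 1 Gb/s, 80 ms
slow = dict(bandwidth_bps=100_000_000, latency_s=0.080)    # 100 Mb/s, 80 ms

# The 10x-bandwidth link really is ~10x "faster" for the movie...
assert transfer_seconds(MOVIE, **slow) / transfer_seconds(MOVIE, **fast) > 9

# ...but for the tiny packet both links are effectively identical:
# latency swamps the serialization time, so extra bandwidth buys nothing.
delta = transfer_seconds(PING, **slow) - transfer_seconds(PING, **fast)
assert delta < 0.0001
```

Which is exactly the gamer-vs-downloader split above: same links, different workloads, different notions of "fast".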

Of course, if it's a more direct routing, it may indeed be the lowest-latency link between those two points.
