
Comment Re:Just let x86 die, please. (Score 1) 164

The thing that prevents Intel from approaching ARM levels of power consumption is transistor count. Transistors consume power: if your chip needs more active transistors to operate, it will use more power.

ARM processor cores simply require far fewer transistors than an Intel core. The phenomenally complex x86 instruction set necessitates Intel's higher count, and that cannot be avoided without abandoning backward compatibility. To reduce the active transistor count, complex designs can aggressively disable parts of the core that are not in use - although ARM uses this power-reduction technique too.

Intel's only remaining option for making its chips power-competitive with ARM is to use smaller manufacturing processes, since smaller transistors require less power. ARM-based competitors to Intel chips are often two, if not three, generations of manufacturing process behind. Obviously this is not a situation that Intel can count on lasting forever.

Comment Re:Google versus Apple (Score 1) 360

sorry for the lateness of my response.

you're right - with voice recognition, as compared to a conversational user interface, it is important for the recognition system to be able to accept feedback in some way to improve recognition. the voice recognition system built into the iPhone 4S does put a dotted blue highlight underneath words or phrases where it was not confident of the recognition. tapping that produces a pop-up with alternative options - presumably just like Google's. that even happens inside Siri too.

my point however was that whilst this kind of recognition correction is fine for dictation, it's very poor for conversation. so whilst recognition in Siri can be corrected, it will just plough ahead and act on its first recognition result rather than waiting for confirmation/correction. it tends to manage well enough with that first recognition, and this approach doesn't interrupt conversational flow. the fact that its recognition wasn't technically perfect rarely affects results.

your comparison of Graffiti vs. the handwriting recogniser Microsoft used (which was produced by ParaGraph - the same cursive handwriting recogniser that had been in Newton OS) is an interesting one. ParaGraph's recogniser, in the form that it appeared on the Newton, would provide multiple alternative recognition options for everything it recognised, reached by double-tapping on a word, and it would learn from corrections. the Newton comparison gets more interesting when one considers that Newton included an inbuilt Intelligent Assistant which was, in many ways, very similar to Siri (and in some ways more advanced). one interacted with the intelligent assistant via text, usually inputted via handwriting recognition. but the written input was recognised before the user chose to submit it to the assistant - granting an opportunity to correct the recogniser before the assistant processed the input. it was not really conversational in the way that Siri is.

Comment Re:Electricity (Score 1) 434

So your argument is that the US needs to get more innovative before it adjusts its patent system?

Has it not occurred to you that having such a stupidly rigid, overly broad, and inherently game-able patent system inhibits innovation?

Just about every new invention made in the US is already covered by patents. This makes it extremely difficult to bring new inventions to the market, especially for small new companies, since they can, and most likely will, be assaulted by lawyers wielding patents. The risks are greatly increased, and the potential rewards greatly decreased as a result of this. This is not an environment that encourages innovation - it's one that encourages playing it safe.

The incumbents with the power to change things (politicians, lawyers, lobbyists and corporations) don't really want the current system to fundamentally change since such change threatens their power-base.

Maintaining the status quo, which inhibits innovation and over the long term gradually decreases the competitiveness of the US, is of course short-term thinking. But for the most important incumbents wielding power, the politicians and the corporations, the short term is the priority, since they need to win the next election and post profits for the next quarter or year.

Comment Re:Google versus Apple (Score 4, Insightful) 360

The very best voice recognition systems are only about 95% accurate. That recognition system is the grey matter that sits between your ears. We tend to think of our recognition as perfect, but it's really not. We use context to help our recognition. We generally know what subject is being spoken about, we know what words are likely to come next, and we use that information to compensate when we fail to properly recognise words. All this happens so quickly that we don't notice that we have failed to recognise a word properly.

If human beings worked like computers and demanded 100% accuracy of recognition, we'd be continually stopping each other to repeat things. Conversation would be next to impossible. Even when we're not sure we've heard what somebody has said, we rarely ask people to repeat themselves, and usually just rely on having gotten the gist of what was said to us.

As Siri is a conversational interface it does not pop down a list of possibilities, since that would interrupt the flow of the conversation; instead it makes use of context to help improve its recognition. This isn't as simple as a trade-off of (per-word) accuracy vs sophistication of ability - it's a sophistication of ability that attempts to improve the accuracy of the interface. It is not a voice recognition system per se, it's a conversational interface, and they're not the same thing.

Comment Re:Irony (Score 1) 330

Apple's A5 (and previous A4) CPUs weren't off-the-shelf Samsung products. They were Apple-originated designs. (Well, using cores from ARM and Imagination Technologies, integrated by Apple, possibly with some help from other outside companies, some of which are now owned by Apple.)

The CPUs that Apple used for earlier iPhones on the other hand were Samsung products.

Comment Re:Headline allusion error (Score 1) 330

Thing is though, these chips aren't "meant to be used in the USA" at all, but rather in China, and South Korea is much closer to China than the USA is.

China is where A5 chips get put into iPads and iPhones, not the USA. So chips get manufactured in the USA and then shipped to China, and from there get shipped to consumers the world over, not just the USA.

Comment Re:Query Languages (Score 3, Insightful) 421

Only in fantasy-land has SQL implemented "write once, run anywhere". You hint at this problem yourself where you say "with a few mods, it works on nearly any relational database".

Whilst there is an SQL standard, implementations of SQL vary massively. Professionally I've used SQL Server, Oracle, MySQL, PostgreSQL and SQLite - all have profound differences, and moving anything beyond absolutely trivial SQL code from one to another requires rewriting the query. We're talking about a language here where major implementations don't even agree on string concatenation syntax...

Every SQL implementation has its customisations and variations from the standard. It's almost impossible to write any kind of decent SQL code without making use of these custom variations, thus ruining the portability of the SQL code.
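To make the string concatenation point concrete, here's a minimal sketch using Python's built-in sqlite3 module; the comments note the equivalent (non-portable) forms on other databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# SQLite follows the standard and uses || for string concatenation
# (as do Oracle and PostgreSQL):
greeting = conn.execute("SELECT 'Hello, ' || 'world'").fetchone()[0]
print(greeting)  # Hello, world

# The very same trivial query is not portable:
#   SQL Server:  SELECT 'Hello, ' + 'world'
#   MySQL:       SELECT CONCAT('Hello, ', 'world')
# (by default MySQL treats || as a logical OR, unless the
# PIPES_AS_CONCAT SQL mode is enabled)
conn.close()
```

If even this can't be written once for all five databases, anything involving dates, pagination, or auto-increment columns certainly can't.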

Comment Re:Really? (Score 1) 463

Arguably what is being stolen in the case of 'piracy' is not the digital media itself but instead the money that should have been paid for the media.

The problem with digital media is that it is free to duplicate, so there are no replication costs involved. If one only ever looks at the media itself, there's clearly no deprivation of property involved. However, that's not really what copyright is about.

Copyright is about granting and restricting the right to make copies of a work. The intent was to encourage authors to produce new works, by helping to ensure they'd get paid for the works they had produced. The laws were introduced because it was felt that it was too easy to make copies, even though at that time the cost to make copies was high.

The fact that copying is easier and cheaper now than ever doesn't significantly change the arguments for or against the existence of copyright laws. (There remain significant up-front costs involved in producing media, and these costs need to be recouped.) It can, however, make a powerful argument in favour of lower digital media prices.

Obviously the current duration of copyright is ridiculous. Any duration exceeding half of the average expected lifetime of the author is patently absurd. This is why the original laws were for 14 years with an optional 14-year extension - durations that I'm inclined to think should still be perfectly acceptable today.

Quite what effect the fact that copying is now free should have on the duration of copyrights is another question. Copyright durations were already excessively long before digital media arrived.

Comment Re:It's The Standards, Stupid (Score 1) 263

My experience is that DLNA and AirPlay do two very different things.

I have equipment that does DLNA. My TV does it, my desktop computer, my PS3. From my TV, I can browse the DLNA server running on my desktop computer, look through folders and files, and pick something to play - although it usually won't play because the codec/container support in my TV is lousy. Playing things on my PS3 is usually more successful.

I also have equipment that does AirPlay. On my iPhone I can choose to send the audio or video I'm playing to my AppleTV. I can also choose to send audio from iTunes on my PC to a set of remote speakers connected to my amp.

None of my DLNA equipment lets me push media out to a remote device. That may or may not be a feature of the DLNA specifications, but it's certainly not supported by the equipment I have. In practice, with the equipment I have, DLNA is a request-based client-server architecture, while AirPlay is essentially push-based.

There is a fundamental difference between what DLNA and AirPlay are doing.
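The pull-vs-push distinction can be sketched in a few lines of Python. This is purely illustrative - the class and method names are hypothetical, not real DLNA or AirPlay APIs:

```python
class MediaServer:
    """DLNA-style pull: the server passively exposes a library; players ask."""
    def __init__(self, library):
        self.library = library

    def browse(self):
        # the playing device asks "what do you have?"
        return sorted(self.library)

    def fetch(self, title):
        # the playing device asks "send me this one"
        return self.library[title]


class Receiver:
    """AirPlay-style push: the receiver passively waits; senders tell it."""
    def __init__(self):
        self.now_playing = None

    def play(self, media):
        # the device holding the media says "play this"
        self.now_playing = media


# Pull model: the TV/PS3 drives the interaction against the server.
server = MediaServer({"holiday.mkv": "video stream"})
title = server.browse()[0]
media = server.fetch(title)

# Push model: the iPhone/iTunes drives the interaction against the receiver.
apple_tv = Receiver()
apple_tv.play(media)
print(apple_tv.now_playing)  # video stream
```

In the first model the playing device holds all the initiative; in the second the device that has the media does. That inversion is why the two feel so different in use.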

Comment Re:Let this be a lesson (Score 5, Insightful) 312

Value your health. Value your safety.

Accomplish something while you still can, just as Ilya did.

Value your mental health.

Working flat-out at all costs to accomplish something can be extremely detrimental to both your physical and mental health. The line between sane and insane is much narrower than many imagine. Whilst you may write some cool code, what use is that if you end up losing your sanity, or worse, your life?

Comment Re:Reverse engineered Alpha?? (Score 2) 185

Of course, what's daft about this is that there seems to be no evidence that the ShenWei SW-3 is a Loongson/Godson chip. There is nothing to be found on what the CPU's instruction set is, and no evidence that it implements the MIPS instruction set - googling for that only brings you back to this story.

There is some speculation that the ShenWei's CPUs were "inspired" by the DEC Alpha. Quite what that means is anyone's guess. Again, there is very little evidence to be found on this subject, just a blog posting or two.

This seems like a case of lousy, biased reporting.

Comment Reverse engineered Alpha?? (Score 5, Interesting) 185

Why would Loongson/Godson be reverse engineered from a DEC Alpha? It implements the MIPS instruction set, not Alpha. Wouldn't it have been easier for them to reverse-engineer a MIPS chip? Doesn't the evidence seem to indicate that it's a genuinely independent implementation of MIPS?

The only source of this speculation I have found is the ExtremeTech article that has been linked to. My googling turns up nothing else to back it up.

Comment Re:Comment by S. LeBeau Kpadenou (Score 1) 1452

Whilst it's almost certainly true that there are more devices running Linux than any Apple OS, I'm not really sure whether you can say that the success of Linux is because of Stallman, or in spite of him.

The GNU Project started in 1984 with the stated intention of producing a free operating system. Progress on the kernel part of that project, Hurd, has been very slow. GNU/Hurd is the operating system that Stallman set out to build, and even today Hurd remains far from viable for production use.

The Linux kernel is not a product of the GNU Project; it just (eventually) adopted the GPL and remains independent of the GNU Project. Free software and open source existed before the GNU Project began. Without Stallman and GNU, Linus would likely still have released Linux as open source.

Let's face it, GNU is nothing without Linux, and vice-versa...

As for the rest of the laundry list: Safari's engine, WebKit, did indeed start as a branch of KDE's KHTML, but KDE is not GNU (although they do use the LGPL as one of their licenses). The Mac's user interface is not, and never has been, descended from X (and again, X is not GNU, and does not use a GNU-originated license). Also, whilst Android does use some GNU components, the most important parts of the system, i.e. the Linux kernel and the Dalvik VM, aren't of GNU origin (Dalvik uses an Apache license).
