
Comment Re:Bad Site (Score 1) 252

So you can share every movement and temperature fluctuation with your friends on panterest, obviously!

I bought the "Bobby Flay filter" through in-pan purchase, so it always looks like I'm doing something with steak and blue corn and ancho chiles, even when it's really just mac & cheese from a box.

Comment Re:Linus is right (Score 1) 449

The need for massive parallelism will come (already has in the lab) from future applications generally in the area of machine learning/intelligence.

Saying that "single threaded loads" won't benefit from parallelism is a tautology and anyways irrelevant to Linus's claim.

FWIW I'd challenge you to come up with more than one or two applications that are compute bound and too slow on existing hardware that could NOT be rewritten to take advantage of some degree of parallelism.

Comment Re:Let's see how that sounds in 5-10 years time .. (Score 1) 449

Well, there's obviously no need to add more cores/parallelism until there's a widespread need for it (unless you are Chinese, in which case octa-core is a must!), but I think the need is coming pretty fast.

There are all sorts of cool and useful things you can do with high-quality speech, image and other recognition, natural language processing and AI, and these areas are currently making rapid advances in the lab while slowly starting to trickle out into consumer devices (e.g. the speech and natural-language support in both iOS and Android).

What is fairly new is that, in the lab, state-of-the-art results in many of these fields are now coming from deep learning / recurrent neural net architectures rather than traditional approaches (e.g. MFCC + HMM for speech recognition), and these require massive parallelism and compute power. These technologies will continue to migrate to consumer devices as they mature and as the compute requirements become achievable...
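As a minimal sketch (mine, with invented names, not anything from TFA) of why these workloads crave parallel hardware: the core of a neural-net layer is a pile of independent dot products, so every iteration of the outer loop below could run on its own core or SIMD lane.

    // Hedged sketch: why neural-net inference parallelizes so well.
    // Each output activation is an independent dot product over the inputs,
    // so every iteration of the outer loop can run on its own core/SIMD lane.
    #include <cmath>
    #include <vector>

    std::vector<float> dense_layer(const std::vector<std::vector<float>>& weights,
                                   const std::vector<float>& input)
    {
        std::vector<float> output(weights.size());
        for (std::size_t i = 0; i < weights.size(); ++i) {   // independent per-output work
            float sum = 0.0f;
            for (std::size_t j = 0; j < input.size(); ++j)
                sum += weights[i][j] * input[j];
            output[i] = std::tanh(sum);                       // nonlinear activation
        }
        return output;
    }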

Smart devices (eventually *really* smart) are coming, and the process has already started.

Comment Re:Let's see how that sounds in 5-10 years time .. (Score 1) 449

The trouble is that extrapolating the present isn't a great way to predict the future!

If computers were never required to do anything much different from what they do right now, then of course the processing/memory requirements wouldn't change either.

But... of course things are changing, and one change that has been a long time coming but is finally hitting consumer devices is the class of hard "fuzzy" problems: speech recognition, image/object recognition, natural language processing, artificial intelligence... and the computing needs of these types of application are very different from those of traditional software. We may start with accelerators for state-of-the-art offline speech recognition, but in time (a few decades) I expect we'll have pretty sophisticated AI (think smart assistant) functionality widely available that may shake up hardware requirements more significantly.

Comment Re:Linus is right (Score 1) 449

Yeah, parallel computing is mostly hard the way most of us are trying to do it today, but advances will be driven by need, and informed by past failures, not limited by them.

You also argue against yourself by pointing out that CPUs have hit a speed limit - that is precisely why the only way left to increase processing power is parallelism, and it provides added incentive to find ways to make parallel hardware easier to use.

The way massively parallel hardware will be used in the future should be obvious... we'll have domain-specific high-level libraries that encapsulate the complexity, just as we do in any other area (and as we do for massively parallel graphics today). Massive parallelism is mostly about SIMD, where the programmer basically wants to provide the data ("D") and a high-level instruction ("I") and have a library take on the donkey work of implementing it on a given platform.
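As a concrete (if simplistic) sketch of that "provide the D and the I" model, here is what it looks like with the standard C++ parallel algorithms (the Parallelism TS work that ended up in C++17): the caller supplies the data and the operation, and the implementation decides how to spread the work across cores and SIMD lanes.

    // Hedged sketch of the data + high-level-instruction model using the
    // C++17 parallel algorithms: the library owns the parallelization strategy.
    #include <algorithm>
    #include <execution>
    #include <vector>

    void scale_in_place(std::vector<float>& data, float factor)
    {
        std::transform(std::execution::par_unseq,   // "run in parallel, vectorize if you can"
                       data.begin(), data.end(), data.begin(),
                       [factor](float x) { return x * factor; });
    }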

Current parallel computing approaches such as OpenCL, OpenMP and CUDA are all just tools to be used by library writers, or by the (increasingly few) programmers whose needs are not met by off-the-shelf high-level building blocks. No doubt the tools will get better, but for most programmers it makes no difference, since they use libraries rather than write them. Compare, for example, all the advances in templates and generic programming in C++11 and later... how many C++ programmers are intimately familiar with and proficient in these new facilities, and how many actually need to use them, as opposed to enjoying the user-friendly facilities of the STL built atop them?!
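To illustrate that division of labour (my sketch, not from the article): the library author reaches for a low-level tool such as OpenMP, while the application programmer just calls a plain function and never sees it.

    // "Library" side: knows about threads and scheduling (compile with -fopenmp).
    #include <cstddef>
    #include <vector>

    void saxpy(float a, const std::vector<float>& x, std::vector<float>& y)
    {
        #pragma omp parallel for
        for (std::ptrdiff_t i = 0; i < static_cast<std::ptrdiff_t>(x.size()); ++i)
            y[i] += a * x[i];
    }

    // "Application" side: ordinary, sequential-looking code.
    // saxpy(dt, velocity, position);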

Comment Let's see how that sounds in 5-10 years time ... (Score 1) 449

It sounds rather like Bill Gates' [supposed] "640KB is enough for anyone", but there's no denying that Linus said this one!

Saying that graphics is the only client-side app that can utilize large-scale parallelism is short-sighted bunk, and it ignores what is going on today, let alone the future. In 20 years' time we'll have handheld devices that would look just as much like science fiction, if available today, as today's devices would have looked 20 years ago.

I have no doubt whatsoever that in the next few decades we'll see human-level AI in handheld devices as well as server-based apps, and you had better believe that the computing demands (both processing and memory) will be massive. Even today we're starting to see impressive advances in speech and image recognition, and the underlying technology is increasingly (massively parallel) connectionist deep-learning architectures, not your grandfather's (or Linus's) traditional approaches. Current deep-learning architectures can be optimized to use significantly fewer resources for recognition-only deployment than for learning, but no doubt we'll see live learning in the future too as AI advances and technology develops.

Linus's relegation of parallelism to the server side is equally if not more shortsighted than his lack of vision about client-side CPU-hungry applications! If you want systems that are always available, responsive and scalable, that calls for a distributed (client-side) implementation, not a server-based one. Future devices are not only going to be smart, but the smarts are going to be local. Bye-bye, server-based Siri.

Comment Re:No group "owns" any day on the calendar. (Score 4, Informative) 681

Close, but no banana.

The Dec 25th date was co-opted from the Roman holiday/feast of Natalis Invictus (= birth of the sun-god Sol Invictus), the date being chosen because it was then (re: precession of the equinoxes) the winter solstice, when the days start to get longer again (i.e. the sun is reborn). This holiday was created by the Roman emperor Aurelian in the 3rd century AD, and was co-opted by the Christians maybe 100 years later.

Saturnalia was a separate - very popular - Roman holiday in (if memory serves) November/December, which FWIW had a present giving component.

However, the gross external form of modern Christmas - tree, holly, mistletoe (i.e. general greenery) and the Yule log - comes from a different, northern European, winter-solstice celebration called "Yule".

So, the Xmas feast/date comes from Natalis Invictus, the tree/holly/etc. from Yule, the presents *perhaps* from Saturnalia, and we'll have to concede the nativity (there's that "natalis" again) to the Christians, who prior to 300 AD would never have celebrated Jesus' birth!

Comment Re:It's not him.. (Score 2) 681

He didn't fold (where's the back-down?). He blatantly and successfully trolled the Christian fundamentalists**, and his follow-up was little more than a gloat.

** and/or anyone ignorant enough of history to think that Jesus was born on 25th Dec and/or was the basis of the Dec 25th holiday we now call "Christmas"

Comment C++ getting better and better... (Score 1) 641

It seems like C++11 was finalized only yesterday, yet we already have C++14 finalized and C++17 in the works...

This is hardly the same language it was a few years ago - the power and ease of use that have been added, for both library and application developers, are amazing.

Anyone programming in C++ who isn't thoroughly familiar with all the new stuff in C++11 is missing out tremendously...
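For anyone wondering what "the new stuff" looks like in practice, here's a small (and very incomplete, invented) taste - auto, range-for, lambdas (generic ones in C++14), uniform initialization and make_unique:

    // Hedged, cherry-picked sample of C++11/14 conveniences; not an exhaustive tour.
    #include <algorithm>
    #include <memory>
    #include <string>
    #include <vector>

    int main()
    {
        std::vector<std::string> names{"linus", "bjarne", "dennis"};    // uniform initialization

        auto shout = [](const auto& s) { return s + "!"; };             // generic lambda (C++14)
        for (auto& n : names)                                           // range-based for (C++11)
            n = shout(n);

        std::sort(names.begin(), names.end());

        auto owner = std::make_unique<std::vector<std::string>>(names); // make_unique (C++14)
        return owner->empty() ? 1 : 0;
    }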

Comment CyberThis, CyberThat, CyberCommand (Score 5, Insightful) 61

Dear US military and federal contracting wanker-sphere,
I know you were 30 years late discovering this whole internet thing, so imagery and phrases from 1980s cyberpunk still sound super-duper-cutting-edge to you, but can you please stop using "cyber" as a catch-all for everything connected to computers? Thanks.

PS: When you leave a laptop full of citizens' private information on the bus, and a million people's social security numbers turn up on pastebin the next day, that's called "negligence", not "a cyberattack".

Comment Punctuated equilibrium (Score 1) 282

evolution = variation + selection

What's happening here is likely more about selection than variation, although maybe a bit of both. I suspect this is largely the mechanics of punctuated equilibrium at work.

The way evolution is taught at high-school level is typically oversimplified to the point of being wrong, as indeed are many subjects. Evolution is NOT a continuous process of each generation getting better fitted to the environment via natural selection acting on genetic changes introduced in individuals in that generation...

The normal way that evolution is understood to play out in practice is via "punctuated equilibrium", whereby genetic changes - which are typically too small and/or irrelevant to have any immediate impact on fitness - accumulate in animal populations over many generations. It's not the genetics of individuals that are changing so much as the genetics of the interbreeding population as a whole, as accumulated changes spread throughout the population over a number of generations. This is the "equilibrium" phase, in which genetic changes are accumulating but there is no external evidence of it, because the changes are irrelevant to fitness.

What happens next is the "punctuated" part of "punctuated equilibrium" - something changes in the external environment that the animals are part of, in this case the arrival of an invasive species. These changes in the environment (drought, disease, invasive species, etc.) can happen very quickly compared to the speed at which genetic change accumulates. Now, it may happen that in the new, changed environment some of the accumulated genetic changes that were previously benign become a factor in fitness (either positively or negatively), and therefore a "sudden" change in the population may be seen as those individuals possessing what has now become a helpful trait, or not possessing a negative one, prosper relative to their peers and rapidly come to dominate the population.

When a change in the environment brings about a quick change in an animal species, it is tempting - but sloppy - to say they are rapidly evolving. What happened rapidly was the change in the environment, not the slow process of genetic change that suddenly became significant.

In this case the Florida lizard population presumably already had all the traits - to some degree - that would prove positive or negative when the invading Cuban species arrived, and a quick change was seen as natural selection did its thing and, over a few generations, the population became dominated by individuals having the (slowly acquired) traits that now proved to be critical.
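A toy simulation (entirely my sketch, with made-up population size and fitness numbers, nothing from TFA) of the dynamic described above: a trait drifts neutrally for many generations, the environment then changes, the trait suddenly matters for fitness, and its frequency jumps within a relative handful of generations.

    // Toy punctuated-equilibrium model: neutral drift, then an environmental
    // shift makes the trait advantageous and selection sweeps it through the
    // population quickly. All parameters are invented for illustration.
    #include <iostream>
    #include <random>

    int main()
    {
        std::mt19937 rng(42);
        const int pop_size = 10000;
        double trait_freq = 0.05;                              // rare trait, currently neutral

        for (int gen = 1; gen <= 300; ++gen) {
            const bool invader_present = gen > 200;            // environment changes at gen 200
            const double fitness_trait = invader_present ? 1.10 : 1.00;  // trait now helps
            const double fitness_other = 1.00;

            // Expected trait frequency after selection
            const double mean_fitness = trait_freq * fitness_trait + (1.0 - trait_freq) * fitness_other;
            const double expected = trait_freq * fitness_trait / mean_fitness;

            // Genetic drift: resample the next generation binomially
            std::binomial_distribution<int> next_gen(pop_size, expected);
            trait_freq = static_cast<double>(next_gen(rng)) / pop_size;

            if (gen % 50 == 0)
                std::cout << "generation " << gen << ": trait frequency " << trait_freq << "\n";
        }
    }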

Of course there's more to how the dynamics of evolution play out than just punctuated equilibrium... While it's always going to take a long time for any complex feature such as sticky toes (or toes themselves, for that matter) to evolve, the way genetic coding works is such that it may be very easy for a feature - once it exists - to be modified by a small change (e.g. a birth defect giving you unwanted extra limbs, or extra-sticky toes - advantageous if you mostly climb slippery trees, disadvantageous if you don't).

So... the big picture here is that the Florida lizard species had already accumulated the feature set that proved advantageous (or disadvantageous for those that died, giving way to the "new" variety), and this just played out once the environment changed. Subsequent to the environmental change, additional variation/selection (which you could think of as optimization "tweaking") of the most critical features (toe-pad size, scale stickiness) may have occurred.

Comment Re:The good news (Score 1) 700

That only worked because the people harmed by having their satellite cards bricked were willfully infringing DirecTV's copyrights, and suing DTV for frying their smartcards would be admitting it in court. At absolute best the pirates might get triple actual damages, but 3x the cost of a smartcard is next to nothing, and then the counter-suits would have been a slam dunk for DTV to win $750,000 statutory damages from each of them.

If FTDI wants to use that strategy they're going to have to contend that every end-user of a device with a counterfeit FTDI chip knew it was fake. Doesn't sound plausible to me, but the US courts are generally tech-idiotic so I suppose it's not entirely impossible.
