The article is full of shit.
It claims that Gates's blog post here supports LENR, but it does no such thing (although some people in the comments section do mention it).
I read the paper and found some parts very confusing. E.g. in Fig. 1a, sulfur hydride seems to have a critical temperature of around 70 K at 177 GPa, while in Fig. 1b it seems to have a critical temperature of 185 K at the same pressure. And the "measurements" in Fig. 4 don't look like measurements; they look like data generated using a mathematical function. Dan
Could you comment on some of the claims in the abstract?
1. Deep learning is a broad set of techniques that uses multiple layers of representation...
Agreed; that's what "deep" implies.
Is multi-scale analysis a primary component of 'deep learning'?
This may be true in vision, but not in general (e.g. in language and speech tasks there is usually no natural notion of scale).
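For concreteness, "multiple layers of representation" just means stacking learned nonlinear transforms, each re-encoding the output of the previous one. A throwaway numpy sketch (untrained random weights, made-up sizes, purely illustrative):

    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    rng = np.random.default_rng(0)
    x = rng.normal(size=784)              # e.g. a flattened 28x28 image

    # Two "layers of representation" (random, untrained weights)
    W1, b1 = rng.normal(size=(256, 784)), np.zeros(256)
    W2, b2 = rng.normal(size=(64, 256)), np.zeros(64)

    h1 = relu(W1 @ x + b1)                # first-level features
    h2 = relu(W2 @ h1 + b2)               # more abstract second-level features

In a trained network the weights are fit to data; the point here is only the stacking of representations that makes the network "deep".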
2. "relatively little is understood theoretically about why these techniques are so successful at feature learning and compression.
True... deep learning methods are not very easy to analyze (personally I am skeptical that there is much point in trying very hard to analyze them).
"We construct an exact mapping from the variational renormalization group..." Is this not new, not correct, or is this simply not of much use to deep learning?
I think the closest of those is that it's not of much use. I didn't read the paper super carefully (and I'm not a physicist, so I'm not familiar with the renormalization group), but I imagine the analogy is not very close at all and only applies in specific cases, e.g. in convolutional nets or something like that.
Renormalization group theory is so general and powerful that it has had profound impacts on many areas of theoretical and mathematical physics. Do you think this can't or won't impact the field of deep learning? If deep learning has multi-scale analysis at its heart, it appears on the surface that RG should be a good treatment. Have there been attempts to use RG for deep learning aside from the present work?
If the connection is real, it would seem to suggest that perhaps deep learning may have something to offer physics, if it really is "employing a generalized RG-like scheme." Do you have any comment on this?
I haven't read the paper in detail, but I just don't think it's plausible that there is a very interesting connection, as they are such different things.
To pick a random example, imagine you are a botanist and someone told you there is a connection between hydroelectric dams and oranges. Even if there is a connection, it's probably not something that is going to help you very much, and you probably wouldn't be so excited to read the paper explaining the purported connection.
This article is way overblown. This is not the kind of paper that is likely to attract significant attention in the deep learning community. And the person they got to say it was important, Ilya Nemenman, is not someone I have heard of.
Move along. Nothing to see here.
In her honky-tonk lagoon
Where I can watch her waltz for free
’Neath her Panamanian moon
An’ I say, “Aw come on now
You must know about my debutante”
An’ she says, “Your debutante just knows what you need
But I know what you want”
Read more: http://www.bobdylan.com/us/son...
1) By the time you learn it, it won't be hot anymore.
2) It's all about experience. Don't take my word for it, look at the job ads. Learn something all you want; if you don't have five years' experience in it, your knowledge is useless.
3) These articles about what's "hot" are just standard corporate propaganda. IT employers always want people chasing their tails, studying everything, just so they have a larger labor pool.
4) Don't get constantly distracted trying to learn what is supposedly "hot" at the moment, just learn anything useful, and be very good at it. Being very good at anything useful is far more valuable than a superficial knowledge of the latest fad.
5) These articles don't tell you anything more than they tell everybody else in the world. Learning whatever they recommend is not going to give you any competitive advantage.
All JMHO, of course.
Disclosure: I worked in IT for over 30 years. I have held several jobs, at several companies. I have been through the hiring process a lot.
Your statement is not relevant here because Matlab is not hot: scipy/numpy are hot; Matlab is old. I work as a research scientist at a top university doing heavy data processing (though I personally maintain my own C++ wrapper around BLAS, so I don't rely on other people's wrapper code).
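For what it's worth, the reason scipy/numpy count as "hot" for this kind of work is that they sit directly on top of an optimized BLAS, so the heavy lifting is the same whether you reach it from Python or from a hand-rolled C++ wrapper. A minimal sketch (assuming scipy's low-level BLAS bindings, scipy.linalg.blas.dgemm; the sizes are made up):

    import numpy as np
    from scipy.linalg import blas   # low-level BLAS bindings shipped with scipy

    rng = np.random.default_rng(0)
    A = rng.normal(size=(500, 300))
    B = rng.normal(size=(300, 400))

    C_high = A @ B                              # numpy dispatches this to a BLAS gemm
    C_low = blas.dgemm(alpha=1.0, a=A, b=B)     # calling double-precision gemm directly

    assert np.allclose(C_high, C_low)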
I've never attempted to use Ubuntu since.
Microsoft is a multinational corporation - albeit one that started in the US - and it has every right to locate its operations wherever suits it. Immigration policies are a valid reason to make these decisions. I don't understand why Americans think it's their automatic right for all the activities of companies like this to be located on their soil.
It's true; my comments envisaged conventional farming, not the methods the Israelis use with poly-tunnels. So in the long run, your point is true.