
Comment Re:Politics? (Score 4, Insightful) 106

[...] and rather than cutting the least important program, they cut the most visible program, in an attempt to get their funding restored.

Honestly, though, a question-and-answer service for school children probably does rank among the least important programs for a research lab. I very much doubt this is part of their written remit (as opposed to communicating their actual research to the public), and the people spending time at work answering the questions certainly get zero professional recognition for it.

It does sound like a very nice, fun service. And I do agree that this kind of outreach is important. But if this is not part of what their funders want them to do, then it should come as no surprise if it's among the first things to go when money becomes tight.

You want this kind of thing to continue? Make sure there's funding (and paid time) earmarked for doing it. In fact, that may be a good idea in general: add a small fraction (0.1% or even less) to any research grant over a certain size for general science outreach. If it's part of your funding, that also removes the career obstacles to doing outreach that we too often have now.

Comment Re:Strategy games? (Score 2) 148

It's not even a good example of image recognition, because the images to be processed don't have to be "understood" to be used. On top of that, the graphics of the games in question were very simple and primitive compared to what image recognition software deals with.

Add to that the repetitive nature of old video games that were based on 99% reaction time and 1% strategy, and you can just flat out colour me "unimpressed" with this "research".

Back in university, my AI project was a game player (a simple strategy game whose name I forget). As it turned out, the entire game mapped down to a pre-determined set of decisions, so after playing only a dozen games, the "AI" would win every time, and that was just with a simple weighted-algorithm system of play. All I ended up learning from that project is that some problems are just eminently suited to "AI", but it was a useful lesson on the difference between optimizing a decision tree and actual "intelligence".
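For the curious, that kind of weighted-algorithm player is tiny. Here is a minimal sketch in Python; the feature names and weights are made up for illustration (the real project's are long forgotten), but the shape is the same: score each legal move against a fixed set of hand-tuned weights and take the best one. No search, no learning.

```python
# Hypothetical hand-tuned feature weights for an abstract strategy game.
WEIGHTS = {
    "captures":  5.0,  # material gained by the move
    "mobility":  1.0,  # follow-up moves this move opens up
    "exposure": -2.0,  # own pieces left undefended
}

def score(features):
    """Weighted sum of a move's feature values."""
    return sum(WEIGHTS[name] * value for name, value in features.items())

def choose_move(legal_moves):
    """Pick the legal move with the highest weighted score.

    legal_moves: list of (move, features) pairs, where features maps
    feature names to numeric values for that move.
    """
    return max(legal_moves, key=lambda mv: score(mv[1]))[0]

# Example with made-up candidate moves:
moves = [
    ("a", {"captures": 1, "mobility": 3, "exposure": 0}),  # score 8.0
    ("b", {"captures": 0, "mobility": 5, "exposure": 1}),  # score 3.0
    ("c", {"captures": 2, "mobility": 1, "exposure": 2}),  # score 7.0
]
print(choose_move(moves))  # -> "a"
```

Once the weights are dialled in, play is completely deterministic, which is exactly why such an "AI" becomes unbeatable after a dozen games: it's an optimized lookup, not intelligence.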

Until someone comes up with a system that can deal with bad and erroneous inputs as well as humans do, I will continue to be unimpressed. Yet at the same time, I don't consider it necessary for a computer to be able to think and understand per se to be considered an "intelligence." It just needs to be able to make decisions and choose between alternatives faster than its human counterparts in order to be useful, and to make fewer errors than its human counterparts.

I have little faith in "neural networks." They place too much emphasis on emulating simple biological components and not enough on the "art" of understanding. Neural networks basically take the approach that "if it's big enough, we'll maybe get lucky and it will start to think." That's not "solving a problem." That's "playing the lottery."

Comment Re:Operating at 20W gives zero improvement. (Score 1) 114

AMD was (before Haswell) not too bad if you have a multithreaded workload.

Thanks to the Xbox One with its 8 cores, games will run better on AMD as they become more threaded, since many are crappy Xbox ports.

For a cheap box to run VM images in VirtualBox/VMware Workstation, do video editing, or compile code, AMD offers great value, and the BIOS does not cripple virtualization extensions, unlike the cheap ones from Intel.

FYI, I switched to an i7 4770k for my current system, so I am not an AMD fanboy. But I paid through the nose for hyperthreading and 4 cores. I wanted good IPC for single-threaded work as well.

AMD dropped the ball twice: first by abandoning the superior five-year-old Phenom II, which is still 25% faster per clock than their newest design, and second by selling their fabrication plants last decade to prop up the share price. They have 28 nm chips while Intel is busy switching to 10 nm. How can you compete against that? Worse, GlobalFoundries is more interested in ARM chips, as AMD's demand is too low. OUCH.

Even if you love Intel, it is in your best interest for AMD to stick around for competition and lower prices.

Comment Re:Operating at 20W gives zero improvement. (Score 1) 114

In SWTOR I got a doubling of FPS from moving from a PhenomII black edition to an i7 4770k.

I would be surprised if that were not the case. The i7-4770k came out 5 years after the Phenom II - a lot happened in that time, including the entire Phenom line being discontinued and succeeded by newer architectures. I'd be more interested in a comparison between the i7-4770k and its 30%-cheaper contemporary, the FX-9590 (naturally, expecting the i7-4770k still to win to some degree if we focus purely on single-thread performance, but is that worth it? Once SWTOR is no longer CPU-bound, you wouldn't see any difference between the two at all).

Here is the kicker: the half-decade-old Phenom II is faster per clock cycle than the FX series based on the Bulldozer architecture. AMD really messed up, as the design was optimized for graphics in the hope of winning that way. In other words, those mocking it call it Pentium 4 2.0.

Comment Re:Operating at 20W gives zero improvement. (Score 2, Informative) 114

Do you have a link for that? It's not that I disbelieve you; I strongly suspect that Intel would do that crap. I'd like to read more about it, however, if you have a link handy, and then stash the link for the next time this benchmark comes up.

Personally, I like the Phoronix Linux benchmarks. They're more meaningful for me since I use Linux, and they're all based on GCC, which is trustworthy.

http://www.phoronix.com/scan.p...

The i7 4770 occasionally blows away the FX-8350 by a factor of 2, but in many benchmarks they're close, and Intel loses a fair fraction. The 4770 is the best overall performer, but not by all that much. It seems that the choice of CPU is fairly workload-dependent.

For servers, I still prefer the Supermicro 4-socket Opteron boxes. 64 cores, 512 GB RAM, 1U. Nice.

The i7 4770k is a fairly high-end chip from Intel. I own one, but I would not expect to find one in a sub-$700 box. It is not a Xeon, but it is just one notch down from the $900 Extreme Edition, so it is second from the top among Intel's consumer, non-server chips.

Well, sites like tomshardware.com make it look like a Pentium or i3 can blow the latest AMD Black Edition out of the water. Biased or not, my real-world experience backs that up: many games are optimized for Intel and use Nvidia-specific DirectX extensions in their studio software, which boils a lot of AMD users' blood, but it is the truth.

In SWTOR I got a doubling of FPS from moving from a Phenom II Black Edition to an i7 4770k. True, it has fewer cores, but apps are still optimized for single-threaded work, and I do have 4 real and 4 hyperthreaded cores for my VMs.

The reason is that games are crappy Xbox ports. The 360, I think, was single- or dual-core, so games were single-threaded. Therefore they kick ass on Intel. The only good news is that the Xbox One is changing this with its 8 AMD cores, forcing game makers to optimize more for ATI.

As a result, I expect newer games to be more competitive on tomshardware.com and other sites, as they become more threaded and ATI-optimized.

But the damage is done, and power efficiency is much better with Intel chips as they leave AMD further and further in the dust with smaller process nodes, since AMD sold its foundries. GlobalFoundries only cares about ARM chips, so sorry, AMD, stay in 2010 ... Intel is going 10 nm next year and will finally put the last nail in the coffin. ... I pray NOT!

Comment If I accidentally tread on a book (Score 1) 261

Then it will generally still work, will probably be cheap to replace, and even if it is damaged, it will still be at least nearly perfectly usable. In the case of the Android tablet I used to use for this, I made the mistake of leaving it on the floor next to the power socket while it was charging (short power cable and all that), was enthusiastically showing a friend round my toy collection (toy = laptop/workstation/synth/etc.), and accidentally put my chair leg down on it. It still boots, but the touchscreen no longer works, and, being a cheap tablet, USB OTG never worked properly anyway, so it's now unusable. If it were a real book, it wouldn't even have broken!

That's why I do not trust e-readers for books that are even remotely important: they are just too fragile, and even though I'm careful 99% of the time, there is that remaining 1% where even the most careful human doesn't have his (or her) brain engaged properly and is temporarily a complete klutz. Real paperware books are reasonably robust against accidental clumsiness. And robustness saves lives! Seriously, suppose you're on the ISS, the only copy of the maintenance manual is accessible via an e-reader, and you break it?

Comment Good luck with that (Score 1) 406

Overkill it may be, but I've been writing my prototype security code to generate new AES-256 keys for each session, using the pre-generated keys only to initialize communications and handshake the generated keys. Even I won't know what keys are in use.
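To make the idea concrete, here is a minimal sketch of that kind of session-key scheme in Python, assuming the pyca "cryptography" package and AES-256-GCM. The key names and message framing are invented for illustration, not taken from the actual prototype:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Long-lived pre-shared key, provisioned out of band; it only ever
# protects the handshake, never the session traffic itself.
PRE_SHARED_KEY = AESGCM.generate_key(bit_length=256)

def start_session():
    """Generate a fresh AES-256 session key and wrap it under the PSK."""
    session_key = AESGCM.generate_key(bit_length=256)  # new key every session
    nonce = os.urandom(12)
    wrapped = AESGCM(PRE_SHARED_KEY).encrypt(nonce, session_key, b"handshake")
    return session_key, nonce + wrapped  # send nonce + wrapped key to the peer

def accept_session(handshake_msg):
    """Peer side: unwrap the session key using the pre-shared key."""
    nonce, wrapped = handshake_msg[:12], handshake_msg[12:]
    return AESGCM(PRE_SHARED_KEY).decrypt(nonce, wrapped, b"handshake")

# Handshake once, then all traffic rides on the session key alone.
key, msg = start_session()
peer_key = accept_session(msg)
assert key == peer_key

nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"hello", None)
print(AESGCM(peer_key).decrypt(nonce, ciphertext, None))  # b'hello'
```

One caveat: wrapping the session key under the pre-shared key does not give forward secrecy, since anyone who records the handshake and later obtains the pre-shared key can unwrap the session key. A Diffie-Hellman exchange would fix that; the sketch just mirrors the simpler scheme described above.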

The NSA can kiss my ass. So can CSEC, GCHQ, and everyone else who thinks they have a "right" to spy on me.

Approach the service provider with a properly signed warrant in the server's jurisdiction if you want access to my data.

Comment Re:Real Engineering (Score 1) 176

In some places it is illegal to call yourself an engineer if you aren't really one (unlike software "engineers").

Alright, sounds fair. Why can't a consortium or guild provide this certification? For those who just need some code written, there would be programmers, who would earn less (and therefore no need for H-1Bs), and for critical architecture and SOA work on critical projects you could have certified engineers.

We could have two grades, rather than averaging the two and ending up with not enough talent for one use, yet too expensive for the other.

Comment Re:I have an H1-B employee (Score 1) 176

Where are you getting the impression that inexperienced kids out of college are regularly getting $70,000 a year to start?

It seems you are just making stuff up to argue against. The odd part is that you would do that to argue against a rising standard of living for some group of working people. Are you the kind of person who wants to tear others down when they are getting more than you? Because that's a more harmful attitude to have for society in general than some businesses trying to pay less for more.

Really? Go look at the people posting here and at some of the job ads if you do not believe me that kids start at $60k a year.

Not to sound assholish, but I thought only Indians would be doing IT, so I dropped out of CS to do business. Worst mistake ever! I am happy just to make $50k with a few years of experience, so yes, I do not believe they deserve a raise, and HR and accounting need a way to control costs.

Only on Slashdot is it claimed that there are starving programmers all making $30k a year thanks to those horrible, greedy H-1B recruiters. I do not see it. I am being honest: I see it as no different from CEOs whining about making less than a million.

What is a good salary for a starting and an experienced programmer? $100k a year? Unless they are specialized or own their own companies, a programmer should not make that much. Simple: the corporations are just trying to rebalance the market. American programmers make nearly 2x as much as any other major, so I have little sympathy.

Comment Re:I have an H1-B employee (Score -1) 176

No. The H-1B debate is about creating an easy-to-exploit underclass. Even the "talented types" get abused by corporations. Corporations get a free pass to rape, pillage, and plunder because that's just (Ayn Rand) trendy these days.

Corporations want people who are easy to exploit. People with full legal status are harder to abuse. They also have higher expectations and higher overhead.

... OK, put the emotion aside here. Take the corporation's point of view for a minute.

Name one field besides the medical industry that pays college kids with 0 years' experience $60,000 a year!!

I know college buddies who made $14/hr and lived with their parents after graduating with a bachelor's in business! They were lucky and happy to make that much, as it was their first job. Correction: doctors make $30,000 a year for their first 2 years in residency.

Why should a programmer make as much as a doctor??

Supply and demand is why. Slashdotters may hate my argument here, but look at it through the eyes of employers who need stuff done. There is a shortage right now. Otherwise, why would a kid who has never worked a day in his life earn nearly $70,000 a year straight out of school?

If you all hate Indians taking jobs, you need to lower your salaries closer to what other majors make, including even those in the medical industry. My brother can't find any programmers. The same ones were jerks who demanded $70,000 a year and were all butthurt when the Great Recession hit. The average programmer is worth $50,000 a year, and HR caps it there at 5 years' experience in Memphis.

So basically kids out of school make $10,000 more than the average... see an issue with this? H-1Bs are needed to fill in, then, due to a lack of qualified candidates for positions.
