
Comment Not worth automating at all, apparently (Score 2) 113

Let's assume those 400 people hired to handle the paper were an inferior result, but they can't have been too terrible or the state would have been browbeaten into hiring more. So I'm going to spitball doubling that: 800 staff at an average of $70K per year each (that's with all bennies and burdens; they'd probably gross around $50K) would cost $56 million a year...or about $235 million over 4.2 years, not an indecent lifespan for a major web app these days.
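
For the skeptical, here's that back-of-the-envelope arithmetic as a few lines of Perl; the headcount, burdened cost, and lifespan are my own guesses from above, not official figures:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $staff    = 800;       # spitball: double the 400 actually hired
my $per_head = 70_000;    # fully-burdened annual cost (~$50K gross)
my $years    = 4.2;       # plausible lifespan for a major web app

my $annual = $staff * $per_head;    # $56,000,000 per year
my $total  = $annual * $years;      # ~$235,000,000 over the lifespan

printf "Annual: \$%.0fM; lifetime: \$%.1fM\n", $annual / 1e6, $total / 1e6;
```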

So frankly, what's the point in automating at all, if it's going to be as expensive as a decent manual solution that would have been up and running in 3 months?

Comment F-16 was the only fighter to fight the trend (Score 2) 192

Col. Jim Burton's "The Pentagon Wars" is back in print. While the Kelsey Grammer/Cary Elwes comedy movie focuses on the Pentagon's reluctance to honestly test the Bradley Fighting Vehicle, much of the book is about the development of the F-16 by the "Fighter Mafia" - Col. John Boyd, Pierre Sprey, Chuck Spinney, and designer Harry Hillaker - and how hard they had to fight to get the F-16 built and accepted.
The F-16 hate in this forum could be coming from the same place as the 3- and 4-stars who wanted another standard Pentagon product: twice the weight and twice the price of the aircraft that came before it. But the F-16 was lighter and cheaper than the F-15, and focused laserlike on the job of dogfighting.
The F-35 has finally gone as far as you can go in the other direction: multi-multi-purpose, does everything, but the weight and especially the cost are almost comically bloated.

The question is not whether an F-35 could beat an F-16: it's whether a billion dollars of F-35 could beat a billion dollars of F-16s. And that's not even up for discussion.
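
To put rough numbers on that (the unit costs here are round figures I'm assuming for illustration, not official flyaway prices):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Round-number unit costs, assumed for illustration only.
my $f35    =   100_000_000;
my $f16    =    30_000_000;
my $budget = 1_000_000_000;

printf "A billion dollars buys %d F-35s or %d F-16s.\n",
       $budget / $f35, $budget / $f16;
```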

Comment Re:Define "listening" (Score 4, Informative) 60

Look up "WebcamGate" from 2010, when a school district took 60,000+ photos of students at home in their bedrooms by remotely activating the cameras on their school laptops. The administrator who told subordinates to conceal the surveillance also wrote back to someone raising concerns with a note almost exactly like that: "Why would we do such a thing? We would never do that!" Look up the name "DiMedio". So you'll have to forgive our skepticism.

Comment Re:A minor epiphany (Score 1) 349

Thanks for that. The damning statement is that all these people the rest of us actually regard more highly than rocket scientists - rocket scientists haven't put anybody on the moon lately, while biomedical scientists could save our lives - are "computer illiterate".

There was a time when the excuse for being computer illiterate was age; the dang things just came at business too fast. But now I'm the retired one, explaining simple Excel things to people 20 years younger. These "biomedical researchers" are mainly under 45 - that is, they've had computers since junior high and Windows since college; they've had Excel to study for 20 years, their entire careers.

I saw it with engineering. I was the formal IT guy for 7 years, then switched to become one of the engineers, albeit the local power-user and covert developer. I had expected to become obsolete as I aged, overrun by the superior expertise of people who grew up with computers, programming in elementary school. And there was ONE hacker, 20 years my junior, who could outstrip me on complex bits of configuration and development - and oddly enough, he had become a techie while working as a biomedical technician, writing Perl scripts to parse endlessly long DNA strings. But then there were nearly 100 engineers in the same company who would make the most eye-rolling mistakes and never even try to gain any underlying understanding of why the spreadsheet does what it does.

Over and over and over, I would correct something and try to teach some basics, but I'd be put off with a request to just fix that exact problem - they were in a hurry. Not infrequently, they would be back in six months asking me to do it again: "I forgot, I'm sorry, what was that again?" The uptake on a little real instruction on the second go-round was better, but still not 50%.

Poor understanding of how to use computer applications is still the greatest barrier to using computers to improve productivity.

Comment Re:Be a Licensed Profession, folks... (Score 1) 332

The same reason these companies expensively imported people rather than sending the work to those workers' home countries?

The same reason you go to an American physician rather than to India?

The same reason you have your bridge designed by American engineers rather than Indonesians? (Hint: it's a different reason on that one - it's not legal to build the bridge any other way. What if it weren't legal to put a car on American roads without software from licensed programmers? That's how it already works for the rest of engineering...)

Comment Be a Licensed Profession, folks... (Score 4, Interesting) 332

I'm wearying of it, but so far I just post the same thing over and over when I read about this topic. You don't see this with comparable white-collar high-knowledge professions like accounting, teaching, law, medicine and engineering. ...because they are all licensed.

This is not about unionism or protectionism. It's not about holding onto jobs for nationalism's sake, or racism. Any race can get a license; indeed, foreigners can be licensed - if they can pass the tests. Most of this outsourcing is not about bringing in equivalent people; it's about being able to afford more of them to make up for the lower productivity and accuracy.

Information technology should be a licensed profession for multiple reasons; there are a lot of crappy local programmers who shouldn't have such jobs, too. This isn't about handy helpers or kids' games any more: our civilization depends on code that works right, and we lose money, privacy and opportunity every day to IT failures. Medicine was not a licensed profession just a few generations back; it was licensed when it was time. For IT, it's now time.

Comment People confuse "old" and "bad" (Score 1) 674

For the last several years of my career - which was as an engineer who did a lot of programming, not a programmer...but the IT department was so hard to get hold of, or get results from, that I ended up doing a lot of "favours" around the office - I built a bunch of web pages with Perl scripts.

I never got into Javascript or even much DHTML, so these looked like 1993 web pages. They had a few simple forms with a couple of text boxes, a radio button set, etc. You could get customized reports from them that IT just didn't have time for...and I could add a new feature in an hour or two. So they grew like coral over a few years, and we ended up with several of them before the requests wound down.
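
For anyone who never saw that era, here's a minimal sketch of what one of those 1993-style report pages looks like, using the CGI.pm module of the day; the field names and report body are hypothetical stand-ins, not the actual scripts:

```perl
#!/usr/bin/perl
# Minimal 1993-style report page. Field names and report logic are
# hypothetical stand-ins for the kind of thing described above.
use strict;
use warnings;
use CGI;

my $q = CGI->new;
print $q->header('text/html');

if (my $project = $q->param('project')) {
    # The real scripts would query a database or parse logs here.
    print "<h1>Custom report for ", $q->escapeHTML($project), "</h1>\n";
} else {
    print <<'HTML';
<form method="post">
  Project: <input type="text" name="project">
  <input type="radio" name="format" value="summary" checked> Summary
  <input type="radio" name="format" value="full"> Full
  <input type="submit" value="Run report">
</form>
HTML
}
```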

IT was not happy...not because I'd used poor programming practice. (I have a CompSci degree too, I know how to comment and write clear code.)

No, IT thought it was awful because of the 1990s Perl/CGI architecture; only Microsoft tools should be allowed, it was unmaintainable, etc. (The "unmaintainable" complaint continued even after a junior engineer took over maintenance when IT wouldn't. He had the code figured out in a few weeks.)

I guess my point is that you need to put "Bad" in quotes, because it's always an opinion...and lots of people mistake "not the current fad" for "bad".

The entire reason this worked is that these were *small* problems; IT could have done them easily, but each report only served a few people, and IT was consumed with Big Systems that served everybody. That's also why only a few of them were needed - it was a small "market". So I guess another point is that you don't always have to hit every nail with the giant Official Corporate Development Environment Hammer. Those environments are chosen to handle Big Problems, but the bureaucracy around them can be too heavy to nimbly solve small ones. Be open to small, simple, script-sized solutions. IT people constantly call those "Bad", usually with dire warnings that they will grow into spaghetti-code monsters that will suck up all your money.

With respect, what the F do they know? They only get called in for the monsters. They may be unaware that there are twenty times as many out there that did NOT grow into monsters, where the small solution was just right and ran for a decade. So the next time you're pretty sure you just have a Small Problem, tell IT to stuff it and solve it yourself.

Comment Re:It's a continuation (Score 5, Insightful) 254

Something can look incremental but actually be pretty dramatic. We're kind of spoiled by Moore's Law having a doubling time of just a few years.

Increases in battery life have been "incremental" but also exponential - the improvement has been something like 7% per year on average, a ten-year doubling. And of course, we ate most of it with higher power consumption in most battery-powered devices: the phones, tablets and laptops. But look at how long something simpler like an iPod lasts now compared to 2001, and it's dramatic.
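
The arithmetic checks out: compounding 7% a year doubles in just over ten years. A quick Perl check:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $rate     = 0.07;                      # ~7% capacity gain per year
my $doubling = log(2) / log(1 + $rate);   # years to double at that rate

printf "Doubling time at 7%%/year: %.1f years\n", $doubling;   # ~10.2
```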

Electric cars are going to get much more serious after one more doubling, and while the car companies would pay billions to have it happen overnight, it's still going to happen within 10 years even at the "incremental" pace.

Comment What? No "it's a trap" theories? (Score 3, Insightful) 706

Assange IS this story, and his name comes up here surprisingly seldom. Here's a game plan for you: Trump is an extreme case of conspiracy theorist; it wasn't just birtherism, he believes in a lot of them. A Clinton-murder conspiracy has got to be catnip to him right now, desperate as he is. So Assange is floating one: and if Trump bites, Assange plays him a bit, leads him on, his statements get more extreme...and then Assange pulls the rug out.

Comment Moore's Law ended years ago, for many (Score 5, Insightful) 133

The author is the son half of a father/son duo, Dan and Jerry Hutcheson, who wrote an article for Scientific American in 1996 on the expected coming end of Moore's Law, somewhere around 2003-2005. It was one of the many predictions that Intel liked to deride as it pushed feature sizes on down below the wavelength of deep-ultraviolet light, a remarkable achievement.
And no doubt Hutcheson will be in for more mocking about how Moore's Law will continue until we're using subatomic particles.

But for me, Moore's Law ended right around the 2003-2005 they predicted. My big IT interest isn't phones and low-power computing, where Moore's is continuing - yes, possibly for longer than Hutcheson predicts - but raw desktop performance at number-crunching big databases. There's been progress there since 2005, but most of it has come from faster memory, SSDs, and more cores. Raw horsepower progress continued, even exponentially - but not at a 2-year doubling after about 2005; it was more like 3, then 4, then 5 years. I should have titled this "Moore's Law has been winding down for a decade, for many".
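
To put numbers on that slowdown, here's the annual improvement implied by each doubling time - straightforward compounding, nothing more:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Annual performance growth implied by a given doubling time.
for my $years (2, 3, 4, 5) {
    my $rate = 2 ** (1 / $years) - 1;
    printf "%d-year doubling = %.0f%% per year\n", $years, 100 * $rate;
}
```

Note that a 4- or 5-year doubling works out to 15-20% per year, which is exactly the generational jump described next.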

The new "Skylake" generation of i7's is mostly about low-power progress. A genuine jump for us power users is coming in the fall, I think, after a couple of years since the last one...and the chips should be 15% or 20% faster than 2014's. Just not like the late 90s and doublings every year or two.
