
Comment Re:A minor epiphany (Score 1) 281

Thanks for that. The damning statement is that all these people - whom the rest of us actually regard more highly than rocket scientists (who haven't put anybody on the moon lately), since biomedical scientists could save our lives - are "computer illiterate".

There was a time when the excuse for being computer illiterate was age; the dang things just came at business too fast. But now I'm the retired one, explaining simple Excel features to people 20 years younger. These "biomedical researchers" are mainly under 45 - that is, they've had computers since junior high and Windows since college; they've had Excel to study for 20 years, their entire careers.

I saw it with engineering - I was the formal IT guy for 7 years, then switched to become one of the engineers, albeit the local power-user and covert developer. I had expected to become obsolete as I aged, overrun by the superior expertise of people who grew up with computers, programming in elementary school. And there was ONE hacker, 20 years my junior, who could outstrip me on complex bits of configuration and development - and oddly enough, he had become a techie while working as a biomedical technician, writing Perl scripts to parse endlessly long DNA strings. But then there were nearly 100 engineers in the same company who would make the most eye-rolling mistakes and never even try to gain any underlying understanding of why the spreadsheet does what it does.

Over and over and over, I would correct something and try to teach some basics, but be put off with a request to just fix that exact problem - they were in a hurry. Not infrequently, they would be back in six months, asking me to do it again: "I forgot, I'm sorry, what was that again?" The uptake on a little bit of real instruction on the second go-round was better, but still under 50%.

Poor understanding of how to use computer applications is still the greatest barrier to using computers to improve productivity.

Comment Re:Be a Licensed Profession, folks... (Score 1) 331

The same reason these companies expensively imported people rather than sending the work to their country?

The same reason you go to an American physician rather than to India?

The same reason you have your bridge designed by American engineers rather than Indonesians? (Hint: it's a different reason on that one - it's simply not legal to build the bridge otherwise. What if it weren't legal to put a car on American roads without software from licensed programmers? The same applies to the rest of engineering...)

Comment Be a Licensed Profession, folks... (Score 4, Interesting) 331

I'm wearying of it, but so far I just post the same thing over and over when I read about this topic. You don't see this with comparable white-collar high-knowledge professions like accounting, teaching, law, medicine and engineering. ...because they are all licensed.

This is not about unionism or protectionism. It's not holding onto the job for nationalism's sake or racism. Any race can get a license; indeed, foreigners can be licensed - if they can pass the tests. Most of this outsourcing is not about bringing in equivalent people; it's about being able to afford more of them to make up for the lower productivity and accuracy.

Information technology should be a licensed profession for multiple reasons; there are a lot of crappy local programmers who shouldn't have such jobs, too. This isn't about handy helpers or kids' games any more: our civilization depends on code that works right, and we lose money, privacy and opportunity every day from IT failures. Medicine was not a licensed profession just a few generations back; it was licensed when it was time. For IT, it's now time.

Comment People confuse "old" and "bad" (Score 1) 671

The last several years of my career - which was as an engineer who did a lot of programming, not a programmer... but the IT department was so hard to get hold of, or get results from, that I ended up doing a lot of "favours" around the office - I did a bunch of web pages with Perl scripts.

I never got into JavaScript or even much DHTML, so these looked like 1993 web pages. They had a few simple forms with a couple of text boxes, a radio button set, etc. You could get customized reports with them that IT just didn't have time for... and I could add a new feature in an hour or two. So they grew like coral over a few years, and we ended up with several of them before the requests wound down.
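The pattern was about as simple as web programming gets. Here's a minimal sketch of that 1993-style form-to-report flow - in Python rather than the original Perl, with invented names and stand-in data, just to show the shape of it:

```python
# A sketch of a 1990s CGI-style report page: parse a simple form
# submission, pull some data, emit bare-bones HTML. Parameter names
# ("dept", "year") and the data rows are hypothetical stand-ins.
from urllib.parse import parse_qs

def report_page(query_string: str) -> str:
    """Turn a query string like 'dept=piping&year=1998' into an HTML report."""
    params = parse_qs(query_string)
    dept = params.get("dept", ["all"])[0]
    year = params.get("year", ["?"])[0]
    # Stand-in for the real data pull the script would have done.
    rows = [("Widget A", 12), ("Widget B", 7)]
    body = "".join(f"<tr><td>{name}</td><td>{count}</td></tr>"
                   for name, count in rows)
    return (f"<html><body><h1>{dept} report, {year}</h1>"
            f"<table>{body}</table></body></html>")
```

A script like this is trivial to tweak, which is exactly why adding a feature took an hour or two rather than a project plan.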

IT was not happy...not because I'd used poor programming practice. (I have a CompSci degree too, I know how to comment and write clear code.)

No, IT thought it was awful because of the 1990s Perl/CGI architecture; only Microsoft tools should be allowed, this was unmaintainable, etc. ("Unmaintainable" remained the line even after a junior engineer took over maintenance when IT wouldn't. He had the code figured out in a few weeks.)

I guess my point is you need to put "Bad" in quotes, because it's always an opinion...and lots of people mistake "bad" for "not the current fad".

The entire reason this worked is that these were *small* problems; IT could have done them easily, but each report only served a few people, and IT was consumed with Big Systems that served everybody. That's also why only a few of them were needed; it was a small "market". So I guess another point is that you don't always have to use the giant Official Corporate Development Environment Hammer to hit every nail. Those tools are chosen to handle Big Problems, but the attendant bureaucracy can be too heavy to nimbly solve small ones. Be open to small, simple, script-sized solutions. IT people constantly call those "Bad", usually with dire warnings that they will grow into spaghetti-code monsters that will suck up all your money.

With respect, what the F do they know? They only get called in for those monsters. They may be unaware there are twenty times as many out there that did NOT grow into monsters and the small solution was just right and ran for a decade. So the next time you're pretty sure you just have a Small Problem, tell IT to stuff it and solve it yourself.

Comment Re:Its a continuation (Score 5, Insightful) 254

Something can look incremental but actually be pretty dramatic. We're kind of spoiled by Moore's Law having a doubling time of just a few years.

Increases in battery life have been "incremental" but also exponential - something like 7% per year on average, which is a ten-year doubling. And of course, we ate most of it with higher power consumption in most battery-powered devices: the phones, tablets and laptops. But look at how long something simpler like an iPod lasts now compared to 2001, and it's dramatic.
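The 7%-a-year claim and the ten-year doubling are the same statement; a quick sketch of the compound-growth arithmetic:

```python
# Check of the arithmetic above: a ~7%/year compound improvement
# doubles capacity in roughly ten years.
import math

def doubling_time(annual_rate: float) -> float:
    """Years to double at a given compound annual growth rate."""
    return math.log(2) / math.log(1 + annual_rate)

years = doubling_time(0.07)  # ~10.2 years
```

The same formula explains why the slowdown in any exponential trend shows up as a stretching doubling time rather than an abrupt stop.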

Electric cars are going to get much more serious after one more doubling, and while the car companies would pay billions to have it happen overnight, it's still going to happen within 10 years even at that "incremental" rate of progress.

Comment What? No "it's a trap" theories? (Score 3, Insightful) 705

Assange IS this story, and his name comes up here surprisingly seldom. Here's a game plan for you: Trump is an extreme case of conspiracy theorist; it wasn't just birtherism, he believes in a lot of them. A Clinton-murder conspiracy has got to be catnip to him right now, desperate as he is. So Assange is floating one: and if Trump bites, Assange plays him a bit, leads him on, his statements get more extreme... and then Assange pulls the rug out.

Comment Moore's Law ended years ago, for many (Score 5, Insightful) 133

The author is the son half of a father/son duo, Dan and Jerry Hutcheson, who wrote an article for Scientific American in 1996 on the expected coming end of Moore's Law, predicting it around 2003-2005. It was one of the many predictions that Intel liked to deride as they pushed feature sizes down below the wavelength of the ultraviolet light used to print them, a remarkable achievement.
And no doubt, Hutcheson will be in for more mocking about how Moore's will continue until we're using subatomic particles.

But for me, Moore's Law ended right around the 2003-2005 they predicted. My big IT interest isn't phones and low-power computing, where Moore's is continuing - yes, possibly for longer than Hutcheson predicts - but raw desktop performance at number-crunching big databases. There's been progress there since 2005, but most of it has come from faster memory, SSDs, and more cores. Raw horsepower progress continued, even exponentially - but not at a 2-year doubling after about 2005; it was more like 3, then 4, then 5 years. I should have titled this "Moore's Law has been winding down for a decade, for many".

The new "Skylake" generation of i7s is mostly about low-power progress. A genuine jump for us power users is coming in the fall, I think, after a couple of years since the last one... and the chips should be 15% or 20% faster than 2014's. Just not like the late 90s, with doublings every year or two.

Comment You mean "documentation" ? (Score 1) 239

I also write notes to people, like this one, with the usual amount of English that isn't really necessary - like a quarter of the words in this sentence.

A certain amount of redundancy in communication is still around, though, yes, languages tend to drop and slur words over the centuries. Interestingly, military communications, where you'd think speed was very important, tend to be written quite formally, that is to say, with lots of that redundant English. Turns out that clarity is even more important than speed when lives are on the line!

I bet that your really important code that runs SCADA and other real-world systems has the most ponderous, overstated, tedious and obvious code of all. The same way that surgeons say, "Hand me the #3 rib spreader" rather than "Gimme that" while gesturing in a general direction.

Actually declaring and destroying your variables - stating what they are in clear, rather than by implication, and making clear when they are no longer needed - should be considered documentation for the programmers, even if the mechanism will perform identically without it.
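As a sketch of that point - in Python, where none of this is required and the runtime behaves identically without it, so every line of it is purely documentation for the next programmer (all names invented):

```python
# Explicit declaration and teardown as documentation: the type
# annotations and the del add nothing at runtime here, but they
# state what each variable is and when it stops mattering.
retry_limit: int = 3                 # type stated outright, not left to inference
raw_buffer: bytes = b"\x00" * 1024   # declared with its intended size

# ... work with raw_buffer ...
checksum = sum(raw_buffer) % 256

del raw_buffer                       # says "no longer needed" - the closest
                                     # Python gets to destroying a variable
```

A reader now knows the buffer's lifetime ends at the del, without tracing every later line to see whether it's still in play.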

Comment Re:Impressive (Score 1) 106

> What telecoms — correctly — object to, are efforts by local governments to compete with them. ...yeah, they don't permit competition from their subsidiaries.

If that were anything besides pro-telecom BS, there would be more than two competing businesses, individuals, or non-profits in most American markets. America's the Land of Entrepreneurs - you don't think anybody in America had this guy's idea? Those folks were almost all shut down, generally clubbed with the help of a compliant government that works for the industry.

So we always have just the two offerings, who have, mysteriously, the same price, though they use completely different infrastructures. Just like TV happens to cost the same whether delivered by cables that were paid off by the early 90s, or satellites 40,000km overhead. What are the odds such different technologies would cost exactly the same to the consumer?

Bottom line, you don't get to use the "compete" word about the telecom, cable, or internet industries in the US. They are not competitive, compared to world-wide figures, because they simply do not compete with each other; they divide up markets, send each other signals as to the common price, and enjoy high profits as rentiers who own an oligopoly.

The Spanish market is competitive *by comparison*, and yet the massive companies that should be able to beat a bunch of hobbyist amateurs - with their economies of scale and PhDs by the squad - are instead being beaten at their own game. Because even they are not all that competitive.

When there are more than six providers competing in a marketplace, you can use the "C" word to describe the situation; so says classical economics theory, as confirmed by many, many observations in many markets for many products. Fewer than six, and they don't have to meet in a smoke-filled room to agree on pricing; the signals are sent in the prices themselves, and fewer than six can quietly agree not to get cut-throat.

Comment The writers hadn't thought that far (Score 1) 359

Economist Manu Saadia recently wrote "Trekonomics" about the supposed economics of the ST Earth back home. There was a panel discussion at a con featuring Saadia's friend Paul Krugman and Chris Black, a writer for ST: Enterprise. Black was asked what the writers thought the characters' motivations were in a post-scarcity economy, and he begged off: in short, it never crossed their minds.
Neither set of writers gave a care for why the background they needed for their story was the way it was, any more than the guy painting a backdrop of a hilly landscape for a play cares about the geology that produced the landscape.

Comment Re:Great. Want 5,000 of them? (Score 1) 180

Cheer up. Storing (collecting, stacking, housing, guarding, insuring) 5000 terminals for 15 years would probably cost more than $25,000 per year, which with interest would negate your $500,000 in today's value.
*People* in their homes can make use of "unused storage area" in their basements (until the basement is a horror) to keep around something that'll maybe be useful way down the line, but corporations can't afford to. Except that you could keep a few bits of equipment stashed in your own office area, like everybody does.
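The arithmetic behind "with interest would negate your $500,000" is just compounding the annual storage cost forward. A rough sketch, with an assumed 4% interest rate (the rate and the $25,000 figure are illustrative, as above):

```python
# Compound the annual storage cost forward to see what 15 years of
# $25,000/year really adds up to. The 4% rate is an assumption.
def future_cost(annual_cost: float, rate: float, years: int) -> float:
    """Future value of an annual cost stream, each payment compounded
    at `rate` from the year it is paid until the end."""
    return sum(annual_cost * (1 + rate) ** (years - y)
               for y in range(1, years + 1))

total = future_cost(25_000, 0.04, 15)  # roughly $500,000
```

So even a modest-sounding carrying cost eats the entire eventual resale value.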
