What I find interesting is that despite the popularity of Android apps written in Java, Java was included as a "legacy" language by the article.
Apparently the writer of the original article thinks "legacy" means that you have to maintain and enhance existing applications instead of developing new ones.
To me, "legacy" means that there are no new applications being developed in that language, and the only jobs available for it are maintaining and enhancing existing applications.
And a pilot who loses an eye does so well without its sensor, right?
That presumption seems to be predicated on the theory that a computer intelligence won't "grow" or "learn" any faster than a human. Once the essential algorithms are developed and the AI is turned loose to teach itself from internet resources, I expect its actual growth rate will be near exponential until it's absorbed everything it can from our current body of knowledge and has to start theorizing and inferring new facts from what it's learned.
Not that I expect such a level of AI anytime in the near future. But when it does happen, I'm pretty sure it's going to grow at a rate that goes far beyond anything a mere human could do. For one thing, such a system would be highly parallel and likely to "read" multiple streams of web data at the same time, where a human can only consume one thread of information at a time (and not all that well, to boot). Where we might bookmark a link to read later, an AI would be able to spin another thread to read that link immediately, provided it has the compute capacity available.
The key, I think, is going to be in the development of the parallel processing languages that will evolve to serve our need to program systems that have ever more cores available. Our current single-threaded paradigms and manual threading approaches are far too limiting for the systems of the future.
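To be fair, even today's languages can express the "spin another thread to read that link immediately" idea without manual thread management, as a minimal sketch in Python's standard concurrent.futures shows. The read_stream function and URLs here are hypothetical stand-ins, not anything from a real crawler.

```python
# Sketch: consume several "streams" concurrently instead of one at a time,
# bounded by available capacity (the pool size). read_stream and the URLs
# are placeholder assumptions for illustration.
from concurrent.futures import ThreadPoolExecutor, as_completed

def read_stream(url):
    # Stand-in for fetching and digesting one web resource.
    return f"digested {url}"

urls = ["http://example.com/a", "http://example.com/b", "http://example.com/c"]

# Each URL is handed to its own worker thread; results arrive as they finish.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(read_stream, u) for u in urls]
    results = [f.result() for f in as_completed(futures)]

print(sorted(results))
```

The point of the comment still stands, though: this is manual orchestration bolted onto a single-threaded model, not a language that parallelizes naturally.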
Which phones with 128MB or 256MB of RAM run a modern version of Android?
KDE on Debian or any other distro tends to provide the most "XP like" user interface that I've seen. You just need to enable double-click mouse behaviour instead of the default single-click, add a few of their favourite apps to the desktop, and they're good to go.
If you're on an old system, you'll want to disable the file indexing daemons as well, as they can consume a lot of CPU and slow the machine down. If all the main user does is email and web browsing, they're not going to benefit from the indexing.
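Assuming a KDE Plasma 5 desktop, both tweaks can be scripted roughly like this; exact config keys and tool names can vary between Plasma versions, so treat it as a sketch rather than a recipe.

```shell
# Sketch, assuming KDE Plasma 5. Key names may differ on other versions.

# Double-click to open files/folders instead of the default single-click.
kwriteconfig5 --file kdeglobals --group KDE --key SingleClick false

# Stop the Baloo file-indexing daemon so it doesn't chew CPU on old hardware.
balooctl disable
```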
I download about 30 hours a week, but I don't actually watch any of it. I just archive it so I have something to watch should I ever actually want to numb my brain.
And most of what I actually do watch isn't new material, but stuff that's been off the air for a few years -- like Red Dwarf.
Here's the problem: most open source software isn't owned by US authors. So the software is developed and maintained with absolutely no concern about anal-retentive American military "requirements." You can hardly take a global project and demand that people from certain nations stop contributing so that you can ship the software to a US market without getting into trouble for "conspiring" with those nations.
Quite frankly, the law is asinine anyhow. There is no shortage of places around the globe to download and access the full code and binaries of "restricted" software from those nations, because there are other nations who participate in open source projects that don't kiss American ass.
So as far as I'm concerned, RedHat is doing what is necessary to continue using open source software.
To truly meet the American legal requirements, they'd have to rewrite and lock down an insane amount of software -- including replacing the Linux kernel.
You have an interesting attitude considering that every license I've ever seen revokes your right to use the software if you breach the license terms. The terms vary, but the penalty of losing the right to use the software is universal.
I'm not a lawyer either, but FYI even if the judge had agreed to dismiss the charges, that would not be binding on other courts. It would not have become binding unless one side or the other appealed to the circuit court and got a decision there. That decision would then become binding on *only* that circuit.
I don't check libraries for security vulnerabilities. I check websites for information about that, and to see how often the provider is refreshing the library with patches and fixes.
If I don't get the feeling that they take their security seriously, I don't use the library. I'm not about to start testing every library of the OS that I build against, nor the Java stack itself. To do so is asinine unless you're in an extremely high security arena -- you have to start with a certain level of trust, and if you don't trust your vendor, don't use them.
Besides, none of the binary analysis tools I've ever heard of does a really good job. Even source code analysis can miss bugs. If it were possible to fully automate testing in such a fashion, testers wouldn't have jobs.
Ah. I get it. The coupon site sponsored the "research."
Well, if they did their research by calling their own customers, no wonder we got the low end of the IQ scale responding.
Wait a minute! Where's this "coupon site" people were talking about? This one is an LA Times article. They may be ad supported, but they're hardly one of the shady coupon distribution sites. Did the article get re-linked to a more reputable source?
I know significant numbers of the over 60 population who avoid and ignore all things digital save for their satellite TV receivers. And the only reason they have those is because cable wasn't available in their area.
Yeah, sure, I could laugh and point at the "dumb Americans", but it's not dumb Americans -- it's dumb people, and we've no shortage of them around the world. After all, as George Carlin pointed out: Think about how stupid the average person is, and remember that half the population is dumber than that.
Besides, as many have already pointed out, this whole article is clearly a slashvertisement to give eyeballs to a piece of shit coupon site.
The fellow who wrote the original code used a library I'd never heard of for MySQL connectivity. He didn't know how to use SQL properly. He didn't know how to error check results. Hell, he didn't even know how to sort data for the users, as they'd been asking him to for months.
But no, he left the company and the steaming pile of crud was dropped in my lap to fix.
By the time I was done stabilizing the thing, there must have been a whole 10% of the original code left.
Just because it's possible to write readable and maintainable PHP doesn't mean it happens any more often than with Perl.
I've never started a PHP project, but I've been called on to fix several.
Nowadays I deny any and all knowledge of PHP and refuse to get suckered into fixing someone else's hack job of code ever again.
PHP sucks farts off dead chickens in the hands of an amateur, and 99% of the people who "recommend" PHP are amateurs.