Is there anyone outside M$ who didn't see that coming?
Virus and antivirus suppliers have a symbiotic business relationship: each requires the other to keep making slow progress, rendering the old product useless so the new one can be sold. If either side 'won', it would cease being able to sell upgrades; the business model requires them not to win.
This was tried in Athens. What actually happens is that two-car families who have the option no longer take the smaller, less polluting car half the time, and lots of one-car families buy a really cheap, clapped-out, much more polluting second car to use on alternate days.
After Bing and Zune, I think they'll continue with the 'rejected 60s Batman fight scene captions' theme, and it'll be Splork, Zoing or possibly Ptoink.
If you're going to edit an article, don't just cut out the least significant words, or you'll be left with nonsense.
According to the summary, this laser somehow generates power instead of consuming it, and it generates "3.2 million watts of power", which is "more than 1 petawatt". A petawatt is 10^15 watts, so 3.2 megawatts falls short by more than eight orders of magnitude; presumably one figure is average power and the other is peak pulse power, but that's not what it says.
Rather badly, I guess?
While English in the British Isles used to have mutually incomprehensible dialects, the influence of received pronunciation has drastically lessened this, so I'd argue that it is coalescing.
Most people I know who have strong regional accents have the ability to switch to a neutral accent and cut out dialect words when they are in formal situations - an ability my father's generation learned from radio, my generation learned from radio and television, and my children's generation will learn from radio, television and internet. If you ask a Hebridean Scot to give a transcript of two Cockneys talking in a pub, or vice versa, they will struggle, but if you arrange a conversation between a Hebridean Scot and a Cockney they will simply both 'talk like the people on the telly' and understand each other perfectly well.
English was able to fragment because in the past, you rarely had to communicate with people from far away (which is how we ended up with prominent Americans who can't even say their own names properly... yes, Jay-ZED, I'm looking at you!) The internet changes all this, we now have regular interactions with people worldwide, so speaking or writing in a mutually incomprehensible way has penalties.
Perhaps we should consider the benefits of formalising 'correct' English, lest we be doomed to forever be re-translating Wikipedia into 'current' English?
No, I have a television that does that for me.
So, it's pretty much exactly the same as the $40 Korg Nanokey 2 I've owned for years, but it's waterproof and costs $99?
Why exactly am I meant to be impressed?
We also have land mines, fully autonomous killing machines with no discretion at all. At least with killer robots there's a chance that they might decide not to kill you, or understand that a war is over.
Usually, "epoxy" around the edges of a BGA chip is neither an anti-hacking attempt nor a light-proofing attempt. It's called underfill, and its chief purpose is to increase mechanical strength and make the bond more durable than tiny bare solder balls would be on their own.
Yes they are. Most multimedia processing is parallelizable, and thus benefits greatly from SIMD instructions - for example, just about every CPU-based video codec ever. If you want an actual example, I wrote a high-performance edge detection algorithm for laser tracing, with its convolution cores written in optimized SSE2 assembly, and I'm hoping to write a NEON version. It'll never run reasonably on the original Raspberry Pi because it's too underpowered to do it without SIMD (I didn't even bother writing a plain C version of the cores, because honestly any platform without SSE2 or NEON is going to be too slow to use anyway).
Obviously you can use SIMD instructions for a lot more, but multimedia is the obvious example. And as I mentioned, the Pi makes up for it for standard codecs only with its GPU blob decoder, but that doesn't help you with anything that isn't video decoding (e.g. filtering).
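To make the SIMD point concrete, here's a minimal sketch (not the actual laser-tracing code, just an illustration) of the simplest kind of convolution core, a [-1, 0, +1] horizontal gradient, written with SSE2 intrinsics rather than assembly, next to its scalar equivalent:

```c
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stdint.h>

/* Scalar reference: a [-1, 0, +1] horizontal gradient, the kind of
   convolution core an edge detector is built from. */
static void gradient_scalar(const uint8_t *src, int16_t *dst, int n)
{
    for (int i = 1; i < n - 1; i++)
        dst[i] = (int16_t)src[i + 1] - (int16_t)src[i - 1];
}

/* SSE2 version: 8 pixels per iteration. The two shifted 8-byte loads
   are zero-extended to 16-bit lanes and subtracted in one operation. */
static void gradient_sse2(const uint8_t *src, int16_t *dst, int n)
{
    const __m128i zero = _mm_setzero_si128();
    int i = 1;
    for (; i + 8 <= n - 1; i += 8) {
        __m128i right = _mm_loadl_epi64((const __m128i *)(src + i + 1));
        __m128i left  = _mm_loadl_epi64((const __m128i *)(src + i - 1));
        __m128i r16   = _mm_unpacklo_epi8(right, zero); /* widen to 16-bit */
        __m128i l16   = _mm_unpacklo_epi8(left, zero);
        _mm_storeu_si128((__m128i *)(dst + i), _mm_sub_epi16(r16, l16));
    }
    for (; i < n - 1; i++)  /* scalar tail for leftover pixels */
        dst[i] = (int16_t)src[i + 1] - (int16_t)src[i - 1];
}
```

The structure ports to NEON almost mechanically (it even has widening arithmetic that merges the extend-and-subtract), which is part of why a plain-C fallback buys you so little on chips that lack SIMD entirely.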
ESP8266 only became a "thing" last year, so the community is still growing. But the manufacturer is cooperating and is releasing open SDKs, and the hobbyist community is enthusiastic about it. I personally intend to use a bunch of them to automate things around my apartment, so I guess I'll find out just how good/bad it is.
That's for developing on the ESP8266 core itself - if you just want to use the default firmware, plug it into your existing microcontroller platform (e.g. Arduino) and you get wireless connectivity and a TCP/IP stack (running on the module) with some trivial AT commands. Not as cheap since you're still using a separate core as the main app host, but still a really cheap way to add WiFi to something.
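For reference, the stock AT-firmware session for opening a TCP connection looks roughly like this (the commands are from Espressif's AT command set; the SSID, password, and host are placeholders, the annotations aren't part of the commands, and exact response strings vary between firmware versions):

```
AT+CWMODE=1                          set station mode
AT+CWJAP="your-ssid","your-pass"     join the access point
AT+CIPSTART="TCP","example.com",80   open a TCP connection
AT+CIPSEND=18                        announce 18 payload bytes, then send:
GET / HTTP/1.0
```

(The 18 bytes are "GET / HTTP/1.0" plus the terminating CRLF CRLF.) Each command gets an OK or ERROR back and incoming data arrives as +IPD messages, so a small state machine on the host MCU is all you need.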
There's a difference between established industrial designs, where there is an argument for maintaining compatibility and an existing codebase, and hobbyists, who can quite happily move up the chain and are always looking for cool new stuff in other respects. Even in product development, some companies go out of their way to use ridiculously outdated, expensive chips. That usually only flies in non-consumer applications, where they can afford to throw more money at a chip vendor to keep making outdated chips at outdated prices (which sometimes even rise); for consumer products the competition will undercut you by using newer, cheaper chips if you don't. For hobbyists, it actually pays off to upgrade - you get better toolchains (no need to deal with all the ROM/RAM/pointer type shenanigans of AVRs on ARM), better debuggability, etc. Of course, that doesn't mean you should jump onto any random chip - the toolchains and ecosystems vary wildly in quality - but it's a shame that so many people just stick with the old instead of trying something new.
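The "pointer type shenanigans" deserve a quick illustration. This toy sketch (the table values are made up; the accessor names are the real avr-libc ones, and the non-AVR branch is a stub so it compiles on an ordinary host) shows the extra ceremony AVR's Harvard architecture demands for flash-resident data, all of which simply disappears on a Cortex-M:

```c
#ifdef __AVR__
#include <avr/pgmspace.h>   /* real AVR: flash reads need accessors */
#else
/* On ARM (or any host with a single address space) the AVR-isms vanish. */
#define PROGMEM
#define pgm_read_byte(addr) (*(const unsigned char *)(addr))
#endif

/* Flash-resident table: on AVR a plain pointer into it is NOT usable as
   ordinary data -- every read must go through pgm_read_byte(), and you
   can't pass it to code that expects a normal RAM pointer. */
static const unsigned char quarter_sine[] PROGMEM = { 0, 49, 90, 117, 127 };

unsigned char table_lookup(unsigned idx)
{
    return pgm_read_byte(&quarter_sine[idx % 5]);
}
```

On a Cortex-M the whole #ifdef goes away and quarter_sine is just a const array; that's the kind of friction you stop paying for when you move up.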
There's nothing wrong with the Tiny series - little 6- and 8-pin chips are still the market where AVR/PIC make perfect sense, and I'll be the first to admit that I've used a PIC12F629 as a dual frequency generator in a project. But as a flexible platform for hobbyists, I'd much rather have a Cortex-M3 over an ATmega. Back when I was using PICs more often, my approach was to, every few years, re-evaluate my personal selection of PICs. I'd go through Microchip's (extensive) part database, look at the prices, and see if anything caught my eye, then order some samples. My 8-pin of choice used to be 12F508, then 12F629. For 18-pin I went from 16F84 to 16F88. 28-pin, 16F876 to 18F2520 and 18F2550 for USB. 40-pin, 16F877 to 18F4520 to 18F4550 for USB. I tried dsPIC at one point but didn't like it; by then ARM was picking up steam and it didn't make any sense. I haven't really looked at their line-up in a while, since I've mostly moved on to other chips for interesting stuff and stick to my old PICs for small quick/dirty hacks since I have a bunch in my drawers to get rid of, but you get the idea. It never made any sense to me to get stuck with one particular obsolete part or range.
Yup, all the other AliExpress pages I was looking at for the same phone said MTK6517, and I didn't notice that the one I chose was different (I was just going for the lowest price, though the difference was only a few bucks). It seems that listing turned out to be the more accurate one, since it matches the actual device I have.
A7 is actually decent. It's low-end (as far as ARMv7 application processors go) but reasonably modern (late 2011, which isn't too bad). Nobody's asking for a bleeding-edge CPU in something like the Pi, but a 2002 vintage core wouldn't have made any sense.