Usually, "epoxy" around the edges of a BGA chip is neither an anti-hacking attempt nor a light-proofing attempt. It's called underfill, and its chief purpose is to increase mechanical strength and make the bond more durable than tiny bare solder balls would be on their own.
Yes they are. Most multimedia processing is parallelizable, and thus benefits greatly from SIMD instructions - for example, just about every CPU-based video codec ever. If you want an actual example, I wrote a high-performance edge detection algorithm for laser tracing, with its convolution cores written in optimized SSE2 assembly, and am hoping to write a NEON version. It'll never run reasonably on the original Raspberry Pi because it's too underpowered to do it without SIMD (I didn't even bother writing a plain C version of the cores, because honestly any platform without SSE2 or NEON is going to be too slow to use anyway).
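To give a flavor of what SIMD buys you here (this is an illustrative kernel, not the actual laser-tracing code): a horizontal gradient |p[x+1] - p[x-1]| over 8-bit pixels, processing 16 pixels per iteration with SSE2 intrinsics, plus a scalar version for comparison.

```c
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stdint.h>

/* Scalar reference: |p[x+1] - p[x-1]| per pixel. */
static void edge_scalar(const uint8_t *src, uint8_t *dst, int n)
{
    for (int x = 1; x < n - 1; x++) {
        int d = src[x + 1] - src[x - 1];
        dst[x] = (uint8_t)(d < 0 ? -d : d);
    }
}

/* SSE2 version: 16 pixels per iteration. */
static void edge_sse2(const uint8_t *src, uint8_t *dst, int n)
{
    int x = 1;
    for (; x + 16 <= n - 1; x += 16) {
        __m128i l = _mm_loadu_si128((const __m128i *)(src + x - 1));
        __m128i r = _mm_loadu_si128((const __m128i *)(src + x + 1));
        /* |r - l| via two saturating subtractions OR'd together:
         * one operand is 0, the other is the absolute difference. */
        __m128i d = _mm_or_si128(_mm_subs_epu8(r, l),
                                 _mm_subs_epu8(l, r));
        _mm_storeu_si128((__m128i *)(dst + x), d);
    }
    for (; x < n - 1; x++) {  /* scalar tail for leftover pixels */
        int d = src[x + 1] - src[x - 1];
        dst[x] = (uint8_t)(d < 0 ? -d : d);
    }
}
```

On ARM you'd write essentially the same loop with NEON intrinsics (which even have a single absolute-difference instruction, vabdq_u8) - which is exactly what ARM11 lacks.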
Obviously you can use SIMD instructions for a lot more, but multimedia is the obvious example. And as I mentioned, the Pi makes up for it for standard codecs only with its GPU blob decoder, but that doesn't help you with anything that isn't video decoding (e.g. filtering).
ESP8266 only became a "thing" last year, so the community is still growing. But the manufacturer is cooperating and is releasing open SDKs, and the hobbyist community is enthusiastic about it. I personally intend to use a bunch of them to automate things around my apartment, so I guess I'll find out just how good/bad it is.
That's for developing on the ESP8266 core itself - if you just want to use the default firmware, plug it into your existing microcontroller platform (e.g. Arduino) and you get wireless connectivity and a TCP/IP stack (running on the module) with some trivial AT commands. Not as cheap since you're still using a separate core as the main app host, but still a really cheap way to add WiFi to something.
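For example, with the stock AT firmware, joining a network and opening a TCP connection is just a handful of commands over the serial line (the SSID, password, and address below are placeholders):

```
AT+CWMODE=1                           station mode
AT+CWJAP="myssid","mypassword"        join the access point
AT+CIPSTART="TCP","192.168.1.10",80   open a TCP connection
AT+CIPSEND=5                          announce a 5-byte payload...
hello                                 ...then send it
```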
There's a difference between established industrial designs, where there is an argument for maintaining compatibility and an existing codebase, and hobbyists, who can quite happily move up the chain and are always looking for cool new stuff in other respects. Even in product development, some companies go out of their way to use ridiculously outdated, expensive chips. That usually only flies in non-consumer applications, where they can afford to throw more money at a chip vendor to keep making outdated chips at outdated prices (which sometimes even rise); for consumer products, the competition will undercut you by using newer, cheaper chips if you don't. For hobbyists, it actually pays off to upgrade - you get better toolchains (no need to deal with all the ROM/RAM/pointer type shenanigans of AVRs on ARM), better debuggability, etc. Of course, that doesn't mean you should jump onto any random chip - toolchains and ecosystems vary wildly in quality - but it's a shame that so many people just stick with the old instead of trying something new.
There's nothing wrong with the Tiny series - little 6- and 8-pin chips are still the market where AVR/PIC make perfect sense, and I'll be the first to admit that I've used a PIC12F629 as a dual frequency generator in a project. But as a flexible platform for hobbyists, I'd much rather have a Cortex-M3 over an ATmega. Back when I was using PICs more often, my approach was to, every few years, re-evaluate my personal selection of PICs. I'd go through Microchip's (extensive) part database, look at the prices, and see if anything caught my eye, then order some samples. My 8-pin of choice used to be 12F508, then 12F629. For 18-pin I went from 16F84 to 16F88. 28-pin, 16F876 to 18F2520 and 18F2550 for USB. 40-pin, 16F877 to 18F4520 to 18F4550 for USB. I tried dsPIC at one point but didn't like it; by then ARM was picking up steam and it didn't make any sense. I haven't really looked at their line-up in a while, since I've mostly moved on to other chips for interesting stuff and stick to my old PICs for small quick/dirty hacks since I have a bunch in my drawers to get rid of, but you get the idea. It never made any sense to me to get stuck with one particular obsolete part or range.
Yup, all the other aliexpress pages I was looking at for the same phone said MTK6517, and I didn't notice that the one that I chose was different (I was just going for the lowest price, though the difference was a few bucks). Turned out to be the more accurate one it seems, since it matches the actual device that I have.
A7 is actually decent. It's low-end (as far as ARMv7 application processors go) but reasonably modern (late 2011, which isn't too bad). Nobody's asking for a bleeding-edge CPU in something like the Pi, but a 2002 vintage core wouldn't have made any sense.
TFA used to claim that it was still ARM11. They just edited it a few minutes ago. I stand transitively corrected.
I actually tried to look up any official announcements to corroborate the fact that it was still ARM11 before posting my first comment (because it just felt so dumb), but found none, no mentions of the new chip on Broadcom's site, nothing. I guess they trusted El Reg with the scoop and they screwed it up.
It seems they just edited the article. It used to claim it was still ARM11.
Too bad Slashdot doesn't allow editing comments... oh well. I guess my first post on
Whoops, you're right. Other pages claimed it was an MT6517, but I just checked:
Processor : ARMv7 Processor rev 3 (v7l)
processor : 0
BogoMIPS : 2589.52
processor : 1
BogoMIPS : 2589.52
Features : swp half thumb fastmult vfp edsp thumbee neon vfpv3 tls vfpv4 idiva idivt
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x0
CPU part : 0xc07
CPU revision : 3
Hardware : MT6572
(0xc07 means Cortex-A7)
If you want to control a few motors and lights with network connectivity, get some ESP8266 modules - those are WiFi modules with a user-programmable 80MHz 32-bit CPU that you can buy for $5. Throw in a Cortex-M0 as a slave device to control your I/O (which can be as cheap as $1 in single quantities - yes, you can get a 32-bit CPU for $1 these days). That's what state-of-the-art 2015 silicon gets you for this task. A Raspberry Pi with a WiFi dongle is an order of magnitude more expensive and overpowered (and yet underpowered relative to what it claims to be, which is a Linux platform).
You're confusing low-end with outdated. An ARM Cortex-M3 or M4 board would be a low-end board suitable for tasks such as motor control, while being reasonably modern, and cheaper than the Raspberry Pi. An ARM Cortex-A5 or higher would be modern and suitable for running Linux. ARM11 isn't low-end, it's high-end and outdated.
Raspberry Pi suffers from exactly the same problem as the Arduino: both are based on an ancient, woefully outdated platform. Just because performance is "good enough" for whatever your idea of "good enough" is, doesn't mean it makes any sense whatsoever to stick to cores that are 10 years old or older. Moving up to modern designs gives you more bang for the same buck, or less buck for the same bang. In the silicon industry it just makes no sense whatsoever to lag 3 generations behind for something like this. Newer designs are built in newer process nodes, scale to higher frequencies, and cost less to manufacture for the same performance. Being at the bleeding edge of silicon is expensive, but drop down a generation or so (relative to whatever field you're interested in) and that's the price/performance sweet spot. Using older stuff just doesn't make sense.
This keeps happening over and over and over again. When I started embedded programming, back when the PIC16C84 was released (the first microcontroller to feature EEPROM program memory, soon followed by the PIC16F84 Flash version), it stirred up a hobbyist revolution. No longer did you need expensive EPROM burners, UV erasers, and expensive UV-windowed chips with an erase cycle measured in minutes! And yet 5 years later people were still using the same damn PIC16F84, with its sole timer and just about no other features, when you could buy a PIC16F88 for 2/3 the price and get three timers, built-in analog-to-digital conversion, a serial port/UART, SPI/SSP, PWM, an analog comparator, a built-in 8MHz oscillator, and more RAM and Flash.
Then the Arduino happened, and even more people joined what came to be called the maker movement. And us longtime PIC users rolled our eyes because we'd been doing it for years and we didn't need no steenking breakout boards for a trivial 8-bit chip, but hey, C compilers for PICs sucked, and AVR was a better architecture anyway, and so Arduino deservedly became popular. But then the silliness started to set in again: ARM came up with Cortex-M3 and Cortex-M0, and you could buy a 32-bit chip running at 4x the clock rate for the same price as the AVR in the Arduino, and yet even today people keep using AVR-based Arduinos when the microcontroller world has moved on. People are even sticking FPGA shields on an Arduino, which is like sticking a GTX970 on a Pentium MMX. You could implement the entire AVR inside that FPGA and run it faster than the real one sitting underneath. Why this madness? Because Arduino is popular and people are scared to move on.
And now with Raspberry Pi it's the same thing all over again. When the Pi came out it almost had a good excuse: even though its CPU was obsolete, and Broadcom's idea of making a powerful GPU chip and sticking an old CPU "on the side" was dumb, let's face it, nobody was building Linux-capable SBCs at that price point. But that's no longer the case; you can buy much more capable boards for the same $35 today. Why on earth would they release an updated model with an updated chip in 2015 that still uses the same damn architecture, 12 years out of date? It just makes no sense; the only reason I can come up with is internal politics at Broadcom (trying to sell off outdated chips/designs for cheap, resistance from their GPU division to having a more powerful CPU in there, or something like that).
Phones come with a touchscreen, speakers, microphones, WiFi, Bluetooth, GSM, and an accelerometer, which also cost money. Probably more money than a few connectors on a board.
Android has an NDK to develop native apps that target the CPU instruction set directly. Unreal Engine for Android isn't written in Java.
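A minimal sketch of what that looks like (the class and method names here are made up for illustration): a C function compiled per-ABI by the NDK toolchain, callable from Java through JNI.

```c
#include <jni.h>  /* from the NDK sysroot */

/* Implements `public static native int addNative(int a, int b);`
 * declared in a hypothetical Java class com.example.Demo.
 * The NDK compiles this to machine code for each target ABI
 * (armeabi-v7a, arm64-v8a, x86, ...); no bytecode involved. */
JNIEXPORT jint JNICALL
Java_com_example_Demo_addNative(JNIEnv *env, jclass clazz, jint a, jint b)
{
    (void)env; (void)clazz;  /* unused here */
    return a + b;
}
```

The resulting shared library gets loaded with System.loadLibrary(), and performance-critical code (codecs, SIMD kernels, game engines) lives on the native side.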
Well, the Chinese have managed to design a phone with a screen, dual radios, WiFi, Bluetooth, FM radio, and a dual core Cortex-A9 CPU that can be effectively sold for $40 or so (if you buy it in China, not online). If the Chinese can build in a CPU core that's two generations newer into a product with support for 3 radio standards and a screen that sells for $5 or so more than the Pi, why is Broadcom struggling with an outdated 12-year-old core on a product with no wireless?
You can buy a dual core Cortex-A9 Android phone in China for about $40, give or take. And that comes with a screen. Sorry, SoCs are dirt cheap these days and the price point isn't an excuse to ship a 12-year-old core (seriously, ARM11 came out in late 2002).
Why are they still shipping the same CPU core that was in the iPhone 2G? ARM is at least 3 generations ahead already. ARM11 doesn't have NEON (proper SIMD) instructions, so it's crap for multimedia processing (sure, they make up for it on the usual codecs with their GPU core, but that doesn't help if you want to write your own code).
Seriously, when the Pi first came out one of my first complaints was that the CPU core was woefully outdated and I already owned several boards with much more recent ARM cores, and several years later they still haven't upgraded it? WTF? What sense does this make? Does Broadcom not have a license for more modern ARM cores? Are the licensing fees too high to ship it in a low-cost product? (Answer: no, plenty of Chinese SoC vendors are doing it). What's the issue?