
Comment Re:old clunky junk (Score 1) 170

From a hobbyist perspective, building that board is very hard.

The thing is, it really isn't. Assuming you're wiring things up on a breadboard, what you do is:

  • Put the AVR on the breadboard
  • Temporarily wire up an Arduino as a bootstrap programmer, and use the appropriate sketch to flash the bootloader onto your target AVR
  • Connect 5V and GND
  • Plug in a quartz crystal (if needed) to the xtal pins
  • Connect a USB to serial dongle/adapter to the serial pins
  • Done.
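As a concrete sketch of the bootstrap step (assuming an Arduino running the stock ArduinoISP example sketch on /dev/ttyUSB0 and an ATmega328P target with a 16 MHz crystal - the port, fuse values, and hex filename are assumptions to adjust for your setup):

```shell
# Flash a bootloader onto a bare ATmega328P, using an Arduino running the
# ArduinoISP example sketch as the programmer.
# Port, baud rate, fuse values, and hex path are assumptions - check your setup.

# Set fuses for a 16 MHz external crystal (values as used on a typical Uno)
avrdude -c stk500v1 -P /dev/ttyUSB0 -b 19200 -p atmega328p \
        -U lfuse:w:0xff:m -U hfuse:w:0xde:m -U efuse:w:0x05:m

# Burn the bootloader itself
avrdude -c stk500v1 -P /dev/ttyUSB0 -b 19200 -p atmega328p \
        -U flash:w:optiboot_atmega328.hex
```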

Sure, it's a couple more steps than "Connect Arduino to USB", but it's fewer steps than the average project requires for everything else.

Comment Re:old clunky junk (Score 1) 170

Yes, I was thinking of the Gameduino shield - it's completely silly. It even has a coprocessor inside the FPGA that is user-programmable and runs much faster than the Arduino. The SPI interface is a stupid bottleneck. It would make a lot more sense if they hadn't jumped on the Arduino shield bandwagon and had instead implemented it as a stand-alone product with its own CPU core (which could just as well have used the Arduino libraries if they'd wanted; it wouldn't require FPGA knowledge either).

Think about it - the Gameduino is the same idea as taking a modern GPU and connecting it to a 386 through the serial port.

Comment Re:old clunky junk (Score 1) 170

I agree that if you're just plugging shields together and if you don't care about size or cost and if you're not building any custom electronics then the Arduino makes sense, even in production.

But really, I'm not focusing on commercial production here, I'm talking about hobbyists. Not those plugging in shields, but those designing their own, and making their own widgets using the Arduino as a base for their own custom electronics design. Often these are people who are already designing their own boards, or at least building permanent soldered-together prototypes. People who by all measures are capable of throwing together a bare AVR... they just don't know it. Or people who randomly decide to make something useful and do a small run, perhaps still assembled by them but building more than a couple of units, and are too scared to learn how to route a tiny board with a micro and whatever else they need on it, because they don't realize just how easy it is. There is a very significant cargo cult culture here.

Code-wise, it is perfectly possible to use the Arduino libraries and software ecosystem with bare chips. That's also something that many people don't realize.
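As one possible workflow (a sketch, assuming arduino-cli and a USBasp ISP programmer; the sketch name and paths are illustrative):

```shell
# Build a normal Arduino sketch, then flash the resulting hex straight onto
# a bare ATmega328P - no Arduino board involved.
# The sketch directory "Blink" and the USBasp programmer are assumptions.
arduino-cli compile --fqbn arduino:avr:uno --output-dir build Blink

# Upload with any ISP programmer; ":i" tells avrdude the file is Intel hex.
avrdude -c usbasp -p atmega328p -U flash:w:build/Blink.ino.hex:i
```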

Comment Re:old clunky junk (Score 1) 170

So what are you plugging into the Arduino?

I'm not talking about the person who is still at the plug-shields-together stage and doesn't really know how to put more than 4 parts on a breadboard. Nor am I talking about some kind of custom solutions consultant who really doesn't care about efficiency and just wants to deliver the product ASAP with a minimum of effort and doesn't need it to be cost-optimized.

I'm talking about curious makers who start building things on top of the Arduino platform and understand basic digital electronics... but then somehow never move on from basing everything around the Arduino board, thinking there is some kind of magical pixie dust in there and that running a bare chip is rocket science. There are unfortunately a lot of them. People somehow end up doing truly interesting/novel designs that still have a "here goes the Arduino" mentality, as if it were some kind of religion.

Comment Re:old clunky junk (Score 1) 170

So, the FPGA shield is not as stupid as it might seem.

The dumb part is that you can stick an AVR-compatible core in the FPGA itself and skip the Arduino. Basically, an FPGA shield for an Arduino makes no sense when you could have a standalone FPGA board that is also Arduino-compatible by virtue of embedding an Arduino clone inside the FPGA itself. That would be a much better way of jumping on the bandwagon without doing something that is, design-wise, completely silly. It would run faster than a real AVR too.

Even if you can program a microcontroller in assembler, using direct port access, don't forget that not everyone can or wants to do that. Arduino is often used by people who can barely program but need some way to sequence things (artists, for example).

There's a threshold here. If you're using Arduino + off-the-shelf shields as a platform and not actually learning electronics, that's fine. That's a different story. Nobody expects you to roll your own board if you care more about the software or artistic side and don't care about the hardware.

But if you're using an Arduino and building your own designs around it, i.e. understand how to use the "black box" that is the Arduino and how to interface with it, then you really need to stop thinking about it as a black box and realize that you can just as well build your own design around a microcontroller, possibly the same one and using the same exact software initially. And then you have many more possibilities than you did when you were limiting yourself to off-the-shelf devboards.

Someone who makes an LED blink under Arduino has learned the basics of loops and sequences of instructions... Shields and other add-ons will help them go further...

Nothing wrong with that either. Again, to clarify, I'm talking about people who inexplicably learn enough about electronics to make their own designs around the Arduino yet refuse to ever touch a bare microcontroller for some reason, or anything that doesn't have the Arduino label on it. If you're just using off-the-shelf blocks and focusing on the software, or using it as a beginner's learning tool, then there is nothing wrong with that. I'm not saying Arduino should cease to exist or has no purpose. I'm saying there is a sizable segment of the maker community who inexplicably revere it as the be-all and end-all of hobby electronics and don't understand just how irrelevant it becomes after you learn the very basics of building your own hardware.

But it's also true that the number of available chips is huge, and selecting one may feel difficult (especially when only basic functionality is needed). So I can understand that people will end up stocking one or two "generic" models and sticking to them.

I was in that situation too, using the 16F84. And then one day I figured out that I could literally just go to the Microchip parts selector, plug in what features I needed, and pick the cheapest chip that fit the bill (or throw a few more things in just in case). I think one thing that is sorely lacking is documentation/tutorials on "from-scratch" circuit design - such as how to select a microcontroller from the thousands available.

But there is also the fact that the authors of learning material need to stop sticking to ancient crap. For example, the PIC16F88 was a fine replacement for the PIC16F84 when it came out (2003), with a lot more features in the same form factor for less money. There was literally no excuse to use the 16F84 in any new design or new educational material after that (there were other chips before it that fit the bill too; that's just the one that comes to mind and one that I used a lot myself). The people building the tools and devkits and writing the docs need to start actually using devices that are current as of the time they make their designs.

Now, sticking to families that you know is more reasonable, because there are many unknowns in jumping to a new chip family and you have to be willing to re-learn a lot of details. But moving between chips in the same family should be one of the things we teach anyone building their own circuits.

Arduino Nano compatibles bought from China end up very cheap, with USB, voltage regulator, quartz crystal... If you buy in some quantity, you may drop below $1.75/module. And the module is not much bigger than a DIP40... so yes, Arduino is a viable option, at least for medium-sized production.

But... why? Why use a module when you can just stick the AVR right on your board and get a lot more flexibility? It makes a lot more sense to buy the USB-TTL converter as a module, since at least that is pretty much universal. Or just use a serial connector and have the USB-TTL converter as an external cable for testing/programming if your device doesn't rely on USB being around to actually function.

Comment Re:old clunky junk (Score 1) 170

I don't see anything wrong with a USB HID interface for NES controllers. That's pretty much what AVR-style micros are just right for. Assuming you're using an AVR board with a native USB uC of course, like a Teensy++ or similar. I've done similar things myself, both using custom PIC designs and off the shelf AVR breakout boards. Also, LUFA, which I assume you're using, is great and in general much higher quality code than a lot of Arduino stuff.

If you're using software/bit-banged USB, you really should look into doing it right with a micro that has built-in USB. But that's by no means the worst micro abuse I've seen (and there is actually an argument for doing bit-banged low-speed USB in super low cost scenarios).

Similarly, although I'm not a huge fan of the Raspberry Pi for other unrelated reasons, it's a perfectly fine fit for NES emulation.

Further, using another micro for power management isn't too far off either. It's separate enough that I wouldn't want to throw it onto the micro doing HID. I would encourage you to eventually design a more integrated power control board without using a full-blown Arduino for it, though, and learn to do it using MOSFETs and the like instead of relays, but that's a learning path. I've used my share of relays for quick-and-dirty power control.

The abuses that I had in mind are nothing like that. It's things like people using an Arduino to run game logic for a video engine implemented in an FPGA (where they could just implement the micro itself in the FPGA and get something much faster than the Arduino without the Arduino), or, on the other end of the spectrum, people using more than one Arduino to do what is effectively blink LEDs (as a one-off/temporary project it's fine, but if you're doing more than one or doing it permanently you really should stop being scared of using bare microcontrollers and learn how to make your own design closer to the requirements - an Arduino is nothing more than a voltage regulator, a USB to serial bridge chip, and a bunch of wires). Or people building entire commercial products around Arduinos with no particular excuse for using them (like compatibility with other projects/modularity), just because they don't know any better and they are too scared to stick a bare chip on a breadboard.

Yes. I'd probably consider 10 or maybe even 20 to be the cut-off for putting more effort into it.

This tells me that you have the right idea, you just need to get over the mental barrier. I deliberately made the threshold low because it really is stupidly easy to use bare chips. An Arduino is nothing more than an AVR with pins broken out, a voltage regulator, and a USB-to-TTL-serial converter onboard. You can get exactly the same effect by sticking an AVR into a breadboard with a 7805, and an external USB to TTL dongle/breakout board. And since for a great many projects you don't need USB, you can keep that external as a debugging aid only. Voila, welcome to the most basic microcontroller circuit: power and ground. You literally don't need anything else (an external clock crystal is helpful for clock accuracy but not required for many applications).

Personally, my threshold is 1. I will use dev boards for microcontroller design prototyping, but if I'm ever making more than one, even for myself, I'll roll my own thing. Sometimes I don't even bother prototyping it with a dev board and go straight to a quick-and-dirty stripboard build or similar, if it's a one-off but so simple that I know it will work. I mean, why use a clunky Arduino or other dev board when all I really need is an 8-pin chip for a tiny task?

Comment Re:old clunky junk (Score 1) 170

This question is shit. How many more? 50? 500? It doesn't make sense to actually design your own circuit until the number gets up there someplace. Certainly at 5 I would just buy the Arduinos.

I would do my own design at 2. Sometimes at 1.

What you and the people with this problem don't realize is that it's downright trivial to stick the same micro that's on the Arduino onto a breadboard, give it 5V and ground, and it'll run. You're presumably already designing the rest of the circuit that plugs into the Arduino. Skip the damn thing and just use the chip it contains directly! The Arduino is a trivial piece of electronics. Count the parts on it. The most complex part of it is the USB-to-serial converter. You should buy a bunch of those from China (that's what I do; I always keep one in my backpack too), design plain TTL-serial ports into your circuits, and just use the converter when you need to talk to the chip (most designs do not need to be in constant communication with a PC to work).

Yeah, all that costs a lot more than just buying some $3 Arduino Nanos from China

No it doesn't. All it takes is to take the pins on your design that say "Arduino goes here" and instead plug in a bare AVR chip. Maybe give it a clock crystal and a voltage regulator. Suddenly, you don't need an Arduino any more. And now you have the freedom to pick a different chip model or family if it fits your design better, if you want. If you insist on treating the Arduino like a magical black box instead, you're not just throwing away money, you're refusing to learn. If curiosity got you interested in making electronics to begin with, why be scared of what's inside the box? The Arduino doesn't even have a case, it's not even hard to see that it really is just very few parts on a board!

Comment Re:old clunky junk (Score 1) 170

I completely agree about the chip choice issue. At least it's not PIC16/18, which were horrible for C (especially PIC16), but the maker world really needs to move on to ARM Cortex-M0 and the like. However, even worse than the ancient chip choice are the people blatantly abusing it to do things it's just a horrible choice for.

For the 5-100 crowd, you can get a small board with an ATMega on it, which costs less and takes up a lot less space in your project.

The thing is at that point you might as well use the bare chip! I really don't get this attachment to the Arduino, when all it really is is a voltage regulator, a USB to TTL chip, and a bunch of traces on a board. I think there is a huge number of people that don't realize that you can just take the same AVR that's on the Arduino, stick it on a breadboard, give it 5V and ground, and it'll run. Design your project with a TTL-serial port and just use an external USB UART dongle to talk to it when you need to.

There's nothing wrong with prototyping with dev boards like the Arduino, but way too many people build the things into permanent boxes or, worse, commercial products. That's not what they're intended for, or at least not what they should be intended for.

Comment Re:old clunky junk (Score 1) 170

We all know Arduino makes things easy. The problem is there is a huge segment of the maker community who get all excited about the blinkenlights but have no drive to venture outside their comfort zone, and another huge segment which makes no attempt at providing the tools, know-how, or encouragement for them to do that, and instead just keeps building on the hype-du-jour even when it doesn't make sense. This is how you end up with a large chunk of the community being 10 years behind the curve, or who will keep abusing Arduinos for things they're terrible/underpowered/overkill/otherwise unsuited for because "Arduino!" (one of the worst offenders I saw was an FPGA shield - which was tens of times more powerful than the Arduino it would sit on top of).

This is not new. Arduino isn't new either; it's just the one product/community that got popular. I was programming microcontrollers on breakout boards (you know, just like the Arduino) with built-in ISP programmers (you know, just like the Arduino, except we didn't have bootloaders back then) back when these things still connected via parallel port. And the problem was still there. Everyone was learning the PIC16F84 because that's what everyone knew, even though it was grossly out of date and you could buy a newer chip with dozens of new features and many times the memory, for less money, in a pin-compatible package, almost source-compatible (all you had to do was change the processor header file and add a couple of init instructions to turn off some new features). But people still stuck to the PIC16F84, even though there was absolutely no logical reason to do so, because of momentum. And then people kept doing dumb things, like using external R-C circuits as a horrible analog-to-digital converter, even though pretty much any newer chip had a built-in ADC, just because the PIC16F84 didn't have one.

Ask yourself this question: if you want to actually build 5 or more of a particular project, would you just get an Arduino for each? Or would you give a shot at researching the available microcontrollers, perhaps sticking to the AVR series, perhaps not, picking the right one, then making a custom board design, trying to optimize it a bit, and probably end up coming up with a much more robust, compact, and efficient design as a result, and learning a lot in the process? If your answer is the latter, then you aren't part of the problem. If your answer is the former, then you are.

Comment Where are the advantages? (Score 4, Insightful) 170

I have no doubt that old-school TIP series transistors still have plenty of uses today, but the article is completely devoid of any examples. All it is saying is "look, these things aren't unusably bad for driving motors - they're just bad." Tom's post is still dead-on - using old school NPN BJTs for switching heavy loads today is completely dumb, and just because he exaggerated a bit about just how bad it can get doesn't mean he's wrong.

I was hoping for some insight, like a discussion of robustness (I've blown FETs way more easily than I've blown BJTs), or perhaps use in analog applications, or anything else really. But nope. TFA is literally just confirming the findings that it's trying to disprove, while providing absolutely no counter-examples. Somehow feels like par for the course for Hackaday these days...

I use old school jellybean parts all the time, sometimes because it really doesn't matter (driving a relay? meh, throw a BC547 on it, who cares, it's relatively low power anyway), sometimes because it's all I have lying around, but sometimes using ancient devices is actually very dumb, and I wouldn't turn a motor on and off with a BJT these days.

Comment Re:What did you expect to happen? (Score 5, Insightful) 103

It *wasn't* a flaw. He didn't write an exploit, nor is this a security vulnerability. He just wrote a scraper for location metadata that was already there and was intended to be there. There is no vulnerability, just a demonstration of the extent of the data that is already normally, deliberately available. The only mention of "security" is in the Slashdot summary, which is garbage, as usual. The only thing the extension does is take location data that you can already see and plot it on a map.

Comment Re:Answer me this... apk (Score 5, Informative) 44

The answer is that it varies - GPUs are anywhere from mediocre to useless at "normal" crypto.

It depends on whether the particular encryption algorithm/mode in use is parallelizable or not. For example, CBC is not parallelizable - you have to encrypt each block of data serially. GPUs are useless at CBC mode encryption. More modern modes like GCM and XTS are parallelizable to an extent, as you can encrypt multiple blocks at once, but there is still a serial dependency in the process (there is no real way of completely getting rid of all dependencies while keeping the algorithm usefully secure), so you still need to do some pre or post-processing of the data in a serial fashion. And even then, you're limited by bandwidth in/out of the GPU.
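A toy illustration of the dependency structure (pure Python; a hash stands in for the real block cipher purely to show the data flow, this is NOT real crypto): in a CTR-style mode each block's keystream depends only on a counter, so blocks can be computed in any order or concurrently, while in CBC each ciphertext block is an input to the next.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

BLOCK = 16
KEY = b"toy-key"  # toy stand-in; illustrative only, not a real cipher


def E(data: bytes) -> bytes:
    """Toy 'block cipher': 16 bytes of SHA-256(key || data)."""
    return hashlib.sha256(KEY + data).digest()[:BLOCK]


def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))


def cbc_encrypt(blocks, iv):
    # Each ciphertext block feeds into the next one: inherently serial.
    out, prev = [], iv
    for p in blocks:
        prev = E(xor(p, prev))
        out.append(prev)
    return out


def ctr_encrypt_block(i, p):
    # Block i depends only on the counter i, not on any other block.
    keystream = E(i.to_bytes(BLOCK, "big"))
    return xor(p, keystream)


msg = [bytes([i]) * BLOCK for i in range(8)]
iv = b"\x00" * BLOCK

serial_ctr = [ctr_encrypt_block(i, p) for i, p in enumerate(msg)]
with ThreadPoolExecutor() as pool:  # CTR blocks can run concurrently
    parallel_ctr = list(pool.map(ctr_encrypt_block, range(8), msg))
assert serial_ctr == parallel_ctr  # same result regardless of ordering

cbc = cbc_encrypt(msg, iv)  # no such trick here: block i needs block i-1
print("CTR parallel == CTR serial:", serial_ctr == parallel_ctr)
```

The CBC loop cannot be split across workers the same way, because the input to block i is only known once block i-1 has been encrypted - which is exactly why GPUs can't help with it.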

Public-key crypto (RSA, DSA, and ECDSA) isn't really parallelizable either as it only deals with small data sizes. And typical hash algorithms like SHA-1 and SHA-256 are also not parallelizable in their construction.

Thing is, CPUs these days have hardware AES encryption acceleration, making this mostly a moot point. GPUs are good at doing the same thing many times in parallel, which is what breaking encryption requires, but not regular usage.

Comment Re:Quantum Computing Required? (Score 2) 294

This paper gives an interesting summary of different assumptions about how detailed a brain simulation needs to be and what they mean for when simulating a brain would be feasible (assuming Moore's Law continues indefinitely, which is obviously not guaranteed). The classical estimates go as late as 2201 depending on what assumptions you accept. See the tables on pages 79-81 for the summary. The quantum estimate is just a question mark; they didn't even bother computing the cost of using classical computers to simulate an entire human brain as a quantum system.

I have a theory that it's impossible to prove anything, but I can't prove it.