
Comment: Re:Verilog (Score 1) 365

by Asmodae (#45912409) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
To be fair, the definition of "well" I intend isn't an arbitrary X/Y value. There are already well-defined performance numbers for the hardware that currently runs the algorithm. For a transfer to custom hardware to go "well," the result would have to beat the original general-purpose CPU by enough to justify the design effort involved, without costing MORE to manufacture. All engineering decisions are trade-offs, and if a trade-off isn't worth the effort and resource cost, you don't make it. For a transfer effort to go "well" means that at the end of the day you come out ahead somewhere.

If you have to spend 3 million dollars on custom hardware development just to reach performance parity with a COTS general-purpose CPU... you'd be hard-pressed to call that "well" by any measure. That's what's implied by the setup of the original Ask Slashdot question, which asks an engineering question about feasibility and cost.

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45910513) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
Not really. The biggest conversion issues I deal with (when converting algorithms to hardware) relate to how software treats RAM versus how hardware treats RAM. They are fundamentally different modes of operation. In software, RAM is cheap or free, so it is preferred over CPU cycles. In hardware, processing is (in general) cheaper and RAM is more expensive.

Buffering and holding a megabyte of data between each stage of processing is natural and very easy for software. But in hardware this is a very inefficient way to do things. Converting from one method to the other can be quite difficult depending on the algorithm.
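As a minimal sketch of the hardware-friendly alternative (the signal names and the "transform" function are hypothetical, not from any real design): instead of buffering a whole block between stages, hardware typically streams data through registered pipeline stages, a sample per clock, so only a handful of registers sit between stages.

```
-- Hypothetical streaming stage: one sample in, one sample out per clock.
-- A few registers replace the megabyte of buffer RAM a software-style
-- store-then-process structure would demand between stages.
process (clk)
begin
  if rising_edge(clk) then
    if in_valid = '1' then
      out_data  <= transform(in_data);  -- some combinational function
      out_valid <= '1';
    else
      out_valid <= '0';
    end if;
  end if;
end process;
```

The hard part of the conversion is restructuring the algorithm so it can tolerate seeing each datum only once, in order, rather than random-accessing a big buffer.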

Comment: Re: Verilog (Score 1) 365

by Asmodae (#45903655) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
Nice point - a for loop in software is used for iteration, while for-generate in hardware is used to generate new instantiations. The similarity in words/syntax is a dangerous trap. The closest software analogue is putting malloc or new() inside a loop. It's a great construct when you need many similar bits of hardware, and completely wrong for iteration.
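A minimal sketch of the distinction (the entity and signal names here are hypothetical): in VHDL, for ... generate elaborates N physical copies of the hardware, one per index, all existing simultaneously.

```
-- for-generate replicates hardware: this elaborates N separate adder
-- instances wired in parallel, not a loop that executes N times.
gen_adders : for i in 0 to N-1 generate
  u_add : entity work.adder
    port map (
      a   => a_vec(i),
      b   => b_vec(i),
      sum => sum_vec(i)
    );
end generate gen_adders;
```

Note that even a plain for loop inside a process gets unrolled into combinational logic at synthesis time, so neither construct gives you runtime iteration the way software does.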

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45903589) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?

The original wording was "Some C algorithms may never transfer well into a hardware implementation." At least in my mind the transfer process is what might not go well... not how the final product may or may not run. Having some experience here I understood the transfer to be where the work/expense would be. And those are ultimately key factors you would use to base your decision about whether or not to go ahead and make the conversion.

I don't think we disagree on content, just on what might have been meant by Andy's post. Especially considering exactly what you said for the reasons you said, making claims of impossibility would indeed be silly.

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45903415) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?

Unless the algorithm requires all those special instructions and monster ram to run.... at which point your custom hardware looks very much like the CPU and system it is intended to replace, and definitely not cheaper unless you're selling a whole lot of them. Reliable hardware is expensive to build even when it's a simple design iterating on previously known good hardware. Starting from scratch on raw silicon takes millions of dollars, just for your first chip lot, not to mention all the man hours to get it there and subsequent revisions. There are lots of algorithms that don't make any sense (from a cost vs efficiency standpoint) to port to custom hardware. That's the whole reason the generic CPU exists in the first place.

I guess I'm disagreeing with your definition of better. If it's faster but costs too much for anyone to actually buy it, it isn't better.

Comment: Re:Verilog (Score 1) 365

by Asmodae (#45901951) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
He didn't say "may not transfer at all", he said "may not transfer well". Also remember that the algorithm isn't running on just any old bit of hardware; it's running on a modern CPU with lots of special instructions, a gigantic RAM attached to it, and potentially other peripherals for special functions (hardware RNG, etc.). It might very well not be reasonable to convert all of that to a custom FPGA/ASIC for the cost involved.

Comment: Re:Matlab has a solution for this, but $$$ (Score 2) 365

by Asmodae (#45901717) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
They have a tool that can do this; I don't know if I'd call it a 'solution' just yet though. We've just finished ripping that 'solution' out of our project because we wanted a device that was actually small enough (and thus cheap enough) to be able to sell.

It takes input designed to be hardware and makes good hardware. It takes input designed to be software and makes shit hardware. It also doesn't handle version control very well; you need proprietary tools to even VIEW the design files... and the output which actually describes the hardware (VHDL) is so obfuscated as to be nearly illegible. The build times are also 4-5 times longer than they need to be, so it takes a whole day to place and route the designs this tool outputs. Unless you're building something trivial, I wouldn't advise depending on MathWorks/Simulink tools for a solution.

Comment: Re:Difficult, but... (Score 1) 365

by Asmodae (#45901667) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?

VHDL is basically programming

Sure, the same way software is basically just english and letters and numbers and if you understand those you can do most any software yourself! /sarc.
VHDL is code, but having cleaned up after software people who think they can write VHDL, I can tell you it's not the same thing at all. The key statement is:

Sure, you'll need to develop a fair understanding of the hardware

This is by no means a light or trivial task. There are entire university degrees dedicated to it. ;) But if you have all THAT, then sure, writing hardware code is a snap! In all seriousness, the above statement basically says it's easy if you already have the skill set.

Just don't make the mistake of thinking that because you understand BOTH hardware and software they are equivalent, or that everyone else shares your expanded understanding. I've seen programs fail because people tried to treat hardware like software simply because both are captured as text. It's a dangerous viewpoint if you want your project to succeed.

Comment: I've done this before (Score 4, Informative) 365

by Asmodae (#45901555) Attached to: Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?

Several people have suggested using a high-level synthesis tool to convert your software (C/C++) directly to some kind of HDL (Verilog/VHDL). This can work; I've been on this task and seen the output before. The catch is: unless that software was expressly and purposely written to describe hardware (by someone who understands that hardware, its limitations, and how that particular converter works), it almost always makes awful and extraordinarily inefficient hardware.

Case in point - we had one algorithm developed in Simulink/Matlab that needed to end up in an FPGA. After 'pushing the button' and letting the tool generate the HDL, it consumed not just 1 but about 4 FPGAs' worth of logic gates, RAMs, and registers. Needless to say, the hardware platform only had one FPGA, and a good portion of it was already dedicated to platform tasks, so only about 20% was available for the algorithm. We got it working after basically re-implementing the algorithm with hardware in mind. The generation tool's output was 20 times larger than what was even feasible. If you're doing an ASIC you can just throw a crap-load of extra silicon at it, but that gets expensive very quickly. Plus, closing timing on that would be a nightmare.

My job recently has been to take algorithms written by very smart people (but oriented toward software) and re-implement them so they can fit on reasonably sized FPGAs. It can be a long task, and there's no push-button solution for getting something good, fast, and cheap. Techies usually say you can pick two during the design process, but when converting from software to hardware you usually only get one.

Granted, this all varies a lot and depends heavily on the specifics of the algorithm in question. But the most likely way to get a reasonable estimate is to explain the algorithm in detail to an ASIC/FPGA engineer and let them work up a preliminary architecture and estimate. The high-level-synthesis push-button tools will give you a number, but it probably won't describe something people actually want to build, sell, or buy.

Comment: Re:Instagram didn't replace Kodak (Score 1) 674

by Asmodae (#45891989) Attached to: The Internet's Network Efficiencies Are Destroying the Middle Class
Pretty much correct:

Here's a comment I made a while back about this same situation:

If I recall correctly it had more to do with some arbitrary and insane insistence on 'Consumer Imaging' being the business focus, which is why you got cheap consumer cameras (EasyShare) and printer docks (with attempts to cash in on printer-paper consumables), but little prosumer gear and only the occasional/rare super-high-end imagers (like those used in telescopes, etc.).

This is also why they sold off/spun off their profitable medical imaging and chemicals groups, and have tried several times to get rid of their profitable Document Imaging group (high-end, high-speed document scanners). They've constantly pushed themselves into the most difficult and price-competitive market possible: cheapo consumer cameras. I think the ultimate goal was to keep some kind of grasp on the photo-printing business as their cash cow, with consumables manufacturing/selling. To be fair, they still do a good job printing pictures, but with rare exceptions people don't really want/need that anymore. And the people who still do prints do them in-house or have local labs do the work.

Kodak's management has always been married to consumables and services, as encapsulated by the mantra "You push the button, we do the rest." It's like some creepy love affair with George Eastman. Most of the outright wrong directions Kodak has taken can be traced back to trying to preserve that philosophy. Being from the Rochester area made Kodak's fall a bit sad to watch, but it was still very predictable.

To those people saying Kodak wasn't a camera company: Kodak made the first and best professional digital cameras, as well as medium/large-format digital camera backs and other digital sensors. It was a management decision not to aggressively pursue that tech in the consumer space with gear that didn't treat the consumer like a moron. Not to mention all the custom-designed software/drivers with non-standard GUIs, which were expensive to build from scratch and horrendous to use. Every Kodak product or service was focused on consumables and draining the customer of as much cash as possible, not on providing as much value as possible.

Incidentally Eastman Chemical (spun off several years ago) seems to be doing just fine.

Comment: Re:congrats guys and gals (Score 1) 293

Are you serious? These guys are in damage control now that their complicit behaviour towards the NSA has been revealed. They are protecting their profits and that is it.

IF that's all it is, then it means enough of their customers care about privacy to noticeably affect their profits. How is THAT not at least a little bit of good news? Up till now I assumed nobody but a few hardcore geeks/techs cared at all. Maybe all this public discussion is bearing some fruit after all?

Comment: Re:Here's the game changer... (Score 1) 108

by Asmodae (#45603527) Attached to: Valve Joins the Linux Foundation
Elite Dangerous may have raised 2.5 million on Kickstarter, but Star Citizen raised 6 million during its Kickstarter campaign. Funding has continued, and Star Citizen is now up to 34 million dollars at the time of this writing and climbing fast. I'm sure you've heard of Wing Commander? Freelancer? Privateer? All Chris Roberts games, and he's leading Star Citizen. Might be time to pay a little attention.

Comment: Re:Proprietary on top of linux = no control for us (Score 4, Insightful) 271

by Asmodae (#44951193) Attached to: Valve Announces Hardware Beta Test For 'Steam Machine'

Think carefully about those statements. Here are some possible consequences of SteamMachine:

Failure - Status quo is maintained.

Success (even moderate success) - LINUX gains a huge user base dedicated to gaming. The calculus of game developers and publishers with regard to LINUX development and LINUX ports does a complete 180. Native support for LINUX games becomes something publishers might actually consider worthwhile instead of "WTF is LINUX?".

Success and Valve turns evil - Games will be made to natively support LINUX so they run on the Steam console hardware platform of the day. DRM can and will be circumvented as always, but now the games will run on LINUX instead of Windows.
