By some accounts Jack Tramiel eventually doomed himself when he took over Atari. Atari's suppliers wanted to renegotiate their contracts when Jack came on board because they had been so burned by Commodore that they didn't want any wiggle room in the new contracts.
I once contracted for a medium-small company that was under contract with Disney to supply services. Disney was a royal pain in the ass to this company in that they were super picky, but the company used them for bragging rights when attempting to sign on other clients. Eventually they dropped Disney when they realized the bragging rights were not worth the abuse.
I bet he's going to make the Middle East stable...
The best way to deal with the M.E. is to butt out. Our tinkering has made it worse far more often than better. I wish the Office of the President were split into a domestic prez and a foreign-policy prez.
I'd vote Ron Paul for foreign-policy prez in a heartbeat. I just don't like his domestic plans.
It's what the next Duke Nukem will be written in.
Then name a language Cloud++ and it will fly off the shelves (even if it's a steaming mass of unicorn farts).
They just ask for 15 years of experience in R&R regardless... and get it (per their claims).
Several coders once told me, "you gotta learn to lie better" when I was struggling to find a new gig. Dealing with HR is a game.
C# - As a language, ignoring MS's platform-lockage API games, it seems to tick off the fewest people. And one can use its brother, VB.Net, if one doesn't like the punctuation-heavy style and/or prefers type descriptors on the right of variable names.
If you master its "different" framework, perhaps you are right. But the problem is that the learning curve is too high. A master swordsman can probably beat a generic cop with a gun in an urban environment. However, it takes a heck of a lot of training to reach that point. Cops with guns are cheaper and easier to find and train.
(It's fine for light-duty "gluing and scripting", but people are trying to do OS-like things with it.)
Ruby will probably fail to go mainstream for the same reason Lisp has. It's wonderfully flexible in that it's almost a meta language that allows you to shape your "language" into just about any construct you want.
The downside is that everybody thinks differently, and shaping a language to fit your head de-fits it for other heads. Standards are often preferred because they provide consistency between individuals and teams even when they don't perfectly fit a specific situation in terms of parsimony and compactness of expression.
The lesson of the market is that inter- and intra-team communication trumps parsimony economically, in most cases.
But it's a bear to shut off
If you need a 20-30 ms initial access time and then a constant transfer rate of 20-50 MB/sec, tape is completely useless because it can't meet the initial access time, and SSD is pointless because it overdelivers without adding any value above that threshold. I.e., for bulk data that gets streamed, such as basically any large dataset like video, price per TB is the factor that overshadows everything else.
IOPS is of course hugely important for the average utter-crap database written by an intern that devolves into 512-byte random-access read/write patterns, which seems to be what 'enterprise solution' means these days. But the disastrous consequences of that usually keep the data sets within whatever fits on a comparatively small and cheap SSD, since anything much slower than (basically) the processor's L1 cache will make the application too slow to use.
Indeed. And when reaching larger capacities, it's quite likely that you're dealing with largely sequentially accessed streamed data, i.e., video, where you have a maximum needed transfer rate which the HDD is entirely capable of fulfilling, which means the SSD adds zero value for the price premium.
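The tiering logic in the posts above can be sketched as "pick the cheapest medium that still meets the workload's latency and throughput floor." The figures below are rough, assumed ballpark numbers for illustration, not vendor specs:

```python
# Illustrative sketch: cheapest storage medium meeting a workload's
# latency and throughput requirements. Numbers are assumed ballparks.
MEDIA = {
    # name: (initial access time in ms, sustained MB/s, rough $/TB)
    "tape": (60_000, 300, 10),
    "hdd":  (10,     150, 25),
    "ssd":  (0.1,    500, 100),
}

def cheapest_fit(max_access_ms, min_mb_per_s):
    """Return the cheapest medium meeting both requirements, or None."""
    fits = [(price, name) for name, (ms, mbs, price) in MEDIA.items()
            if ms <= max_access_ms and mbs >= min_mb_per_s]
    return min(fits)[1] if fits else None

# The streamed-video workload from the post: ~30 ms seek, ~50 MB/s sustained.
print(cheapest_fit(30, 50))  # hdd: tape fails the seek, SSD just costs more
```

With these numbers the HDD wins the streaming case exactly as argued: tape misses the access-time requirement, and the SSD's extra speed buys nothing the workload can use.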
Well, the Samsung 3.2 TB drive claims that you can read/write the entire drive every day for five years before failure. It's my understanding that at one point, SSDs were notorious for gradually declining over time, but that today's generation of SSDs basically has reliability out the wazoo. I can't quote you stats on it, but anecdotally, I've had a couple of SSDs in my computer for several years now, I leave it on 24x7, and I've never had a problem.
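The endurance claim works out to a simple back-of-envelope figure; writing the full 3.2 TB once a day (one drive-write per day) for five years gives:

```python
# Back-of-envelope for the endurance claim above: one full-drive
# write per day over a 5-year period.
capacity_tb = 3.2
years = 5
total_writes_tb = capacity_tb * 365 * years
print(total_writes_tb)  # 5840.0 TB written before the rated limit
```

That's roughly 5.8 petabytes of total writes, which is why modern drives are often rated in "drive writes per day" (DWPD) rather than raw hours.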
The laws of probability suggest humanity is doomed, at least in our current form. We as human ponderers are roughly a random sample of all humans pondering their existence. About 60 billion humans have come before us, which suggests roughly just 60 billion more will come after us, since we are most likely to be in the middle of the pack rather than near the beginning or the end of it. (Roughly the Copernican principle as applied to human population density and time.)
If most of our future is to be Borg-like, then we'd more likely be Borgs contemplating our existence, not humans. But for us, here and now, to be at such a coincidental position would violate the Copernican principle. Either way, we are either doomed to end soon or to become Borg-like; neither is a pleasant thought.
Of course coincidences do happen and we may indeed coincidentally be at the start of the human expansion curve; but if I were in Vegas, I wouldn't bet on it. We're doomed, guys.