My gut instinct says that very rarely do people in the public eye follow totally altruistic agendas, particularly when it comes to issues like this that have little to do with the common good. If you dig deep enough you can find special interest trails that more often than not uncover these people's true motivators. Just follow the money.
You're exactly right! I've done tons of "exploratory" coding over the years myself, either using some new techniques or new products, just for the fun of it or to learn something new. But that was always on small, low-visibility projects where the consequences of potential poor performance or other issues would be insignificant. To tread new ground on something this big, with national visibility, is foolish. You'd want the most well-established, known-reliable, and performant tools and techniques possible. I played around a bit with these object-based databases in the early days, and for some uses, such as simple dictionary-type access to a serialized object, they're good. But the query syntax is almost always XPath-oriented and less than optimal for the complex joins you'd find in massive systems like this. I guess they've learned their lesson now.
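To make that trade-off concrete, here's a toy sketch using plain Python dicts — no real object-database API implied, all names made up for illustration. Key-based access to a serialized object is trivial; anything join-like means walking objects by hand, which is exactly what a relational engine plans and optimizes for you:

```python
# Toy illustration only: an "object store" modeled as a dict of
# serialized objects keyed by ID. Not any particular product's API.
store = {
    "order:1001": {"customer": "cust:7", "total": 99.5},
    "cust:7": {"name": "Alice", "region": "EU"},
}

# Dictionary-type access to a serialized object: O(1), dead simple.
order = store["order:1001"]

# A "join" (order totals per customer region) means manually
# traversing object references -- no query planner helping you.
totals_by_region = {}
for key, obj in store.items():
    if key.startswith("order:"):
        region = store[obj["customer"]]["region"]
        totals_by_region[region] = totals_by_region.get(region, 0) + obj["total"]

print(totals_by_region)  # {'EU': 99.5}
```

Scale that hand-rolled traversal up to a massive system with dozens of object types and the appeal of a mature relational engine becomes obvious.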
I was mainly addressing your jackass "you obviously don't..." response to the guy above. As you mentioned, the article contains no technical details, so neither you nor I have any basis for a great debate. The only fact is that various technologies DO exist to dramatically increase the effective throughput of optical drives; the question is whether any of them will be used in this particular product.
You obviously don't understand SATA3--that's 6 Gbit/s, which given the encoding used is 600 MB/s. While I haven't seen the technical details of these 300GB discs, historically the bit density has increased in both dimensions with each new generation of technology. I would also assume that to reach such high capacities they will go multi-layer, which opens the door for parallel reading of all layers. Furthermore, there is a lot of research into multi-track reading and writing technology, where the laser is split into multiple beams which are read back by a linear sensor array. That way a number of adjacent tracks can be read in a single revolution and reconstructed in a buffer into a single data stream that's effectively N times faster, N being the number of tracks. That, plus the higher bit density and multiple layers, could come close to saturating the SATA3 bus. I'm sure the engineers working on 300GB discs haven't simply missed the data-transfer-bottleneck elephant in the room.
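The arithmetic above is easy to check. A back-of-the-envelope sketch, assuming SATA3's standard 8b/10b line encoding; the per-track optical read rate is a made-up illustrative number, not a spec:

```python
# SATA3 effective throughput: 6 Gbit/s line rate, 8b/10b encoding
# (8 data bits carried per 10 line bits).
SATA3_LINE_GBPS = 6.0
ENCODING_EFFICIENCY = 8 / 10

sata3_mbytes_per_s = SATA3_LINE_GBPS * 1e9 * ENCODING_EFFICIENCY / 8 / 1e6
print(f"SATA3 effective: {sata3_mbytes_per_s:.0f} MB/s")  # 600 MB/s

# Multi-track reading: N adjacent tracks read per revolution give
# roughly N times the single-beam rate. 50 MB/s per track is an
# assumption for illustration only.
SINGLE_TRACK_MBYTES_PER_S = 50.0
for n_tracks in (1, 4, 8, 12):
    combined = SINGLE_TRACK_MBYTES_PER_S * n_tracks
    print(f"{n_tracks:2d} tracks: {combined:.0f} MB/s")
```

Under those assumptions, somewhere around a dozen parallel tracks (before even counting density and multi-layer gains) is where the drive, not the bus, stops being the bottleneck.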
Ironic reversal indeed! Turns out Sun were right when they said that the network is the computer, although not quite for the reasons they thought.
Wait, let me check...nope, no relation.
To put it another way, as time goes on we're running out of simple ways of arranging magnets in novel configurations to create machines never before seen, or variations on this metaphor. Creating truly new things requires orders of magnitude more work and knowledge.

Look at battery technology, where the most groundbreaking improvement of the last century has been the Li-Ion battery, and that's no panacea either. Or how about the PEM fuel cell, potentially the holy grail of electric power generation? Scientists have been tinkering for decades to come up with a better and more efficient membrane and cheaper catalysts, and considering the potential payoff, the fact that there's still no revolutionary breakthrough means this has to be seriously difficult stuff.

Oh, and how about flat-screen TVs? The stuff of science fiction for over a century, they are finally here, common and cheap and almost mundane, and they replaced the CRT faster than anybody could have predicted, in a few short years. Yet nobody is blown away by them because they weren't suddenly revealed at some big event with pomp and circumstance. We witnessed their excruciatingly slow gestation from the crappy LCD watches of the early 80s, to the passive monochrome LCDs of the first generations of laptops, to the much, much better active-matrix LCDs, to the first ridiculously expensive TV sets, to finally today's $200 Walmart special. After over thirty years of familiarity they're just not that flash anymore. Yet if you brought someone from the 60s here today, (s)he'd be mesmerized by this miraculous new technology.
Holy crap, here's the actual extract from http://www.stallman.org/archives/2006-may-aug.html#05, specifically the entry at 05 June 2006:
"I am skeptical of the claim that voluntarily pedophilia harms children. The arguments that it causes harm seem to be based on cases which aren't voluntary, which are then stretched by parents who are horrified by the idea that their little baby is maturing."
So unless his domain was hacked and these aren't his actual views, let me just say WOW!
Incidentally, the parent poster presents some pretty widely held and well founded views, and even backs them up with references to the actual words of the person he attacks, and he still gets modded down? Welcome to
Let's revisit that:
"...with the purpose of creating an open-source gun [...] that can be downloaded from the internet and printed out."
Right, because what's really holding back modern society is this frustrating lack of weapons availability. I can hardly wait for 3D nano printers so EVERYBODY can download their own Ebola virus from the internet and print it at home!
Umm, where do you think it came from--surely not the back of a truck?! That's the power cell for his animatronics!
Actually, from all my observations their Achilles heel is their slow-ass web services. Both the desktop and mobile web apps initially load quickly enough; it's the subsequent data pulls via web services to refresh GUI fragments that lag badly. I haven't checked the granularity of their services, but on mobile, fine granularity is particularly bad because of the typically atrocious latency (although LTE is heaps better than HSPA). Regardless, even on the desktop site you can see these ws call latencies: when you click to view a detail, sometimes it pops up instantly, and sometimes you get the spinner cycling endlessly with apparently nothing happening until you eventually have to F5 the site. Their ws API has always sucked lemons, and that's biting them in the ass particularly badly on mobile.
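A rough model of why fine-grained services hurt so much on mobile: sequential round trips dominate everything once each GUI fragment triggers its own request. The RTT figures below are assumptions for illustration, not measurements of any particular site:

```python
# Latency model: total wait is dominated by the number of sequential
# round trips, not payload size. All numbers are illustrative assumptions.
def total_latency_ms(n_items, round_trip_ms, batched=False):
    """One request per item when fine-grained; one request total when
    the API supports fetching everything in a single batched call."""
    n_calls = 1 if batched else n_items
    return n_calls * round_trip_ms

LTE_RTT_MS = 50    # assumed typical LTE round trip
HSPA_RTT_MS = 150  # assumed typical HSPA round trip

# 20 GUI fragments, each refreshed by its own service call:
print(total_latency_ms(20, HSPA_RTT_MS))               # 3000 ms on HSPA
print(total_latency_ms(20, LTE_RTT_MS))                # 1000 ms on LTE
print(total_latency_ms(20, LTE_RTT_MS, batched=True))  # 50 ms, one batched call
```

Even with LTE's better RTT, twenty fine-grained calls is a full second of spinner; a coarser, batched API makes the network latency almost disappear.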
Famous statement attributed in various forms to various people throughout history. Duell's actual statement (provided that was attributed correctly) was the exact opposite of this.
Bless their hearts...
You're right of course; in terms of the GUI the Mac was better thought out and more flexible. Frankly, I preferred its look and feel to the Amiga's until 2.0 arrived. I also remember its movable memory architecture, where each memory access happened relative to a dynamic offset that the OS could change at any point, which allowed it to move chunks of memory around as it saw fit without upsetting their owner apps.

I'm not really sure the actual design of the OS had as much to do with the success of the Mac, though, as did the marketing machine behind it. Commodore was self-destructive in every way imaginable; they couldn't recognize a good thing if it bit them in the ass. They could have easily taken the Mac OS and turned it into the same failure as the Amiga if given the chance. We loved the Amiga despite Commodore, while many people love the Mac because of Apple.

It was a great machine for its time, the Linux of the day for us renegades back then. But when the writing was on the wall that it was going nowhere, around 1992, I swallowed my pride and bought a PC. While Windows 3.1 did suck majorly, with Windows 95 it became good enough that one could finally concentrate on the apps rather than fight the OS. Nowadays it seems the OS is becoming less and less relevant with so much moving online. I routinely swap between Windows and Ubuntu, and may even give the Hackintosh a go. But this new wannabe Amiga? Nah...
And gamers, don't forget gamers! Because in the 80s the Amiga wasn't just an awesome development platform but also by far the best gaming platform. No telling how many of today's avid gaming Gen-Xers started on the Amiga. For me it was the other way around, though: I got to the Amiga because of the games and flash, and it dragged me tooth and nail into the programming world, making my future career all but inevitable.

I will confess that before getting the Amiga I was actually seriously in love with the Atari ST, because it LOOKED like a more polished product with its crisp monochrome screen and no-nonsense, clean GUI, plus of course that built-in MIDI interface that beckoned to a budding musician. But after endlessly poring over magazine articles and learning more about the technology behind the two platforms, the choice was clear. While both the ST and the Mac (which I sure as heck couldn't afford anyway) had more polished UIs, their underlying OSs were quite primitive compared to the Amiga's. Few non-technical people realized just what a quantum leap that OS was. It really took until Windows 95 and its new 32-bit code base (disregarding its legacy 16-bit support crap) for a mainstream OS to replicate what the Amiga had had since 1985. The Mac only got there with OS X.

What's most amazing, though, is that the original Amiga 1000 did all that with a paltry 256K of RAM. It's fascinating to look under the covers and see the memory-efficient design of the ROM, with much of the OS executing in place in ROM, and the shared linked-list-based data structures that were used for everything from memory maps to sprites to sound. You could so easily see the various subsystems independently at work when the OS went belly-up with a Guru Meditation while the sprites still merrily moved across the screen and the sound still played (provided those data structures hadn't been trashed). Oh, and who can forget the Amiga tracker scene of the day, with all those cool MODs?
Anyone remember the Axel F one? Fire off some of those for your friends and they'd be all gobsmacked--wow, a computer can do THAT?!