People are different. As you get older you'll start to understand this.
What is it that you miss so much from the Bush years? The trillion-dollar war searching for nonexistent WMDs? The insanely mismanaged response to Katrina? The disastrous economy, with bank bailouts and huge unemployment? The political outing of undercover CIA agents? The debunked evidence of yellowcake uranium presented as fact to the UN? The Abu Ghraib prisoner torture? How about his complete failure to find Osama Bin Laden? Or do you just genuinely miss having the rest of the world hate us?
No thanks. I'll take the current economy with its strong dollar & 5% unemployment, 15+ million people newly covered by health insurance, no more wars, a nuclear agreement with Iran, normalization of relations with Cuba, legalized same-sex marriage, strengthened bank regulations (without hurting the economy), a dead Osama Bin Laden, DADT repeal, etc. BTW: it's also nice to have a president who can speak in complete sentences.
Good luck with Jeb!
If you don't like what Apple is selling, then buy something else.
Most computers never get upgraded. Apple thus made the reasonable tradeoff to sacrifice upgradability to make smaller, simpler, and more durable products. If you think this was a bad tradeoff, then you are free to buy something else. I'm perfectly happy with non-upgradable HW and I consider myself an informed shopper, not a gullible blind sheep. Different people care about different things.
Maybe it's time to stop caring so much how other people choose to spend their money.
Your point is that Zuckerberg has better low-level technical chops than Jobs did. Sure. Bill Gates is probably better than both of them. Very possibly many [if not most] people here on Slashdot are better than all three.
I think limiting the label "inventor" to only people who solder or code is pretty narrow and would likely exclude a lot of people who are technology visionaries and considered "inventors" in the zeitgeist. Jobs was certainly a tech visionary.
Facebook is a website. It uses existing web technologies and runs on existing hardware. It isn't the first social network, and it's hard to even identify any Facebook features that didn't exist on earlier platforms. It is the very definition of taking existing technologies and making them usable--and very arguably is just a "right place, right time" success story.
I don't see how you can call Facebook a major "invention" while saying that the company that launched four separate computing revolutions (each leading to multi-billion dollar industries whereas previous contenders had failed miserably) has never invented anything:
1. graphical personal computing revolution: the original Mac
2. portable music player revolution: the iPod
3. smartphone revolution: the iPhone
4. tablet revolution: the iPad
That's why you are struggling to figure out who has invented what. Let's try a thought experiment. Name a single person who has "invented" anything ever. I will trivially show you how that invention was based on a fairly obvious composition of existing technologies. You will say "if it was so obvious, then how come everyone else didn't do it?". Then you'll get sucked into a vortex of irony when you realize that you've just defended Steve Jobs.
Every invention has always involved a composition of existing technologies, and in nearly every case that composition looks obvious in hindsight. The world has long since internalized the notion that "invention" means "combining existing technologies in a way that has value to people". Only on Slashdot do people labor under the delusion that "invention" means "creating a brand new technology out of thin air".
This is why people on Slashdot don't understand patents and don't understand why some companies are very successful despite having not "invented" anything. Basically, the Slashdot community doesn't understand what the word "invention" means.
I take no small amount of delight that bitcoin zealots are painfully repeating the lessons that the rest of the world learned over several centuries; but this time compressed into a couple decades and punctuated by the laughter of everyone else.
I guess it turns out that there are some advantages to government backed [inflationary] currencies with strong banking regulations & government backed insurance on individual accounts. But yeah, no, please continue "sticking it to the establishment" with your 133t cyber currency.
Congratulations for tricking someone into giving you money. Good luck with your impending disaster.
The best thing about BYOD is that it wrests some measure of control away from annoying sysadmins like you who think the only purpose of a computer is to continuously run software updates and anti-virus scans--and of course justify increasing the IT budget (again) and internally billing a couple grand per user per year for a 'managed laptop'. Camp 1 needs to be dragged into the street and shot.
First, stop pretending that everything needs to be kept as secure as the nuclear codes.
For the stuff that needs some security, let people use their own devices and connect through thin-client software (e.g., VNC, Citrix, rdesktop). Keep the stuff that needs to be secure in carefully managed server land, and stop pretending that you can extend the firewall to include managed laptops so long as you continuously run 14 layers of spy software on them.
The primary benefit of caches for HPC applications is *bandwidth filtering*. You can have much higher bandwidth to your cache (TB/s, pretty easily) than you can ever get off-chip--and it is substantially lower power. This requires blocking (tiling) your application so that its working set fits in cache.
He's pulling out quotes from Cray (I used to work there) about how caches just get in the way--and they did, 30 years ago when there were very few HPC applications whose working set could fit in cache. It's a very different world nowadays.
Sometimes skipping college doesn't make you a genius, sometimes it just means you are doomed to repeat 50 years worth of mistakes in a well developed field.
Please explain to me, simply, how you get 10x the compute efficiency of GPUs--these chips are already fairly optimal at general-purpose flops per watt, because they run at low voltage and fill up the die with arithmetic.
GPUs have excellent memory bandwidth to their video RAM (GDDR*), but they have poor I/O latency and bandwidth (PCIe-limited), which is the main reason they don't scale well.
We've heard the VLIW "we just need better compilers" line several times before.
Thus far this sounds like a truly excellent high school science fair project, or a slightly above average college engineering project. It is miles away from passing an industrial smell test.
"Virtual Memory translation and paging are two of the worst decisions in computing history"
"Introduction of hardware managed caching is what I consider 'The beginning of the end'"
These comments betray a fairly childlike understanding of computer architecture.
All failures, which is exactly why Apple is teetering on the edge of bankruptcy right now.
What are you talking about?
Nothing succeeds like the appearance of success. -- Christopher Lasch