With a stable currency you're unlikely to lose money that way
I agree with some of what you're saying, but I think the biggest thing holding back Linux on the Desktop is all the duplication of development effort carried out across the hundreds of various distributions. If we want Linux to be a competitor, we need to stop forking. There's clearly enough development taking place to support a competitive operating system, but we're spreading that effort too thin by trying to maintain and improve a dozen window managers, two dozen email clients, a hundred music players, a few dozen package managers, etc. I get it, everyone thinks they have a better idea, and 1 in 10 of those actually is better. But just think about how great an OS we could have at this point if all of that effort had been put into perfecting just one or two products. If we limited ourselves to one or two distros, it would be easier to put all of that development work into creating a standard set of GUI tools for configuration settings so that the user doesn't have to open a terminal to change something. If we focused our collective efforts on one set of PDF viewers / music players / IM clients, etc., we could pull together an OS that is as polished as OS X.
I install Linux on a "desktop" every 2 years or so just to see how things have improved and every time I do that, even with the same distro, there have been numerous changes but very few improvements. There's a new picture manager app, or a new default email client, but the feature sets haven't improved much. If we can get to that level of polish and consistency, then we'll open that door to more market penetration which will then hopefully lead to solutions for the other 2 problems:
1) Drivers (mentioned already on this page)
2) Commercial application support (also already mentioned)
Both of those are important, but their solution becomes easier if we can create a standard linux distribution that they can focus their efforts around.
Agreed. There are also OpenMP implementations for doing your parallel processing. If you're running on a Xeon processor then I would SERIOUSLY consider Intel's Linux Fortran compiler, as it will provide the best performance by far.
I understand that, I'm merely adding that the USB standard does not alleviate 'cable headaches'. I still have to maintain a stockpile of USB cables to support USB devices despite the existence of a standard. Now if Apple devices required a proprietary connector on the computer or charger side I could see people getting more up in arms about this, but they don't, it's still USB-A. Having to bring that cable with me isn't really inconvenient since I would just be bringing a micro-USB cable with me instead if I had an Android phone. It's still a cable. And I see how you could say that you're more likely to be able to find someone with a spare micro-USB cable for you to borrow while on the road, but why on earth would you rely on that and not bring one? Secondly, with nearly 1 in 2 smartphones sold in the US being an Apple iPhone, I'm probably going to be able to find a cable somewhere if it came down to that. (Yes, I realize that isn't representative of global market share).
While I agree that the point of USB was to remove hassle, I think they failed monumentally at it. I have a ton of USB cables around here, and you know why? Because they offered a variety of USB port sizes, for what purpose I'm not sure. Type A, Type B, mini-A, mini-B, micro-A, micro-B, and now the USB 3.0 plugs. Compound that with female and male variants (yes, I have some NAS drives that have male ports for some unknown reason). So now, just to support USB, I have to keep 3-6 cables lying around. So is USB really the ideal solution to all of our device connectivity woes?
Why would it be "silly"? If the point of benchmarking is to compare "like" things, and the same game is written for both ecosystems, why wouldn't the concerned consumer want to know that game X runs 20% faster on device Y, regardless of whether device Y is android or iOS? The only people concerned with these benchmarks must be looking for that 5% difference. So if that's what they want, then knowing that another platform gets them that 5% should be just as important as knowing the performance spread among devices of the same OS.
I completely agree. They are late; they've been 2-4 years behind the curve for the last half decade at least. But I don't believe they are beyond saving. With Windows XP support ending, many businesses will be forced to upgrade to Windows 7/8, which should produce a healthy revenue stream for Microsoft. You're right that they need to focus intelligently on a do-all mobile device (like the Ubuntu Edge smartphone). Something that allows today's average enterprise worker to dock and work effectively at work, then jump on the train or go to a meeting and be productive, all on one device. A failed tablet doesn't spell the end; it took Google many years for Android to take off in the tablet space and they're still gaining momentum. If it weren't for how inexpensive Android phones were, they'd probably still be playing catchup as well. The Surface came out too late and at too high of a cost to compete in the market space, especially when you consider the lackluster app ecosystem that was backing it. A $600 tablet that has a lower resolution and slower processor than my phone just can't cut it in this market. Here's to hoping round two is more impressive, 'cause I love good competition.
The rules are changing.
Check the news! http://www.engadget.com/2013/08/09/xbox-one-home-gold/ One Gold account per Xbox One will allow everyone to access the services.
Microsoft announced today that you'll only need one Gold account per device (and it can actually be shared between a 360 and an Xbox One).
I get it, you're pissed. You (the general population posting in these forums) hate Microsoft, and this is a chance to try to get others to rally behind you. You claim that this is the feature/policy that broke the camel's back and that now you definitely will not be buying an Xbox ever again. To you, charging for video streaming is just one more way that "the man" is trying to stick it to you. Last time it was Netflix, those bastards.
I tend to approach it from the other perspective. For the last 6 years I've been getting a great online experience. A reliable multiplayer utopia where I can have persistent chat rooms independent of what activity my friends are currently engaged in (PS3? No), access to countless media streaming services like Netflix, HBO Go, Xfinity, VEVO, Syfy, ESPN, MLB.tv, etc., and it all costs me about $3 / mo (I don't know why people would pay full retail, which works out to $5 / mo, when the memberships are regularly on sale from Newegg and the like for ~$37 online). Outrageous, right? Well I don't think so. I think that's a hell of a deal for what I get. The PS3 fans are right in stating that they can use their consoles without PS+ to do this stuff, but I know they're lying through their teeth OR they just don't know any better 'cause they've never tried XBLG. The PS chat system is HORRIBLE, and you have what, 4 or 5 video streaming services and no audio services outside of Sony's own personal offerings? With channels like VEVO on the Xbox I have 24 hours a day of music video streaming, on demand, any artist I want, my own personal MTV. Prefer music in the background? Fire up Last.fm. Video rentals? Got those too, from more sources than the PS3 can touch.
So while you see this as an affront to your console gaming experience, I see it as one MORE feature that my $3 / mo was getting me. Now I can stream video of me getting tea-bagged to all my friends, damn life is sweet.
Three layers of security don't really relate to pressure at all. Temperature is much more critical as it relates to fuel integrity. Fuel is encased in a zirconium alloy (Zircaloy, not zinc), and during emergency situations the main objective is to keep the temperature of the fuel below the melting temperature of that alloy so that it remains contained. People should not be "scared"; they should be educated. Secondly, we don't need thorium reactors to increase safety. The current generation of plants being designed and approved have many passive safety features, and there are many more coming to market over the next decade which are entirely passive yet still based on uranium fuel cycles. I'd love to see fusion technology as much as anyone else, but as a commercial technology we're still a couple decades off.
Not true; actually, a good portion of the domestic (US) nuclear fuel comes from both domestic and foreign (Russian) weapon-dismantling programs.
Politics makes nuclear fission expensive, not the technology. The technology is well understood, the fuel is abundant and inexpensive. The problem is that 1) the industry is so over-regulated due to public fear of catastrophe that the plants have 3+ layers of safety and redundancy at every level which is expensive and 2) the fear of terrorists obtaining weapons-grade nuclear material is considered to be high enough that we throw away a LOT of energy rich fuel to avoid getting in the situation where that fuel can be used to make a bomb.
Wikipedia to the rescue: http://en.wikipedia.org/wiki/Cattle_grid