I'm still befuddled by what the deal with this picture is. All I can see are the colours you can confirm by opening the image in any editing program and checking the pixels: a light blue and a golden/brownish colour. Now some people are saying you should make some sort of judgement about what colour the dress really is, compensating for the bad exposure, to which I can only say: I don't know. There simply isn't enough information in the picture to make that call. I wonder why this isn't a more common answer; when you can't be sure of something, "I don't know" is a perfectly acceptable answer.
They also killed countless people with nasty practices like bleeding sick patients and prescribing all sorts of other counter-productive remedies, all the while preaching that only God can heal and we are at His mercy. That sort of stubborn attitude, born of religious stupidity, set back possible advancement in medicine by decades. Any good that did come out of it was entirely despite the religious nonsense, not because of it. Not to mention the church burning so-called "witches", who were frequently wise women who actually did help people with their primitive herbal medicines (not that all of their remedies were efficacious either, of course).
The reason you don't hear a ton of interesting news coming out of strong (general) AI research, and why interest in the field is limited, is simple: strong AI is pretty damn useless until it reaches the critical point where it matches (or, really, exceeds) human intelligence. An AI program with the effective intelligence of a worm, mouse, rat or monkey isn't interesting outside of academia.
I suspect that when strong AI comes around it will seem rather sudden to most people, who simply won't see it coming. I doubt it will take long after computers reach the point where they can match the human brain in raw computational power: there is simply too much interest in the field, and honestly, human intelligence is rather unremarkable no matter what some people like to believe.
Get a Kobo, turn off WiFi and drop the ePubs onto the device via USB. That's what I do, and no publisher has any control over it whatsoever. I'm currently reading World Without End this way, which clocks in at nearly 1000 pages; having seen the paper version, I'm damn glad I don't have to read it that way. I'd probably sprain my wrists if I did, hefty as it is.
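Sideloading like this is nothing more than a file copy once the device is mounted as USB mass storage. A minimal sketch in Python, assuming a typical Linux mount point (the `KOBO_MOUNT` path is an assumption; adjust it for your system):

```python
import pathlib
import shutil

# Typical Linux mount point for a Kobo in USB mass-storage mode.
# This path is an assumption -- check where your OS mounts the device.
KOBO_MOUNT = pathlib.Path("/media/user/KOBOeReader")

def sideload(epub_path: str, mount: pathlib.Path = KOBO_MOUNT) -> pathlib.Path:
    """Copy an ePub straight onto the device: no store, no sync."""
    dest = mount / pathlib.Path(epub_path).name
    shutil.copy2(epub_path, dest)  # copy2 preserves timestamps
    return dest
```

On the next disconnect the Kobo rescans its storage and the book simply shows up in the library, with the publisher nowhere in the loop.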
The solution to global warming is simple, and we have the technology to slash carbon emissions right now. All it requires is a willingness to replace all fossil-fuel-fired power plants with nuclear (plus some additional capacity), investment in a more robust power grid, and huge subsidies for electric cars combined with slowly increasing taxes on petrol.
Not only would this solve the global warming issue, it would also cut off the money supply to many of the Islamic terrorist organizations and undermine the power of several pathological governments such as Russia and Iran. In fact, it might even end up being cheaper for the U.S. in the long run than fighting constant wars in the Middle East.
I don't get it. I've been using Win8.1 for about 4 months now, after my trusty old Vista box gave up the ghost, and I haven't used a single Metro app. I've only really seen the Start screen when I accidentally hit the Windows key on my keyboard, and to search for stuff maybe twice. I'm annoyed that they didn't include the option to switch back to the Aero Glass window scheme, but other than that it's fine. To me it operates the same as Windows always has. It is possible that I'm just weird, though, since I never really used the Start Menu much either. All the applications I use are sorted into categories as a toolbar menu on the Taskbar (or pinned to the Taskbar, for the most frequently used ones).
I suppose it really depends on personal preference and what you do with your PC, but personally I find XP terrible in comparison to Vista/7/8. I jumped to Vista almost as soon as it was released and used it for many years, and now I'm on 8.1. The biggest thing for me is the user-mode video drivers. I had countless system lockups on NT4/Win2k/XP due to video card issues, and they were almost entirely eliminated after that change was made to the operating system. As someone who primarily does 3D graphics programming, this is a huge feature.
It baffles me that some people actually believe that those concerned about global warming think that it will cause the end of humanity. It won't. Even if the most catastrophic predictions come true, not only will life on Earth continue on just fine, but human life will also continue. We are a very adaptive species and even in the case of extreme climate change, parts of the Earth will become more hospitable to humans than they are now. It just happens that people concerned about climate change don't think that 20% or more of the human population being wiped out is an acceptable path to take when reasonable alternatives exist.
It is possible for us to reduce CO2 emissions right now, with minimal economic impact, if we are willing. All it would take is a concerted push towards nuclear power generation, coupled with electric vehicles for transport, and we could reduce emissions in short order without destroying any economies. Of course, instead, both Germany and Japan are dialing down their nuclear programs in favour of burning more coal.
Unity has had the ability to create 64-bit executables for a while, but the editor is still a 32-bit program, which can be very limiting if you are developing a large game. A 64-bit editor is scheduled for Unity 5, and is indeed one of the biggest selling points of the new version. There's no release date for Unity 5 yet, though, and I imagine it is at least 6 months out, considering there is at least one more big update to 4.x coming (4.6, which will include the new GUI tools).
And you seem to have missed the part where "running hotter than SandyBridge" applies only to overclocking. Yes, IB is a worse overclocker than SB, but under normal conditions IvyBridge is faster and uses less power than SandyBridge. Remember that overclockers are a tiny portion of the market. IvyBridge isn't the amazing revolutionary chip some people were expecting but it is a successful, evolutionary step forward. Just like most processor generations.
Thing is, for every problem you point out with an AI-driven car, you can point out five problems with human drivers. Humans frequently mess up in hazardous conditions, especially if they aren't used to them, while an AI car is going to be programmed for all possible conditions before it is ever released into the wild. As for something being wrong with the car, that's what sensors are for. We have to rely on imperfect cues like smell; the AI, on the other hand, should be plugged straight into the onboard computer and have an excellent overview of the car's health. It might miss corner cases but, once again, humans will miss many more. Also, humans will often suspect something is wrong and carry on anyway because they can't be bothered to check it out, while the AI can be forced to pull over and demand a fix before carrying on.
There will still be deaths on the road if we switch over to 100% AI-controlled traffic, but I'll be damned if it doesn't drop the road toll to a tenth or less of what it is now. That's a ton of lives saved, plus the added convenience of not having to drive yourself. Of course, convincing people to give up control to a machine is going to be a tough sell.
It might help if Android had some sort of built-in performance metric, similar to the Windows Experience Index, that measures the CPU, GPU, memory and so on and spits out some easy-to-understand numbers the user can compare to the minimum specs listed in the Android store. Something like: the game needs a minimum score of 3, recommended 4; you check your phone, see it only has a 2.2, and skip buying that particular game. No confusing GPU series numbers, memory amounts, CPU MHz/GHz or core counts, just a simple score the user can compare (or that the device compares automatically and then tells the user). As far as I know nothing like that exists yet, but it would be simple to implement and would really solve the problem of varying device capabilities for game developers.
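The comparison itself is trivial. A minimal sketch of the idea, with all the function names, weights and scores being made-up illustrations (the Windows Experience Index similarly rates each subsystem and reports the lowest one as the overall score):

```python
# Hypothetical device-score check. The subsystem names and all the
# numbers below are invented for illustration; nothing like this API
# actually exists in Android.

def device_score(subscores: dict[str, float]) -> float:
    """Overall rating = the lowest subsystem score (weakest link)."""
    return min(subscores.values())

def can_run(score: float, minimum: float, recommended: float) -> str:
    """Compare a device's score against a game's listed requirements."""
    if score < minimum:
        return "skip"         # below minimum: don't buy
    if score < recommended:
        return "playable"     # meets minimum, below recommended
    return "recommended"

phone = device_score({"cpu": 3.1, "gpu": 2.2, "memory": 4.0})
print(can_run(phone, minimum=3.0, recommended=4.0))  # -> skip
```

The weakest-link rule matters here: a phone with a fast CPU but a weak GPU should still fail the check, since games are usually bound by the worst subsystem.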
While marketing likes to throw around things like "2000 cores!!!", GPU SIMD units really aren't cores. A core implies a complete processing unit, including things like a decoder, a memory controller and so on, while the shaders in a GPU are barely more than the SIMD co-processing units found in modern CPUs. Of course, the language is already muddled thanks to things like the new Bulldozer "cores" that share a single FPU between two cores, which has many people calling those 8-core processors quad-cores with Hyperthreading on steroids. Still, calling a GPU shader unit a "core" is a pretty serious abuse of the word.