
Comment Re:Yes, he was arrested [Re:That won't last long.. (Score 1) 780

"arrest" -- from the French arrêter -- meaning "to stop."

If you are stopped and prevented from leaving, you are arrested (stopped). If you are stopped/detained against your will without either charges or probable cause, the party that stopped you is guilty of false arrest. The police most definitely stopped this person against their will and detained them. That's an arrest. It was legal because they had probable cause that a crime had been committed, but they chose not to press charges.

If you think this wasn't an arrest, then as a civilian, try putting cuffs on another random innocent civilian against their will and detaining them. You'd likely be charged with false arrest -- same as if the police did the same thing without probable cause.

Comment Re:Is Windows10 a thing? (Score 1) 191

It's the only rational decision for someone who wishes to stay with Microsoft, at least. If one is using Windows 7 or Windows 8/8.1, one may as well move to Windows 10 for free and keep the free support and updates coming. The worst parts of Win 10 are being pushed to the older OSes anyway, and those OSes will likely be required to take more such updates over time.

I enjoy Linux, but the graphics drivers suck -- and the same goes for OS X. On the exact same hardware, Windows wins handily on FPS in games.

I'm slowly making all my Windows machines dual-boot to Linux in the hope that many if not all of them can go to Linux exclusively, but seriously... until AMD and nVidia get their collective butts together to make competent, competitive OpenGL drivers, DirectX is going to mop the floor with Linux/OS X on games and future VR tech. I'm anxiously awaiting Wayland or Mir just to replace the ancient X Window System. I'm starting to think that GNU/Linux might need a complete architectural rewrite just to get decent graphics performance. Even Valve can't get great performance out of its Linux Steam boxes yet -- and its core business is selling and distributing games! I was really hoping Valve/Steam would bring some great tech to Linux to boost things... but alas, no dice yet.

Comment Re:Tried it, couldn't use it (Score 1) 349

There was a fork called GIMPshop that made the UI more like Photoshop. Unfortunately, the author abandoned the project after someone else scooped up the gimpshop.com domain and made money off ads and installer crapware (and donations as well, I think).

Would be nice if someone would fork it again, but there's the rub -- not everyone who cares is a coder, and those who are must be working on more rewarding projects... or have lives or something.

Comment Re:That's not their problem (Score 3, Insightful) 313

That's a heck of a spin on the situation. Google paid to be Firefox's default search engine for 10 years. It released the Chrome browser in 2008, and many wondered why it kept paying Mozilla for default placement when Chrome had the same or higher market share. (The answer: it was still worth it!)

When Google was just a search engine, they were fine paying Mozilla for Google to be Firefox's default search engine.
After Google Chrome's market share far exceeded Firefox's, they had their own solid browser platform to push Google as a default search engine. Their strategy changed. They no longer had to pay to get a wide audience, and the best way to get more browsers with Google as default was to push Google Chrome and crush Firefox. I'm sure they would have given something to be Firefox's default, but not as much as Yahoo was offering -- and likely nowhere near the amount they'd been paying prior to the Yahoo offer either.

Yahoo needed a win to boost their search income, and they got it. It was a large increase for Yahoo but a small loss for Google... and Google is winning Firefox users over to Chrome while helping the remaining Firefox users switch their search back to Google.


It made perfect sense for Google to shrug off the tiny, declining value of Firefox search users: they expected to pick up market share both from those leaving Firefox and from those scampering off the sinking IE ship.

Meanwhile, Mozilla is running out of cash, slashing features on Firefox to save on expenses, and picking up crap like Pocket to survive. It's truly sad that they're likely getting 90% of their revenue from another dying company (Yahoo) while wasting money on developing phones no one asked for. I fear they may not recover from this death spiral. (Over 90% of their revenue in previous years came from Google... and you know that was more money than Yahoo is giving them, because they admit they're slashing expenses and begging for cash.)

Comment Re:Is AMD Better Now? (Score 1) 110

I used to love AMD (even used to own shares in the company at one time)... and their graphics cards always had better specs for the price, but... no. Their drivers are crap.

More importantly, AMD and nVidia typically don't make their own graphics cards -- they just sell the chips and give a reference spec to others. They release regular driver updates against that spec, but caution that your card manufacturer may have better drivers and/or may not meet the spec, so the reference drivers may not work right. Most card manufacturers quit supporting their boards after only a year or so. That leaves you twisting in the wind, hoping someone will release a good driver for your card before they consider it obsolete and stop updating it.

You're really better off with nVidia. They don't care about open-sourcing their drivers the way AMD does, but they do tend to fix bugs and release drivers often.

I have an Asus ROG laptop with nVidia graphics, and I often get monthly "game ready" drivers with specific tweaks for upcoming/popular games. Maybe AMD does something similar, but I was impressed with nVidia being on the ball about such stuff. They even ship recommended graphics settings for popular games.

Comment Re:Talk about drawing a fine line... (Score 5, Insightful) 110

2560 x 1440 is not "low resolution" regardless of how many thousands of dollars you spent on the tech 8 years ago. Age is irrelevant. The third-generation 13" Retina MacBook Pro display is 2560 x 1600 -- a comparable resolution -- and those are being sold THIS YEAR as high-end displays.

Blu-ray is 1920 x 1080.
4K is 3840 x 2160, but 4K hasn't made it out of the showroom yet for TVs or most monitors.

IBM came out with some spiffy T220/T221 LCD monitors way back in 2003 -- about $8,500 each, at 3840 x 2400 -- but that doesn't make 3840 x 2400 an "outdated low resolution" simply because one could buy it 12 years ago.
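
For a rough sense of scale, here's the pixel-count arithmetic for the resolutions above (just arithmetic, nothing vendor-specific); 2560 x 1440 has nearly 80% more pixels than Blu-ray's 1080p:

    # Rough pixel-count comparison of the resolutions mentioned above.
    resolutions = {
        "Blu-ray (1080p)":   (1920, 1080),
        "QHD (2560 x 1440)": (2560, 1440),
        "4K UHD":            (3840, 2160),
        "IBM T220/T221":     (3840, 2400),
    }

    for name, (w, h) in resolutions.items():
        print(f"{name:20s} {w} x {h}  ~{w * h / 1e6:.1f} megapixels")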

Comment Re:Turned absorption (Score 1) 137

Radar systems already have this capability. There's no need to upgrade anything. Any radar system not using multiple simultaneous frequencies -- especially in the UHF range -- isn't a modern system.

I say go ahead and spend a fortune building this new super invisible wonder-woman spyplane with expensive electronic skin... it'll be interesting to see how far it gets before it's detected with current tech. My guess would be about half a mile. Maybe less if we upgrade our tech before the plane is built? Anyone taking bets?

Comment Re:So... (Score 1) 137

While this is largely true, it's not so much that democracy has been destroyed as that the USA's democratic republic has become corrupted.

All governments are susceptible to corruption, and the USA's system, which relies on fewer than 600 elected officials to dictate national law for the other 300 million Americans, just took a while to get this bad. I can't imagine any one person effectively representing half a million people. So if you split those 600 people mostly between two parties and have most of their funding come from either the rich or large special-interest groups, you end up with this mess.

Still, democracy itself is alive and well in other countries -- just not here in the USA, whose Congress was bought and sold so long ago that no one remembers when it last represented the will of the people.

Comment Re:So... (Score 1) 137

Stealth technology, as hinted at in the article, has been susceptible to UHF radar systems since the 1940s. It's always been a gimmick, and it's never worked as advertised, since every country in the world has radar sophisticated enough to pick up stealth planes. At best, it reduces the size of the blip on radar a bit depending on where the plane is in relation to the radar dish, but stealth tech also changes the shape, and thus the capabilities, of the planes -- mostly, it makes them less maneuverable. Documentaries and interviews with the creators of the F-16 and F-18 fighters dismiss stealth as practically useless. This is especially true for "stealth bombers" -- they fly so high and fast that any decent radar can see them coming, because their bottoms are flat and reflect well. If they flew lower to the ground, the stealth shape would reduce their radar footprint, but then they'd be in visual range of other detection methods.

Better stealth materials won't change anything. As soon as this hits the market, new detection methods will render it just as useless as current stealth tech. It's not like loud, hot, fast-moving objects in the sky with hot, smoky trails of spent fuel can't be seen by satellites, thermal imaging, radar, lidar, and other sensors. Even if you could get the radar signature down to the size of a bird, the radar is going to notice your bird is flying at Mach 2 and send an alert.

I read a bit on the active stealth tech. It requires at a minimum a thin skin of electronics tuned to absorb the UHF, but it effectively converts the UHF to heat -- which would make the plane glow in infrared. It also has to survive constant use at supersonic speeds known to peel paint from planes, and the entire coating has to remain serviceable. It's not just a coat of paint; it's tiny electronics all over the skin of the plane, wired into main power. That adds weight and complexity, and sounds like a nightmare for maintenance workers. Perhaps it would work best on small drones, since there's less surface area to maintain, but then again, drones already have a very small radar footprint to begin with. Maybe it would make a great coating for a nuclear missile, where every second undetected might count for retaliation or defense.

Frankly, this falls into the "interesting, but not especially useful" bin for warfare. The future of air battles is likely satellites and drones with no need for stealth. Air-to-ground would be similar.

As for China vs. the USA: they are economic partners, and other than a bit of posturing over Tibet, the South China Sea, etc., they're not going to go to war with one another -- ever. Vietnam and Korea were proxy wars between China and the USA because neither had the guts to assault the other directly. Any imagined battle between the two would be pointless.

Comment Re:"Never" is a very long time (Score 2) 378

I don't know that Mars is the limit for human space exploration, but it likely is our limit for colonizing planetary bodies, given that it's the farthest rocky planet from the Sun capable of being terraformed (within our solar system). The gas giants' moons are small and inhospitable, with lots of radiation. It wouldn't make much sense to set up a permanent base there -- or even on an asteroid, for that matter. Why would any astronaut even want to visit in person when we can send a probe instead?

I don't think using NEVER is hubris. We can never travel faster than the speed of light due to the fundamental laws of the universe. Wormholes and warp drives are fantasy ideas that require exotic matter or control over undiscovered gravitons to work. Exotic matter likely does not exist (it's possible gravitons don't exist either), and there's no known way to focus gravitons since they don't interact via the other fundamental forces, so FTL travel will likely NEVER happen. FTL travel can also create paradoxes, which is why many conclude it's impossible.

So, without FTL travel, it would take many lifetimes to reach another habitable planet to terraform. Even assuming we had the technology, why would the human race choose to endure several lifetimes on a ship with scarce resources and constant peril from radiation and the risk of destruction, only to hurtle toward a destination that could also end in disaster? I imagine if our Sun went red giant and Mars were no longer hospitable, we might decide to move on. Perhaps by then we could simply seed a new planet with stored genetic material and grow new humans at the destination after terraforming, instead of sending live humans on the journey. Any living human would be unlikely to survive the trip anyway -- even if we advanced cryonic suspension / hypersleep substantially.
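
To put a rough number on "many lifetimes": here's a back-of-the-envelope travel-time calculation to the nearest star. The cruise speeds are illustrative assumptions, not projections of any real propulsion tech.

    # Back-of-the-envelope interstellar travel times (illustrative assumptions only).
    LIGHT_YEAR_KM = 9.4607e12        # kilometres in one light year
    SECONDS_PER_YEAR = 3.156e7

    distance_km = 4.24 * LIGHT_YEAR_KM   # Proxima Centauri, the nearest star

    # Assumed cruise speeds: roughly Voyager-class, and an optimistic 1% of light speed.
    for label, speed_km_s in [("~17 km/s (Voyager-class)", 17.0),
                              ("1% of light speed", 2998.0)]:
        years = distance_km / speed_km_s / SECONDS_PER_YEAR
        print(f"{label:26s} -> roughly {years:,.0f} years")

Even at an (unrealistic) 1% of light speed, that's several centuries one way.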

I think it's hubris to assume "almost anything is possible given another 2000 years" -- wow. Talk about faith in what mankind can do. I mean, we understand so much more about the universe now than 200 years ago, but that's the problem. We discovered the laws of mechanical motion and electricity/magnetism and exploited them to the fullest. We don't have any new laws or forces left to exploit. We only need to figure out dark matter, dark energy, and a unified field theory (assuming one exists... possibly through string theory) and we're done. No more magic to discover. No more undiscovered laws of nature to exploit for future technologies. Our last big life-changing discovery (other than the Higgs boson and metamaterials) was superconductivity, over 100 years ago.

We still have a lot to learn, no doubt -- especially in biology and nanotech -- but no new fundamentals to discover.

Have a look at when some of our "modern" tech came about. Today, we're mostly miniaturizing, combining, and refining tech that we invented many decades if not centuries ago. Until some new fundamental force pops out of the LHC for physicists to exploit, there will be no FTL drives. Unfortunately, the Standard Model of particle physics doesn't lend itself to there being any other forces, and the Higgs boson was the last missing piece of the puzzle, save perhaps for the graviton.

Steam Locomotive - 1804
Telephones - 1876
Incandescent Light Bulbs - 1879
Automobiles - 1885
X-ray machine - 1895
Airplanes - 1903
Television - 1925
Computers - 1822 mechanical, 1946 electronic
Microwave Oven - 1946
LEDs - 1962
Saturn V rocket (moon launch) - 1969
MRI - 1977
Internet - 1982 (with research starting as early as 1960 and various government networks used internally before that)

Comment Re:Cern isn't even right about the higgs's boson (Score 2) 152

All science is based on statistics, you anonymous moron. There is always uncertainty in experiments and measurements because one can never be certain about anything - even the instruments used to measure observations have inherent uncertainties. Is the ruler you're using precise down to the atomic level? No! Can you be certain your instruments are perfectly calibrated?!?!? No!

The Higgs was detected at roughly six-sigma significance, which is more certain than the manufacturing tolerances of most things you can purchase, and well beyond the five-sigma standard physicists use to declare a discovery. If you're 99.9999998% certain (which is what six sigma works out to), there is literally only a 0.0000002% chance the result was a statistical fluke. No one is going to bother testing beyond that, because the possibility of an error is so small it may as well be non-existent.
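
For anyone who wants to check the arithmetic: the sigma-to-probability conversion is just the tail area of a normal distribution. A minimal sketch using SciPy, two-tailed, which is the convention under which six sigma works out to the 99.9999998% figure above:

    from scipy.stats import norm

    # Chance of a fluctuation at least this many sigma from the mean,
    # under a normal distribution (two-tailed).
    for sigma in (3, 5, 6):
        p = 2 * norm.sf(sigma)   # survival function = upper-tail area
        print(f"{sigma} sigma: {100 * (1 - p):.7f}% confidence, "
              f"{p:.1e} chance of a fluke")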

Comment Re:Might want to take your head out of the sand (Score 5, Informative) 735

If you really want to understand things, you have to understand what you're reading.

The IPCC never said that global warming had paused -- it merely said that surface temperatures were increasing at a slower rate than expected over about a decade. The general trend was still upwards, and the decade where it trended slightly less steeply was interesting and unexpected, but it still fits the overall trendline of the previous decades quite well given the variation in sampling. If you're reading that trend as flat, there is something wrong with your eyes... or at the very least something wrong with the software you're using to plot a trendline -- even if you only plot the data from the period mentioned by the IPCC.


"The Pause was an idea from a 2013 UN report by the Intergovernmental Panel on Climate Change (IPCC) that concluded the upward global surface temperature trend from 1998 to 2012 was markedly lower than the trend from 1951 to 2012."

It is beyond ridiculous to imply the temperature change was flat for decades given any real data. It may even be premature to describe the warming as slowing without more data points to corroborate that it wasn't merely an anomaly -- likely brought about by unusual El Niño, La Niña, and other weather patterns that run on multi-year cycles.
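
If you want to sanity-check the trendline claim yourself, fitting a least-squares line over the full record versus the cherry-picked window is only a few lines. The series below is illustrative dummy data, not a real temperature record; swap in actual annual anomalies to reproduce the comparison.

    import numpy as np

    # Illustrative only: replace `anomaly` with a real annual temperature-anomaly series.
    years = np.arange(1951, 2013)
    rng = np.random.default_rng(0)
    anomaly = 0.012 * (years - 1951) + rng.normal(0.0, 0.08, years.size)

    def trend_per_decade(y0, y1):
        """Least-squares slope over [y0, y1], in degrees per decade."""
        mask = (years >= y0) & (years <= y1)
        return 10 * np.polyfit(years[mask], anomaly[mask], 1)[0]

    print("1951-2012 trend:", round(trend_per_decade(1951, 2012), 3), "deg/decade")
    print("1998-2012 trend:", round(trend_per_decade(1998, 2012), 3), "deg/decade")

A short, noisy window can easily show a slope well off the long-term trend without the underlying trend having changed at all.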

NOAA investigated this pause/slowdown and used blind studies and multiple statistical methods to show that the cherry-picked period is well within statistical noise and that the supposed slowdown or pause is bunk.


Comment Re:Why did they buy based on "cores"? (Score 1) 311

Cores mattered when most systems had only 1 core and upgrading to systems with 2 cores or 2 cores with hyperthreading (4 virtual cores) made a huge difference in performance. That was a long time ago, though. The multiple cores meant less latency -- especially for multimedia tasks as the CPU didn't have to stop what it was doing to schedule time for *random background service* or other apps.

Nowadays, it's as you say -- marketing-droid speak. CPU makers hit a wall with MHz and needed a new marketing ploy, so they went with cores. Now even Intel is shying away from that and just going with the cryptic Core i3/i5/i7 nonsense -- barely differentiating between the different generations of those.

The number of cores beyond 4 doesn't really matter unless you're doing virtualization, heavy multimedia processing, or running tons of CPU-intensive programs and services simultaneously. I can't imagine a 16- or 32-core machine offering any improvement for my daily usage -- at least not unless a killer app like realistic AI becomes the norm.
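
A rough Amdahl's-law sketch shows why the payoff flattens: unless most of your workload can actually run in parallel, extra cores stop helping quickly. The parallel fractions below are assumed values for illustration.

    # Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel fraction.
    def speedup(parallel_fraction, cores):
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    # Assumed workloads: a typical desktop mix (~50% parallel) vs. heavy encoding/VMs (~95%).
    for p in (0.50, 0.95):
        row = ", ".join(f"{n} cores: {speedup(p, n):.1f}x" for n in (2, 4, 8, 16, 32))
        print(f"parallel fraction {p:.0%} -> {row}")

At 50% parallel, even 32 cores buys you less than a 2x speedup over one core.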

Comment Re:Pretty Laughable (Score 3, Interesting) 311

I agree. Also, if I understand correctly, that shared FPU can be used either by one CPU core as a 256-bit FPU, or by both cores simultaneously as two independent 128-bit FPUs.

So, shared, but not likely to be a bottleneck.
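
A toy illustration of what that means in terms of data per operation -- eight single-precision floats fill one 256-bit register, or two 128-bit halves. This is just arithmetic on register widths, not a model of the actual module's scheduler.

    import numpy as np

    # Eight 32-bit floats = 256 bits of data, or two 128-bit halves of four floats each.
    data = np.arange(8, dtype=np.float32)

    full_width = data * 2.0                       # one 256-bit-wide operation's worth of work
    halves = np.concatenate([data[:4] * 2.0,      # the same work split into two 128-bit chunks,
                             data[4:] * 2.0])     # one per core sharing the FPU

    assert np.array_equal(full_width, halves)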

Also, since when do cores have to include all the "extras"? I recall when 486s had a math co-processor and there were no MMX instructions or other such multimedia or physics instruction sets. This guy is going to have a really tough time explaining how exactly AMD's architecture doesn't provide the number of cores listed -- even if the architecture has its limitations due to shared resources.
