
Comment Re:Inscrutable behaviour (Score 1) 409

I would also find it creepy if I frequently posted on topics I barely knew anything about, acting like an expert and making definitive statements that I knew, deep down, were at best guesses.

See, I only post on topics where I know for certain my knowledge is well above average, and I don't make definitive statements that aren't true. I carefully avoid definitive statements when I am guessing, electing instead, with honesty and integrity, to indicate that I am guessing when I do so.

I guess by now you've realized that I have actually looked at some of your past posts. I've identified your pattern: act like an expert even when you know for certain that you aren't one.

Comment Re:ELI5 (Score 1) 81

On the contrary, simulated annealing fell out of common usage because other stochastic search methods are better at solving many problem types.

For instance, the Extended Compact Genetic Algorithm converges much faster, and don't let its name fool you: it's not a genetic algorithm. The name Compact Genetic Algorithm derives not from the technique but from the space it searches, which is exactly equivalent to that of a simple genetic algorithm with a crossover probability of 0.5. The Compact Genetic Algorithm is instead an estimation-of-distribution algorithm, and the Extended version detects and leverages the dependencies between different elements of the solution vector in a theoretically optimal (information-theoretic) way. That gives it an advantage over algorithms that don't (which includes simulated annealing, and is why annealing fell out of favor).
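To make that concrete, here is a minimal Python sketch of the plain Compact Genetic Algorithm on a toy OneMax problem (the toy problem and parameter values are my own illustrative choices, and the Extended version's dependency model is not shown):

import random

def onemax(bits):
    # Toy fitness: number of 1s in the candidate solution.
    return sum(bits)

def cga(n_bits=20, virtual_pop=100, max_iters=20000):
    # The cGA keeps one probability per bit instead of an explicit population.
    p = [0.5] * n_bits
    for _ in range(max_iters):
        a = [1 if random.random() < pi else 0 for pi in p]
        b = [1 if random.random() < pi else 0 for pi in p]
        winner, loser = (a, b) if onemax(a) >= onemax(b) else (b, a)
        for i in range(n_bits):
            # Nudge each differing bit's probability toward the winner.
            if winner[i] != loser[i]:
                step = 1.0 / virtual_pop
                p[i] += step if winner[i] == 1 else -step
                p[i] = min(1.0, max(0.0, p[i]))
        if all(pi in (0.0, 1.0) for pi in p):
            break
    return [1 if pi > 0.5 else 0 for pi in p]

print(cga())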

Annealing is still used for problem sets where there aren't many dependencies within the solution vector.
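For contrast, a minimal simulated-annealing sketch on the same kind of bit-string problem (again an illustrative toy with an arbitrary geometric cooling schedule); note that it perturbs a single solution and carries no model of dependencies between bits:

import math
import random

def anneal(fitness, n_bits=20, t_start=2.0, t_end=0.01, steps=20000):
    x = [random.randint(0, 1) for _ in range(n_bits)]
    fx = fitness(x)
    for step in range(steps):
        # Geometric cooling schedule (a common but arbitrary choice).
        t = t_start * (t_end / t_start) ** (step / steps)
        y = x[:]
        y[random.randrange(n_bits)] ^= 1      # flip one random bit
        fy = fitness(y)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if fy >= fx or random.random() < math.exp((fy - fx) / t):
            x, fx = y, fy
    return x, fx

print(anneal(sum))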

Some of the D-Wave haters have moved on to the argument that the system isn't faster than a conventional one when the conventional one runs a "better" algorithm... see the big paragraph above. "Better" here means it searches a different solution space and therefore cannot solve all the same problems.

Comment Re:ELI5 (Score -1, Flamebait) 81

They downvoted him because his logic is wrong.

Then a bunch of fucks upvoted him because you linked to something and claimed it justified his argument, when it doesn't.

You don't compare benchmarks of different hardware using different algorithms. You compare benchmarks of the same algorithm.

First these fucks said D-Wave wasn't doing any quantum stuff.
Then these fucks said it was slower than conventional hardware.
Now these fucks say it's still slower than conventional hardware if you use a different algorithm that won't solve the same set of problems...

As someone else noted... Google, NASA, etc. must be complete idiots for not bowing to the clearly rational flying goalposts these fucks swing around.

Comment Re:Inscrutable behaviour (Score 1) 409

I thought we were talking about the guy who said Musk's rocket blew up

So long as someone says at least one true thing, it doesn't matter if they then say false things immediately afterward?

Do you also follow this model? If I look into your past posts here, will I see the same pattern? You say one true thing, then lie your ass off afterwards? Is that it?

Comment Re:No return trips? (Score 1) 474

So 200+ ships at billions of dollars each? Yeah, that's gonna happen.

The US spent more than that dropping bombs on the Middle East; no benefit was achieved by the effort, and almost nobody wanted it.

Imagine if all the people on Earth actually wanted something and we could effectively solve the coordination problem (coming soon to a blockchain near you).

Comment Re:Only when it costs them money. (Score 1) 113

There are a few options, but all of them require hijacking IoT devices.

If I were feeling more energetic, I'd pull out some comments I left here a decade ago about a guild of Internet engineers and a trust system where certified operators could send cryptographically signed messages upstream to shut off attacking ports (or requests to do so; that's a local detail).
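For what it's worth, here is a rough sketch of what such a signed shut-off request could look like, using Ed25519 from the pyca/cryptography package; the message fields, addresses, and the idea of a guild-maintained key registry are hypothetical illustrations, not an existing protocol:

import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# A certified operator holds a long-term signing key; upstream networks would
# know the matching public key (distributed by the hypothetical guild).
operator_key = Ed25519PrivateKey.generate()
public_key = operator_key.public_key()

request = json.dumps({
    "action": "disable_port",            # hypothetical request type
    "target": "192.0.2.10",              # attacking host (documentation address)
    "reason": "IoT botnet traffic",
    "issued_at": "2016-10-03T12:00:00Z",
}).encode()

signature = operator_key.sign(request)

# Upstream side: verify before acting; a bad signature raises InvalidSignature.
try:
    public_key.verify(signature, request)
    print("signature valid - request may be acted on")
except InvalidSignature:
    print("signature invalid - ignore request")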

Yes, we're decentralized, and that's good, but we also need to cooperate.

When homeowners get their Internet shut off because their IoT is attacking and they have to call a local tech to diagnose the problem and pull out the offending light bulb before it's turned back on, suddenly everybody will demand secure light bulbs (except us 'luddites' who are still using dumb dishwashers because we know that complexity breaks).

Comment Re:"new"? huh (Score 2) 78

From an optimal information-theory viewpoint:

1000 bits, 168 of which are true (the 168 primes below 1000) and 832 are false.

Given this model of the data, the optimal encoding of the 1's should use 2.573 bits each, while the optimal encoding of the 0's should use 0.265 bits each:

ln(1000/168) / ln(2) = 2.573... bits
ln(1000/832) / ln(2) = 0.265... bits

The space required for the entire sequence:

(168 * ln(1000/168) + 832 * ln(1000/832)) / ln(2) = 653.109... bits

The model would have to be significantly better than this. Improvements to the model must produce a better probability estimate than the raw statistics. The most obvious single improvement to the model is to treat the even numbers as special, where a simple rule can give you 100% model accuracy and thus use 0 bits to encode all the even positions.

To get the entire sequence down to only 10 bits implies a very large computational overhead, because even with the prime probability all the way down to 1/1000, encoding the list would still take:

(1 * ln(1000/1) + 999 * ln(1000/999)) / ln(2) = 11.407... bits.

So I'm thinking the O(N^1/3) doesn't apply to small values of n, but only to enormous values of n.
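For anyone who wants to check the arithmetic, here is a small Python snippet that sieves the primes below 1000 and recomputes the numbers above:

from math import log2

N = 1000
sieve = [True] * N
sieve[0] = sieve[1] = False
for i in range(2, int(N ** 0.5) + 1):
    if sieve[i]:
        for j in range(i * i, N, i):
            sieve[j] = False

primes = sum(sieve)                     # 168 primes below 1000
composites = N - primes                 # 832 non-primes

bits_per_one = log2(N / primes)         # ~2.573 bits per prime position
bits_per_zero = log2(N / composites)    # ~0.265 bits per non-prime position
total = primes * bits_per_one + composites * bits_per_zero    # ~653.1 bits

# The "1 in 1000" floor case from the last formula above:
floor_case = 1 * log2(N / 1) + 999 * log2(N / 999)            # ~11.4 bits

print(primes, round(bits_per_one, 3), round(bits_per_zero, 3),
      round(total, 3), round(floor_case, 3))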
Earth

Study: Earth Is At Its Warmest In 120,000 Years (washingtonpost.com) 183

An anonymous reader quotes a report from The Washington Post: As part of her doctoral dissertation at Stanford University, Carolyn Snyder, now a climate policy official at the U.S. Environmental Protection Agency, created a continuous 2-million-year temperature record, much longer than a previous 22,000-year record. Snyder's temperature reconstruction, published Monday in the journal Nature, doesn't estimate temperature for a single year, but averages 5,000-year time periods going back a couple million years. Snyder based her reconstruction on 61 different sea surface temperature proxies from across the globe, such as ratios between magnesium and calcium, species makeup, and acidity. But the further the study goes back in time, especially beyond half a million years, the fewer of those proxies are available, making the estimates less certain, she said. These are rough estimates with large margins of error, she said. But she also found that the temperature changes correlated well with carbon dioxide levels. Temperatures averaged out over the most recent 5,000 years -- which includes the last 125 years or so of industrial emissions of heat-trapping gases -- are generally warmer than they have been since about 120,000 years ago, Snyder found. And two interglacial time periods, one 120,000 years ago and another just about 2 million years ago, were the warmest Snyder tracked. They were about 3.6 degrees Fahrenheit (2 degrees Celsius) warmer than the current 5,000-year average. Snyder said if climate factors are the same as in the past -- and that's a big if -- Earth is already committed to another 7 degrees Fahrenheit or so (about 4 degrees Celsius) of warming over the next few thousand years. "This is based on what happened in the past," Snyder noted. "In the past it wasn't humans messing with the atmosphere."
