
Comment Re:Step one in seizing power, control information. (Score 1) 372

Lock down information sources such as the news media. Ensure that all information which is released is fully vetted to support government policies and decrees.

Once information is fully controlled, police activity to enforce government policy can proceed unabated with little fear of meeting organized resistance. President Trump appears to have learned quite well from history.

If by 'learned quite well from history' you include the last 8 years, then you're making a reasonable point. Obama spent 8 years weaponizing the federal government, and then handed it over to Trump. Think about that the next time your champion is elected.

Comment A BS Narrative? Rhodes is getting kicked out of WH (Score 2) 432

Maybe it costs $1.6 billion to build a new factory in Mexico, and $700 million to modernize an existing plant in the United States. Under the rules they previously thought would be in place, they would have recouped the $900 million difference. Trump's plan is to incentivize building in the US and disincentivize building elsewhere, and this changes the risks and calculations associated with the project.
So I wouldn't say the 'narrative is clearly not true.' With Gruber, Rhodes, and Clinton continuously lying to the American public, I can see where you'd get the idea that a 'narrative' would be pushed regardless of the facts on the ground, but please consider that not everyone operates that way.

Comment Scientists are not the ubermensch (Score 2) 371

This. If scientists discovered that [problem X] was no longer a major concern, they would devote their attention to something else.

But oh no, major conspiracy, scientists have vested interests in maintaining a lie for the sake of their careers. BULLSHIT. Scientists are very much interested in the truth. They are trained to seek it, uncover it, present it, and call their colleagues on any attempts to hide it.

The problem is that scientists discover things that are very uncomfortable for certain interests who have lots of money at stake. And those interests spend their money on attempting to discredit what scientists discover.

Scientists are people too, with the same egos, prejudices, fears, and irrational beliefs the rest of us have. Ideally, through honest application of their work, they can filter out these human elements and present to the rest of us objective facts. However, I think any of us who are widely read and have been paying attention know that there is quite a lot of 'standard' human behavior that occurs in scientific circles.

So, perhaps they are trained as you say, but one cannot claim they act as they are trained in a fully consistent manner. So no, scientists aren't some breed of ultra-rational super humans. Stop pretending someone is above suspicion just because they claim the title 'scientist.'

Comment Mad duck Obama (Score 1) 821

It's just another example of Obama stirring up as much crap as he can in his final days in office; both to screw things up for Trump, and to implement some of his ideas that are deeply unpopular.
A man of honor and dignity would be a much more modest caretaker of government business during the final weeks of his tenure. Instead, Obama is trying to start fights with Russia, has orchestrated a UN backstab of a traditional US ally, and is spewing out regulations that won't survive their first challenge in court. This is what we elected. Twice. This is the man he's always been. If it wasn't for the sycophantic media, it would have been clear to most Americans by 2012.

Comment Re:FP16 isn't even meant for computation (Score 1) 55

So, one problem is that there is not always more data. In my field, we have a surplus of some sorts of data, but other data requires hundreds of thousands of hours of human input, and we only have so much of that to go around. Processing all of that is easy enough, getting more is not.

Also, by "effective", I should have made it clear that I meant "an effective overall solution to the problem", which includes all costs of training a wider, lower-precision network: input data collection, storage, and processing; all of the custom software needed to handle this odd floating-point format, including FP16-specific test code and documentation; runtime server costs and latency; any increased risks introduced by exercising less-tested code paths in training and inference; etc.

I'm not saying that I don't believe it's possible, I've just seen absolutely no evidence that this is a significant win in most or even a sizable fraction of cases, or that it represents a "best practice" in the field. Our own experiments have shown a severe degradation in performance when using these nets w/out a complete retraining, the software engineering costs will be nontrivial, and much of the hardware we are forced to run on does not even support this functionality.

As an analogy, when we use integer-based nets and switch between 16-bit and 8-bit integers, we see an unacceptable level of degradation, even though there is a modest speedup and we can use slightly larger neural nets. I'm very wary of anything with a mantissa much smaller than 16 bits for that reason--those few bits seem to make a significant difference, at least for what we're doing. We're solving a very difficult constrained optimization problem using Markov chains in real time, and if the observational features are lower fidelity, the optimization search will run out of time to explore the search space effectively before the result is returned to the rest of the system. It's possible that the sensitivity of our optimization algorithm to input quality is the issue here, not the fundamental usefulness of FP16, but I'm still quite skeptical. If this were a "slam dunk", I'd expect to see it move through the literature in a wave like the Restricted Boltzmann Machine did.
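For anyone who wants to see the step-size effect for themselves, here's a quick NumPy sketch (random stand-in data, not our actual features) that round-trips the same values through 16-bit and 8-bit integer grids and compares the mean rounding error:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical feature vector standing in for observational features.
x = rng.standard_normal(10_000).astype(np.float32)

def mean_quant_error(x, bits):
    """Round-trip x through a signed integer grid of the given bit width."""
    scale = (2 ** (bits - 1) - 1) / np.abs(x).max()
    q = np.round(x * scale)
    return float(np.abs(q / scale - x).mean())

err16 = mean_quant_error(x, 16)
err8 = mean_quant_error(x, 8)
print(err8 / err16)  # roughly 2**8: every bit dropped doubles the rounding error
```

Whether that error is tolerable obviously depends on how sensitive the downstream optimization is, which is exactly the open question here.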

Oh, and thank you for the link (great reading) and the thoughtful reply. Not always easy to find on niche topics online.

Comment Re:Exploitative by design? (Score 1) 153

It seems like these systems are exploitative by design, even if exploitation wasn't explicitly the goal. They're designed with every possible algorithm and available data to maximize labor output at the lowest possible cost. Individual workers are operating at extreme information asymmetry and against a system which does not negotiate and only offers a take it or leave it choice.

This is by far the best comment I've ever seen regarding this sort of algorithmic labor management.

Normally I'm all for this sort of thing--my company is a client and uses it to handle large bursts of data processing quickly--but the information asymmetry argument is a powerful one. Also, there doesn't seem to be much competition in this space, which might otherwise ameliorate a lot of the problems induced by the "take it or leave it" bargaining approach.

The analysis provided by the article is absurd, but yours seems to lead to the inescapable conclusion that some kind of regulation is necessary to prevent blatant exploitation. Maybe just reducing the information asymmetry in some way, or requiring the site to publish public reports on the effective wages paid to workers as a fraction of the minimum and average wages in their respective countries. Surely someone can find an answer to this.

Comment Re:FP16 isn't even meant for computation (Score 1) 55

Accidentally posted as anonymous coward, reposting under my actual name.

So they're all excited about the lowest-precision, smallest-size floating point math in IEEE 754?

FP16 is good enough for neural nets. Do you really think the output voltage of a biological neuron has 32 bits of precision and range? For any given speed, FP16 allows you to run NNs that are wider and deeper, and/or to use bigger datasets. That is way more important than the precision of individual operations.

There's a lot of rounding error with FP16. The neural networks I use run on 16-bit integers, which work much, much better, at least for the work I'm doing. Also, do you have a good citation that FP16 neural networks are, overall, more effective than FP32 networks, as you've described?
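To put a number on that rounding error: float16 keeps only a 10-bit mantissa, so its grid is about 8000x coarser than float32's. A quick NumPy check (nothing specific to any particular net) shows both the unit roundoff and the point where small additions start vanishing entirely:

```python
import numpy as np

# Unit roundoff: fp16 keeps roughly 3 decimal digits, fp32 roughly 7.
print(np.finfo(np.float16).eps)  # 2**-10, i.e. about 0.000977
print(np.finfo(np.float32).eps)  # 2**-23, i.e. about 1.19e-07

# Above 2048 the fp16 grid spacing exceeds 1, so small updates are lost:
acc = np.float16(2048) + np.float16(1)
print(acc)  # still 2048.0 -- the +1 rounds away entirely
```

That last effect is why naive fp16 accumulation (gradient sums, running statistics) tends to go wrong long before individual weights lose meaningful precision.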

Comment Urgh...I hated that book. (Score 1) 227

I don't know how Heinlein gets so much credit for this book... it was a rambling, shambolic pulp thing with sex and politics wedged in at every opportunity in a vain attempt to perk it up a bit. It's not a book that has "stood the test of time" at all. If there's money for classic SciFi, we need someone to get off their butts and make "Ringworld". It's time.
