Q: Who is susceptible to deception? A: Everyone.
Deceivers don't appeal to logic.
I've been using this site for over twenty years, and it's been most of a decade since I've commented. This is the best thing I've seen on here since then. Whatever you do, keep drumming up the fight against ignorance and propaganda, and for the people who've fallen victim to them. I don't want to get personal, but let's just say that I know from intimate experience what brainwashing does to a person, and the tremendous cost of clawing one's way out of it. Division in modern society is inevitable--and we must fight against those who seek to destroy rational thought!--but without empathy for those infected by bad ideas, shortchanged by their personal experiences, we'll end up punishing and alienating the victims of bad actors who exploit cognitive vulnerabilities that every one of us has, and out of sheer self-defense they'll vote in the people who will undo us.
Show me a single biological female who has ever been involved in jetpack development or flying.
Go ahead, move the goalposts. And obviously, who ever heard of Amelia Earhart?
Not the person you're responding to, but I'm pretty sure their less-than-polite phrasing meant "biological female who has ever been involved in jetpack development or *jetpack* flying".
Everyone knows Amelia Earhart was a big part of aviation history in that era, but I strongly suspect that she didn't moonlight as the Rocketeer.
For instance, there's the matter of how to treat trade secrets, which are common in computer code. In many cases, the creator of a work doesn't even have the right to distribute source code that they've purchased a license to (say, a game engine) and have modified, so this is untenable unless you're willing to flatten entire business models.
I suspect on re-reading your comment that you mean the portion of a work that is distributed--in this case the game client. There are fewer issues with that, but licensed assets are still a substantial part of the market for copyrighted works. Perhaps unlimited duplication would be allowed after a lapse period, but derivative works would not be?
It's an interesting thought. It's not going to happen, but something like it is now on my wishlist.
In fact the only mention of the word "ownership" in copyright law is in the paragraph stating all works under copyright are the inheritance of the public to own, once the copyright term has expired.
Actually, with 30 mentions of "owner" and "ownership" in Title 17, Chapter 2 alone, you are dead wrong:
https://www.copyright.gov/titl...
Read all of the laws there; you will find plenty more mentions. And in case you try to backpedal and amend your statement to claim that the term describes only ownership of the copyright itself, and not of the work, you can find the terms "owner of a work" and "ownership of a work" in multiple official documents associated with our government's various copyright bodies:
https://www.federalregister.go...
https://www.copyright.gov/docs...
https://www.copyright.gov/poli...
I'm not saying I agree with US copyright law, but let's get our facts straight. Your conclusions may (or may not) be valid, but that particular argument about the legal wording is so wrong that I have to wonder whether you've even read these laws.
Bonus: Contrary to your main argument, the DMCA *does* in fact prohibit circumvention of copyright protection measures--including many acts undertaken for purely personal use, say, displaying a legitimate copy of a video from a computer by illegally circumventing HDCP or the like. That is absurd, but it's how the law was written, and I doubt it got there by accident.
So, one problem is that there is not always more data. In my field, we have a surplus of some sorts of data, but other data requires hundreds of thousands of hours of human input, and we only have so much of that to go around. Processing all of it is easy enough; getting more is not.
Also, by "effective", I should have made it clear that I meant "an effective overall solution to the problem", which includes all costs of training a wider, lower-precision network. This includes input data collection, storage and processing, all of the custom software to handle this odd floating point format, including FP16-specific test code and documentation, run time server costs and latency, any increased risks introduced by using code paths in training and , etc.
I'm not saying that I don't believe it's possible; I've just seen absolutely no evidence that this is a significant win in most cases, or even in a sizable fraction of them, or that it represents a "best practice" in the field. Our own experiments have shown severe degradation in performance when using these nets w/out a complete retraining, the software engineering costs would be nontrivial, and much of the hardware we're forced to run on doesn't even support this functionality.
As an analogy, when we use integer-based nets and switch between 16-bit and 8-bit integers, we see an unacceptable level of degradation, even though there is a modest speedup and we can use slightly larger nets. I'm very wary of anything with a mantissa much smaller than 16 bits for that reason--those few bits seem to make a significant difference, at least for what we're doing. We're solving a very difficult constrained optimization problem in real time using Markov chains, and if the observational features are lower fidelity, the optimization won't have time to explore the search space effectively before the result must be returned to the rest of the system. It's possible that the issue here is the sensitivity of our optimization algorithm to input quality, rather than the fundamental usefulness of FP16, but I'm still quite skeptical. If this were a slam dunk, I'd expect to see it move through the literature in a wave, like the Restricted Boltzmann Machine did.
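If anyone wants to see the flavor of degradation I mean, here's a toy sketch--synthetic Gaussian weights and a generic uniform quantizer, nothing from our actual system:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=100_000)  # stand-in for trained weights

def quantize(x, bits):
    """Uniform symmetric quantization to a signed integer grid, then back."""
    levels = 2 ** (bits - 1) - 1        # e.g. 32767 for 16-bit, 127 for 8-bit
    scale = np.abs(x).max() / levels    # one scale factor for the whole tensor
    return np.round(x / scale) * scale  # dequantized values

for bits in (16, 8):
    err = np.abs(weights - quantize(weights, bits))
    print(f"int{bits}: mean abs error = {err.mean():.2e}, max = {err.max():.2e}")
```

Each bit you drop roughly doubles the rounding error of a uniform quantizer; whether that matters depends entirely on how sensitive the downstream computation is, which is exactly our situation.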
Oh, and thank you for the link (great reading) and the thoughtful reply. Those aren't always easy to find on niche topics online.
It seems like these systems are exploitative by design, even if exploitation wasn't explicitly the goal. They're built, with every available algorithm and every scrap of data, to maximize labor output at the lowest possible cost. Individual workers operate under extreme information asymmetry, against a system that doesn't negotiate and only offers a take-it-or-leave-it choice.
This is by far the best comment I've ever seen regarding this sort of algorithmic labor management.
Normally I'm all for this sort of thing--my company is a client and uses it to handle large bursts of data processing quickly--but the information asymmetry argument is a powerful one. Also, there doesn't seem to be a lot of competition in this space, which might otherwise ameliorate many of the problems induced by the take-it-or-leave-it bargaining approach.
The analysis in the article is absurd, but yours seems to lead to the inescapable conclusion that some kind of regulation is necessary to prevent blatant exploitation. Maybe just reducing the information asymmetry somehow, or requiring public reports on the website of effective wages paid to workers as a fraction of the minimum and average wages in their respective countries. Surely someone can find an answer to this.
So they're all excited about the lowest-precision, smallest-size floating point math in IEEE 754?
FP16 is good enough for neural nets. Do you really think the output voltage of a biological neuron has 32 bits of precision and range? For any given speed, FP16 lets you run NNs that are wider and deeper, and/or use bigger datasets. That is far more important than the precision of individual operations.
There's a lot of rounding error with FP16. The neural networks I use run on 16-bit integers, which work much, much better, at least for the work I'm doing. Also, do you have a good citation showing that FP16 neural networks are, overall, more effective than FP32 networks, as you've described?
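For anyone wondering what "a lot of rounding error" looks like in practice, here's a throwaway NumPy toy--generic FP16 behavior, nothing specific to any particular net:

```python
import numpy as np

# FP16 has a 10-bit mantissa, so its ULP at magnitude 1024 is 1.0:
# a 0.25 increment simply vanishes.
print(np.float16(1024.0) + np.float16(0.25))  # -> 1024.0

# Naive FP16 accumulation stalls once the running total is large enough
# that the addend falls below half a ULP:
acc = np.float16(0.0)
for _ in range(10_000):
    acc = acc + np.float16(0.1)
print(acc)    # stalls around 256, nowhere near the ~1000 you'd expect

acc32 = np.float32(0.0)
for _ in range(10_000):
    acc32 = acc32 + np.float32(0.1)
print(acc32)  # ~1000, as expected
```

This is why mixed-precision training setups typically keep FP32 accumulators even when weights and activations are stored in FP16.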
Any given program will expand to fill available memory.