
Comment Re: It's because AMD quit (Score 1) 240

I see it as pretty obvious that x86/x86-64 CPUs stalled out because Intel decided to milk the market in the absence of competition from AMD. Look at Intel's offerings in the Xeon E7 family for an example of exactly why I say this is obvious. It's not that chips far better than what is currently offered to average consumers don't exist; they do, they're just priced outrageously.

If AMD delivers with Ryzen and offers something with good IPC and lots of cores at half the price of Intel, then perhaps Intel will lower the prices on some of its low-end chips.

My guess is that what will really force Intel to finally innovate a bit will be pressure from ARM. Hardware H.264 and H.265 video decoding and H.264 video encoding have been standard in ARM chips for years. Intel only recently got H.264 decoding; they don't have H.265 decoding, and they don't have any hardware video encoding. I could go on, but my point is that Intel has fallen way behind because they figured they didn't have competition.

Comment Re: No SATA and no RAM expandability (Score 1) 240

SATA could be fixed. The CPU doesn't support SATA, but it does have limited PCI Express, so boards could just add a controller chip. The RAM limit can't be solved that easily: most ARM chips are made with mobile phones in mind and are generally limited to something like 4GB; the most I've seen is 6GB. Motherboard makers just solder on the maximum amount supported.

I do love that these small boards have neat things that desktop computers just don't have, like 4K camera support and hardware H.264 video encoding. How many laptops are sold with anything beyond a garbage 720p camera? You could probably count them on one hand, if they even exist.

Comment Re: ARM Processors coming to Desktops? (Score 2) 240

Yes, they are absolutely coming to... servers and laptops, and eventually desktops. Remember, all we need is the right major crisis and the nations will accept the GNU World Order (many think David Rockefeller said "New World Order", but GNU is actually pronounced "new").

Today we have Android-x86, an Android distribution for x86/x86-64 computers. I have an older laptop that I put Fedora and Android-x86 on in dual-boot, which let me do something interesting: benchmark Android on that laptop using the same benchmarking software you'd use on any Android device. Guess what: that AMD E1-6010 CPU is weaker than the one in my current cellphone.

Many people will naturally protest that running win32 software on ARM will be painfully slow. While this is true, it's also irrelevant for most people. You don't need win32 to browse websites or post on SpyBook.

Comment Re: The most important step IS backwards (Score 1) 107

..and it's always been _the_ most important part of Windows. Backward compatibility, which lets all existing programs keep working, is exactly why Windows remains widely used today. If Windows 10 only allowed the "universal applications" made for it, it would probably have close to zero users and installations.

Windows RT ran on ARM, and it was an epic failure because you couldn't run software made for Windows XP, Windows 7, or other older Windows platforms on it. Now Microsoft is trying again, this time allowing you to run win32 software on the ARM version of their OS. It probably won't be a big success, but it does have a chance this time around.

Comment That's a worry! (Score 5, Insightful) 30

So a precisely tuned flicker frequency (40Hz in mice) does great things for brain function and maintenance. What deleterious effects, then, do things like CRT monitors and mains-powered fluorescent/LED lighting have on our brains, given that they're operating "out of sync" with our gamma waves?

Could it be that the increase in dementia/Alzheimer's is related to our exposure to such off-frequency flickering on a very wide scale, thanks to modern technology?

Comment Re:trump never said that (Score 3, Interesting) 600

Preface: I voted against Trump.

In the first clip, I notice that Trump refers to borders and walls, suggesting his mind is in the context of immigration from the south. That would mean his comments about databases refer to immigration in general. Islam isn't referenced until late in the clip, and then by the interviewer rather than by Trump. My conclusion: Trump and the interviewer are talking about two different things. It's unclear whether the interviewer intended that to happen. It's also unclear whether the part of the interview before the clip we see would've established a Muslim context.

In the second clip, Trump seems to try to avoid the question. I can interpret that as him being evasive, or as him being annoyed at the question. Being annoyed would be understandable if Trump has not in fact proposed a Muslim database, and I haven't seen evidence that he has. A smarter politician would've taken the opportunity to say "Muslim database? That's a horrible idea and I'm against it! Now, an immigration database would be handy to have in the unlikely event Canada invades...", but I don't think Trump is very smart (see my preface).

Comment Re:FP16 isn't even meant for computation (Score 1) 55

So, one problem is that there is not always more data. In my field, we have a surplus of some sorts of data, but other data requires hundreds of thousands of hours of human input, and we only have so much of that to go around. Processing all of it is easy enough; getting more is not.

Also, by "effective", I should have made it clear that I meant "an effective overall solution to the problem", which includes all the costs of training a wider, lower-precision network: input data collection, storage, and processing; all the custom software to handle this odd floating-point format, including FP16-specific test code and documentation; run-time server costs and latency; any increased risk introduced by using different code paths in training and in production; and so on.

I'm not saying I don't believe it's possible; I've just seen absolutely no evidence that this is a significant win in most, or even a sizable fraction, of cases, or that it represents a "best practice" in the field. Our own experiments have shown severe degradation in performance when using these nets without a complete retraining, the software engineering costs will be nontrivial, and much of the hardware we are forced to run on does not even support this functionality.

As an analogy, when we use integer-based nets and switch between 16-bit and 8-bit integers, we see an unacceptable level of degradation, even though there is a modest speedup and we can use slightly larger neural nets. I'm very wary of anything with a mantissa much smaller than 16 bits for that reason; those few bits seem to make a significant difference, at least for what we're doing. We're solving a very difficult constrained optimization problem using Markov chains in real time, and if the observational features are lower fidelity, the optimization search runs out of time to explore the search space effectively before the result has to be returned to the rest of the system. It's possible that the sensitivity of our optimization algorithm to input quality is the issue here, rather than the fundamental usefulness of FP16, but I'm still quite skeptical. If this were a "slam dunk", I'd expect to see it move through the literature in a wave, the way the Restricted Boltzmann Machine did.
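To make the mantissa point concrete, here is a minimal NumPy sketch of the kind of drift I mean. This is not our production system; the layer sizes and the per-tensor int8 scaling scheme are made up purely for illustration. It runs one dense layer in float32, in float16 (roughly 10 mantissa bits instead of float32's 23), and in crude int8, then prints the relative error against the float32 reference:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal((64, 512)).astype(np.float32)   # a batch of input features
    w = rng.standard_normal((512, 256)).astype(np.float32)  # one dense layer's weights

    ref = x @ w  # float32 reference output

    # float16: roughly 10 mantissa bits instead of float32's 23
    out_fp16 = (x.astype(np.float16) @ w.astype(np.float16)).astype(np.float32)

    # crude symmetric per-tensor int8 quantization, accumulating in int32
    def quantize_int8(a):
        scale = np.abs(a).max() / 127.0
        return np.clip(np.round(a / scale), -127, 127).astype(np.int8), scale

    xq, xs = quantize_int8(x)
    wq, ws = quantize_int8(w)
    out_int8 = (xq.astype(np.int32) @ wq.astype(np.int32)).astype(np.float32) * (xs * ws)

    def rel_err(out):
        return np.linalg.norm(out - ref) / np.linalg.norm(ref)

    print(f"float16 relative error: {rel_err(out_fp16):.2e}")
    print(f"int8    relative error: {rel_err(out_int8):.2e}")

In a deep net those per-layer errors compound, which is consistent with the kind of degradation we see when we drop precision without a full retraining.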

Oh, and thank you for the link (great reading) and the thoughtful reply. That's not always easy to find on niche topics online.

Comment Re:Exploitative by design? (Score 1) 153

It seems like these systems are exploitative by design, even if exploitation wasn't explicitly the goal. They're designed with every available algorithm and data source to maximize labor output at the lowest possible cost. Individual workers operate under extreme information asymmetry, against a system that does not negotiate and only offers a take-it-or-leave-it choice.

This is by far the best comment I've ever seen regarding this sort of algorithmic labor management.

Normally I'm all for this sort of thing (my company is a client and uses it to handle large bursts of data processing quickly), but the information asymmetry argument is a powerful one. Also, there doesn't seem to be a lot of competition in this space, which might otherwise ameliorate a lot of the problems induced by the "take it or leave it" bargaining approach.

The analysis provided by the article is absurd, but yours seems to lead to the inescapable conclusion that some kind of regulation is necessary to prevent blatant exploitation. Maybe just reducing the information asymmetry in some way, or requiring the site to publish public reports on the effective wages paid to workers as a fraction of the minimum and average wages in their respective countries. Surely someone can find an answer to this.

Comment Re:Not surprising (Score 1) 255

How much of all this is just misanthropy?

Plenty of CEOs drinking and operating companies; plenty of sociopaths, too. Those are the people planning to replace tens of millions of workers and crash a good chunk of the planet into a depression. I'll be impressed when the CEOs get replaced by AI.

Test circuits are by definition the opposite of the real world. The Pittsburgh taxis have drivers. The Ohio trucks have drivers. And they are crashing plenty; they're just not telling us about it.
