
Comment Re:Too little at any time (Score 2) 138

This. School isn't about making experts in the subjects, there's simply no time for that. It's about enough exposure to different subjects so you can (a) find your own thing, and (b) get some idea of the wide and diverse world you'll be living in.

Incidentally, I'm about to teach a small course/workshop in algorithmic art at a local school. I'm not expecting all of them to become algorithmic artists, but I hope they'll learn something about using math and code to express their ideas.

Comment Re:Practical? (Score 1) 139

The bit change is not necessary for computation at all from an information-theory perspective. Theoretically, no energy is needed for any computation. Whatever you can do with an active circuit can be done with a passive circuit (e.g. your camera lens can be used for an FFT). Energy is only needed for reading information. So no matter how complex the cryptography is, the theoretical energy required to decrypt is zero.

Yes and no. In my understanding as a physicist, bit flipping per se is free, but you need a minimum of kT ln 2 of energy to destroy a bit of information (create entropy); that's Landauer's principle. To avoid destroying information during computation, you basically need to store every step you take, so that the operation becomes reversible (google "reversible computing" for more). This is not usually practical, so most computing does suffer from the kT ln 2 limit per bit operation.
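To put numbers on this, here's a back-of-the-envelope sketch of the Landauer limit at room temperature (my own illustration, not from the original comment; the 10^9 bits/s figure is an arbitrary example rate):

```python
# Landauer's principle: the minimum energy to erase one bit of
# information is k*T*ln(2), where k is Boltzmann's constant and
# T is the absolute temperature.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact since the 2019 SI)
T = 300.0           # room temperature, K

e_bit = k_B * T * math.log(2)   # joules per erased bit
print(f"{e_bit:.3e} J per erased bit")   # ~2.87e-21 J

# Erasing 10^9 bits per second at this limit dissipates only ~3 pW,
# many orders of magnitude below what real CPUs burn today.
power = e_bit * 1e9
print(f"{power:.2e} W at 1 Gbit/s of erasures")
```

Real gates dissipate far more than this per operation, which is why the limit is of theoretical rather than practical interest today.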

The lens example is valid IMHO, as the Fourier transform is reversible (and there are similar integer transforms that stay bit-exact, if you're worried about floats). But to make that practical, you need to store all that information somewhere.
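The reversibility claim is easy to check numerically: running the inverse transform on the transformed signal recovers the input up to floating-point error, so no information is lost. A minimal pure-Python sketch of the DFT and its inverse (my own example, not from the comment):

```python
# Discrete Fourier transform and its inverse: idft(dft(x)) == x
# up to floating-point error, demonstrating that the transform
# destroys no information.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * j / n)
                for k in range(n)) / n
            for j in range(n)]

signal = [1.0, 2.0, 3.0, 4.0]
roundtrip = idft(dft(signal))
print(all(abs(a - b) < 1e-9 for a, b in zip(signal, roundtrip)))  # True
```

This naive version is O(n^2); the FFT computes the same result in O(n log n), but the invertibility is the same.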

Comment Re:Unix-like directories and Go whining? Stop it. (Score 1) 58

Well Brian, to wrap your head around things you can relate to: better toss that MacBook you authored your article on (a BSD variant with a Unix-like directory structure), stop watching Netflix (hosted on Linux and some distributed POSIX-friendly Unix-like filesystem), and don't put anything on Dropbox anymore (likewise hosted on Linux and a distributed POSIX-friendly Unix-like filesystem). Get my point? Stop whining. Just because it's over your head doesn't mean it's over anyone else's.

Also, try using the web with URLs like http:\\backslashdot.org\ to avoid the Unixy feel.

Comment Re:Too many cores. (Score 1) 77

They've taken a crappy, underpowered chip that was trimmed to the bone to try and make something that competes with Arm, and are hacking on extras to make it sound more like a Xeon.

So it's like taking the Pentium III and hacking on extras from the Pentium 4 (the actual innovations around the core, not the GHz race) to make the Pentium M, then putting several of those on a single die to make the Core series? Not a bad idea.

Or could this be Intel's trick, that they've taken a Core 2 Mobile CPU, scraped off the Penryn label, reprinted it as Atom++, and are shipping those?

I think this already happened a while ago, in a way. For instance, the original Atoms didn't have out-of-order execution, but the later ones do: https://en.wikipedia.org/wiki/... It looks a bit like the Pentium brand that lives on as the low end of Cores.

BTW, I have one of the earlier in-order Atoms running happily in a server-ish machine where GPUs do all the heavy lifting. It's perfect for the job, and I guess more Atom cores would be great for a lot of server tasks, at least given enough I/O. Ideally, something like ARM or MIPS would probably be even better, but good luck finding (a) a suitable mobo with all the PCIe slots and (b) AMD/Nvidia binary drivers.

Comment Re:Some hints (Score 1) 118

(1) If you are near-sighted (which I am), have your prescription *slightly* detuned, so it isn't perfect. Mine is detuned by, I think, around 0.25 diopters. This reduces eye strain by a HUGE amount. You won't be able to read highway signs from far away, but who needs to do that any more with GPS nav?

Ah, I was just posting about this below, so let me ask: why not have separate glasses for computer work?

Comment What about optical power? (Score 3, Informative) 118

I'm myopic, and I often read books without glasses, but the computer screen is a little too far for that. So I sometimes find it easier to use my old glasses for computing, compared to my regular glasses with a stronger correction. Around here, "computer glasses"* refer to glasses with the optical power optimized for screen distances. It's something you can get from your employer as a health benefit if you work at a screen all day.
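The underlying optics is simple: the lens power needed to focus at a distance d is 1/d diopters, which is why a screen at arm's length wants a different correction than a book or a highway sign. A quick illustration (my own sketch; the distances are typical examples, not from the comment):

```python
# Lens power required to focus at a given distance: P = 1/d,
# with d in metres and P in diopters. Shows why "computer glasses"
# sit between reading glasses and a full distance prescription.
def power_diopters(distance_m):
    return 1.0 / distance_m

print(f"book at 0.40 m:   {power_diopters(0.40):.2f} D")   # 2.50 D
print(f"screen at 0.70 m: {power_diopters(0.70):.2f} D")   # 1.43 D
print(f"sign at 10 m:     {power_diopters(10.0):.2f} D")   # 0.10 D
```

The roughly one-diopter gap between book and screen distance is the difference a screen-optimized prescription compensates for, and it also puts the 0.25-diopter detuning mentioned above into perspective.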

I also use redshift on Linux to tone down the blue component of the screen at night, but that's a completely orthogonal issue. Plus, if you're worried about computing ruining your sleep, there's also the psychological buzz, so I'm not sure which factor dominates in practice.

*(One common term is "päätelasit" meaning "terminal glasses", not necessarily because you're so old they're the last glasses you'll ever need, but because our computing term-inology is ancient and we still think in terms of terminals.)
