Comment Re:Too many pixels = slooooooow (Score 1) 263

Actually, there is no such thing as pure retina resolution; "retina" is only meaningful as a function of pixel density and viewing distance. So 4K on a 32" monitor at a 24" viewing distance is retina resolution, but the same 4K 32" monitor at a much shorter viewing distance is not.
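To put numbers on it: a common rule of thumb is that "retina" means roughly 60 pixels per degree of visual angle (about one pixel per arcminute, the limit of 20/20 acuity). A quick sketch of the geometry, with that ~60 px/deg threshold as an assumed cutoff:

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle at the given distance.

    One degree of visual angle spans 2 * d * tan(0.5 deg) inches on the screen.
    """
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

# 4K (3840x2160) on a 32" 16:9 panel works out to about 138 PPI.
ppi_4k_32 = math.hypot(3840, 2160) / 32

# At 24" this sits right around the commonly cited ~60 px/deg threshold...
print(pixels_per_degree(ppi_4k_32, 24))   # ~57.7
# ...while at 12" it falls far below it.
print(pixels_per_degree(ppi_4k_32, 12))   # ~28.8
```

Same panel, same pixel count — only the viewing distance changed the answer.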

Comment Re:BCD mode (Score 1) 140

Actually, it was very useful when the 6502 was introduced. Remember, computers were slow back then. Converting a binary number to decimal was especially slow, since it involves division with remainder in a loop, once for each digit produced, and the 6502 had no hardware multiply or divide instructions.
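In Python rather than 6502 assembly, that digit-extraction loop looks like this; each pass through the loop needs one division with remainder, which the 6502 had to grind out in software:

```python
def to_decimal_digits(n: int) -> str:
    """Convert a non-negative integer to a decimal string the way a CPU
    without divide hardware must: one division with remainder per digit."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 10)   # on a 6502, this divide is itself a loop
        digits.append(chr(ord('0') + r))
    return "".join(reversed(digits))   # digits come out least-significant first

print(to_decimal_digits(19411))  # "19411"
```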

Also — and this is still true today — if you do all your calculations in base 10 instead of binary, you get different (sometimes more desirable) rounding behavior. For example, financial calculations are almost always best done in base 10 rather than base 2. No self-respecting spreadsheet program does its financial arithmetic in base 2.
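Python's decimal module makes the difference easy to see — this is the same reason financial code prefers base 10:

```python
from decimal import Decimal

# 0.10 has no exact binary representation, so sums of cents drift:
print(sum([0.10] * 3))              # 0.30000000000000004
# Base-10 arithmetic keeps exactly the values people wrote down:
print(sum([Decimal("0.10")] * 3))   # 0.30
```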

Comment Re:BCD mode (Score 1) 140

BCD mode is useful when you are working with numbers that you frequently want to display to humans. That is, you do all the arithmetic in base 10 instead of binary. BCD is slower to work with than binary, but much faster to convert in and out of, since there's basically no conversion other than adding 0x30 (ASCII '0') to each nybble you want to display.
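A quick sketch of that display path: with packed BCD, each nybble already holds a decimal digit, so rendering a byte is just two adds, no divide loop anywhere:

```python
def bcd_byte_to_ascii(b: int) -> str:
    """Render one packed-BCD byte (two decimal digits) as text.
    Each nybble is already 0-9, so display is just 'add 0x30'."""
    hi, lo = b >> 4, b & 0x0F
    return chr(0x30 + hi) + chr(0x30 + lo)

print(bcd_byte_to_ascii(0x42))  # "42"
```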

Working in binary, on the other hand, requires costly conversion in and out of human-readable decimal. For example, converting decimal to binary requires a costly multiplication (by 10) for each digit consumed, and converting back from binary to decimal requires a costly division (by 10) for each digit produced.
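The input direction looks like this in Python; every digit consumed costs one multiply by 10, which on a 6502 is itself a shift-and-add loop:

```python
def parse_decimal(s: str) -> int:
    """Decimal text -> binary value: one multiply-by-10 per digit consumed."""
    n = 0
    for ch in s:
        n = n * 10 + (ord(ch) - 0x30)   # shift left one decimal place, add digit
    return n

print(parse_decimal("6502"))  # 6502
```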

So for things like scores in games, yeah, BCD is a nice thing.

Comment Re:BCD mode (Score 2) 140

Hmm, you know what? Damn. I misremembered how it worked, and I didn't read the link I provided closely enough. So, to stand corrected: the D flag was something you set (or cleared) proactively using the SED or CLD instruction. It indicated a mode you were in, rather than being a flag in the usual sense (such as the carry or zero flag).

Comment BCD mode (Score 3, Interesting) 140

BCD (Binary-Coded Decimal) mode was cool because it changed the way adding and subtracting worked. If you added 0x01 to 0x29, you'd get 0x30 instead of 0x2A. This was possible because there were actually two carry flags on the 6502 — one (named C) which was set upon overflow of values greater than 255, and the other (named D) which was set upon overflow of the low nybble (i.e., the low 4 bits).
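A rough Python sketch of the decimal-mode add described above (ignoring carry-in and the 6502's exact flag semantics, so this is an illustration, not a cycle-accurate model): whenever a nybble passes 9, it is adjusted by 6 so the carry propagates in base 10 instead of base 16.

```python
def adc_decimal(a: int, b: int) -> tuple[int, int]:
    """Add two packed-BCD bytes in decimal mode.
    Returns (result_byte, carry_out)."""
    lo = (a & 0x0F) + (b & 0x0F)
    half_carry = 1 if lo > 9 else 0
    lo = lo + 6 if lo > 9 else lo          # skip the hex digits A-F
    hi = (a >> 4) + (b >> 4) + half_carry
    carry = 1 if hi > 9 else 0
    hi = hi + 6 if hi > 9 else hi
    return ((hi & 0x0F) << 4) | (lo & 0x0F), carry

print(hex(adc_decimal(0x29, 0x01)[0]))  # 0x30, not 0x2a
```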

6502.org Tutorials: Decimal Mode

Comment Re:Gotta disagree strongly . . . (Score 1) 109

Watched Mind-Sifter. I liked it. They got the lighting and the camera style all wrong, though, whereas Star Trek Continues nails those things. Mind-Sifter did not at all feel to me like a continuation of 60s Trek — it felt more like a reboot. I appreciate that they tried, but they missed the mark. I did really enjoy the tip of the hat to "Nightmare at 20,000 Feet", though, which was playing on the B&W television. :)

Comment Re:Let's shit all over the customers (Score 1) 130

Because they don't really innovate anymore and most of what they do is a regression. The Mac Mini is a perfect example. It not only failed to advance but in some ways went backwards.

Ya, definitely disappointing there. But on the other hand, doesn't it have better performance per watt? I imagine that matters more to some people than raw CPU horsepower.

The newer OS upgrades are more about selling you some crap you don't want or need than about increasing productivity. Mountain Lion was the last OS that actually seemed like an improvement. My computer ran better with that installation, but Mavericks really seems sluggish, so much so that I wiped the drive and went back to ML.

Huh, I've noticed the opposite with Mavericks. I only recently (a month ago) upgraded, but I have noticed significantly better battery life with it — especially with Safari not chewing up as many idle CPU cycles.

I hate that upgrades are tied to the Apple Store now. Why???

Ya, that drives me nuts too. However, I think it should still be possible to extract a .dmg installation image after downloading the update and before installing. I did that with Lion in 2011: downloaded it once from the App Store and installed it on multiple systems from that disk image.

So many little things bother me whereas when I first installed OS X I found the little things to be where they excelled.

Yeah, that's a good point. Not much exciting anymore. And Yosemite looks like a huge step backward in the graphic design department. It's all ugly and flat. I don't look forward to being forced to upgrade to it someday.

I find myself using my Linux laptop more and more over the Mac for general computing use. For video work I still use it but now that I've got ffmpeg fixed from the avconv debacle I'm starting to work with Linux more in that area too. If only hardware manufacturers would support Linux more.

Understandable... But how does any of this mean that Apple is becoming less relevant in the computer world? I think most people out there just don't care or notice.
