[...] early Apple III computers where heat would cause chips to expand out of their sockets, [...]
“It’s not wise to upset an Apple III.”
“But sir...no one worries about upsetting a Droid.”
“That’s ’cause a Droid don’t cause people’s chips to expand out of their sockets. Apple IIIs have been known to do that.”
“I suggest a new strategy, Artoo. Let the Apple III win.”
That said, in my own personal code, I never use tabs.
Pretty much. And the issue is that tabs gone wrong only become visible once the code is viewed by someone with different settings.
Not exactly true. You can pretty easily write a tool that scans a source module for problematic mixing of tabs and spaces. Just require that all changes pass that scan before they are allowed to be checked in.
Which only matters if all indentation, including alignment, is done with tabs. The moment you throw in a few spaces to line something up on a non-tab boundary (say, to align a second line of arguments with the first argument), then you have a mess, unless your tab width is set to exactly the value used by whoever touched the code before you.
What?? Nonsense. Using spaces to line something up on a non-tab boundary is exactly what avoids a mess, not creates it.
To use tabs in code, with zero problems whatsoever, follow these simple rules:
1. Use tabs only for indentation, never for alignment.
2. Tabs may never appear anywhere in the source code except as a contiguous sequence of zero or more tabs at the beginning of a line.
3. Use spaces for alignment, never for indentation.
4. Spaces may follow tabs, but tabs may never follow spaces.
All the lines in your module should match the following regex: ^\t*[^\t]*$ (any number of leading tabs, then no tabs anywhere else on the line).
If you have a for (...) loop that splits across three lines, there should be n tabs leading up to the for, and then on each of the following two lines there should be n tabs followed by 5 spaces (the width of "for ("), so the continuation lines stay aligned no matter what tab width anyone uses.
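The check-in scan proposed earlier in the thread is easy to sketch. Here is a minimal, hypothetical Python version (names are my own) that flags any line where a tab appears after a non-tab character, which is exactly the mixing the rules above forbid:

```python
import re

# The rules above collapse into a single pattern: a legal line is a run of
# zero or more leading tabs, followed by anything that contains no tabs.
LINE_OK = re.compile(r"^\t*[^\t]*$")

def bad_lines(source):
    """Return 1-based numbers of lines that mix tabs and spaces illegally."""
    return [n for n, line in enumerate(source.splitlines(), 1)
            if not LINE_OK.match(line)]
```

Wire something like bad_lines into your pre-commit hook and reject any check-in for which it returns a non-empty list.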
4K is the limit of human visual perception.
Complete and utter BS.
The limit of visual perception depends entirely on the viewing distance and on pixel density.
Also (and this is still true today), if you do all your calculations in base 10 instead of binary, you get different, and sometimes more desirable, rounding behavior. For example, financial calculations are almost always best done in base 10 rather than base 2. No self-respecting spreadsheet program does its financial arithmetic in base 2.
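As a quick illustration of the rounding difference (not part of the original post), Python's standard decimal module shows why base 10 is preferred for money:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.10 exactly, so summing cents
# accumulates error; base-10 arithmetic keeps the result exact.
binary_total = sum(0.10 for _ in range(3))
decimal_total = sum(Decimal("0.10") for _ in range(3))

print(binary_total == 0.3)               # False: accumulated binary error
print(decimal_total == Decimal("0.30"))  # True: exact in base 10
```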
Working in binary, on the other hand, requires costly conversion in and out of human-readable decimal. For example, converting decimal to binary requires a costly multiplication (by 10) on each digit consumed, and converting back to decimal from binary requires a costly division (by 10) on each digit produced.
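That per-digit cost is easiest to see in the classic conversion loops. A Python sketch (function names are mine, just for illustration):

```python
def decimal_to_binary(s):
    """Parse decimal digits into a machine integer: one multiply-by-10
    and one add per digit consumed."""
    value = 0
    for ch in s:
        value = value * 10 + (ord(ch) - ord("0"))
    return value

def binary_to_decimal(value):
    """Produce decimal digits from a machine integer: one divide-by-10
    (quotient and remainder) per digit produced."""
    if value == 0:
        return "0"
    digits = []
    while value > 0:
        value, r = divmod(value, 10)
        digits.append(chr(ord("0") + r))
    return "".join(reversed(digits))
```

On a chip with no hardware multiply or divide, like the 6502, each of those per-digit operations is genuinely expensive, which is what made BCD attractive.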
So for things like scores in games, yeah, BCD is a nice thing.
6502.org Tutorials: Decimal Mode
ST-C absolutely dead-on nails the look and feel of ST:TOS in every way.
Seconded. Other than the actors being different, it really, honestly, truly feels like you're watching lost episodes of ST:TOS straight out of the 1960s.