Comment Re:a nice Linux trick (Score 1)

Swapping Caps and Escape has changed my life to the extent I would consider carrying little stickers around for all the keyboards I remap. Unsurprisingly, I'm a fairly heavy vim user...

If you swap caps with control, then in vim you can use Control-[ for escape, which is also a pretty good option. (That's what I do.)

On Gnome 3, I have recently switched from using xmodmap to setting org.gnome.desktop.input-sources.xkb-options to ['caps:swapescape'] in dconf-editor. Much easier.
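
For anyone who'd rather stay in a terminal than click through dconf-editor, the same key can be set with gsettings (this assumes the stock caps:swapescape option; substitute whichever option you want):

    gsettings set org.gnome.desktop.input-sources xkb-options "['caps:swapescape']"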

The problem I had was I wanted to map the CapsLock key to Control, and the LeftCtrl key to Hyper. There are built-in xkb rules for doing both, but they don't work right when you try to combine them, and the documentation available for xkb is hard to understand, to say the least. I finally got it working by hacking one of the rules definitions in the system-provided definition files to do what I wanted -- I gave up trying to figure out how to do it the "right" way.
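
For anyone attempting the same combination, a pure-xmodmap sketch is below. The keycodes assume a standard evdev layout (66 = Caps Lock, 37 = Left Ctrl; verify yours with xev), and hanging Hyper on Mod3 assumes that modifier slot is empty on your setup:

    ! ~/.Xmodmap sketch: Caps Lock -> Control, Left Ctrl -> Hyper
    ! detach the old keysyms from their modifier sets first
    remove Lock = Caps_Lock
    remove Control = Control_L
    ! remap at the keycode level so the two swaps can't clobber each other
    keycode 66 = Control_L
    keycode 37 = Hyper_L
    ! reattach: Control as Control, Hyper on the usually-empty Mod3
    add Control = Control_L
    add Mod3 = Hyper_L

(The usual xmodmap caveats about hotplugging and Gnome apply, of course.)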

Comment Re:a nice Linux trick (Score 1)

Unfortunately it's hard to get xmodmap to play nicely in cases like hotplugging keyboards or using Gnome, at least under Linux. There is a standard option for swapping caps lock and control (or escape, or several other similar options), but if you want a layout option that isn't built-in, then prepare yourself for a world of frustration.
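
The built-in options, at least, are a one-liner, e.g.:

    setxkbmap -option ctrl:swapcaps    # or caps:swapescape, caps:escape, etc.

It's the moment you want something not on that list that the frustration starts.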

Comment SubjectsSuck (Score 3, Interesting)

I can invest in a real alternate keyboard with a different layout...

You could also invest in a tool to remap the key next to 'A' to "Control," like God intended. You don't need to get a whole new keyboard. Write on the key with a marker if you're the kind of person who looks at the key labels while typing.
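
On X11 such a tool is a single command; assuming you want Caps Lock to simply become another Control:

    setxkbmap -option ctrl:nocaps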

Comment I was thinking of "high end" in terms of (Score 1)

what consumers had access to by walking into a retail computer dealership (there were many independent white box makers at the time) and saying "give me your best."

You're probably right about me underestimating the graphics, though it's hard to remember back that far. I'm thinking 800x600 was much more common. If you could get 1024x768, it was usually interlaced (i.e. "auto-headache"), and if I remember correctly it was rare to get it with 24-bit color—S3's first 16-bit-capable chips didn't come out until late 1991, though I could be off.

SCSI was possible, but almost unheard of as stock; you either had to buy an add-on card and deal with driver/compatibility questions, or use one of the ESDI-to-SCSI bridge boards or similar. Same thing with ethernet, token ring, or any other dedicated networking hardware and stack. Most systems shipped with a dial-up "faxmodem" at the time, and users were stuck using Winsock on Windows 3.1, which was nontrivial to get working. Most of the time there was no real "networking" support in the delivered hardware/software platform; faxmodems were largely used for dumb point-to-point connections using dial-up terminal emulator software.

And in the PC space, the higher-end you went, the less you were able to actually use the hardware for anything typical. Unless you were a corporate buyer, you bought your base platform as a whitebox, then added specialized hardware matched with specialized software in a kind of 1:1 correspondence—if you needed to perform task X, you'd buy hardware Y and software Z, and they'd essentially be useful only for task X, or maybe for tasks X1, X2, and X3, but certainly not much else. The same was even true of memory itself. Don't forget this is pre-Windows 95, when most everyone was using Win16 on DOS. We can discuss OS/2, etc., but that again starts to get into the realm of purpose-specific and exotic computing in the PC space.

There were, as I understand it, a few verrry exotic 486 multiprocessors produced, but I've never even heard of a manufacturer and make/model for these—only the rumor that it was possible—so I doubt they ever made it into sales channels of any kind. My suspicion (correct me if I'm wrong) is that they were engineered for particular clients and particular roles by just one or two organizations, and delivered in very small quantities. I'm not aware of any PC software in the 1992 timeframe that was even multiprocessor-aware, or any standard to which it could have been coded.

The Pentium processor wasn't introduced until '93, and the Pentium Pro with GTL+ and SMP capabilities didn't arrive until 1995. Even in 1995, most everything was either Win16 or 8- or 16-bit code backward compatible to the PC/XT or earlier, and would remain that way until around the Win98 era.

The UNIX platforms were standardized around SCSI, ethernet, big memory access, high-resolution graphics, and multiprocessing, and presented an integrated environment in which a regular developer with a readily available compiler could take advantage of it all without particularly unusual or exotic (for that space) tactics.

Comment So funny, but yeah, totally true. (Score 1)

The 386 box that I installed Linux on my first time around was 4MB (4x1MB 30-pin SIMMs). 4MB! I mean, holy god, that's tiny. It seemed sooooo big compared to the 640KB of 8-bit-era PCs, and yet it's basically the same order of magnitude. Not even enough to load a single JPG snapshot from a camera phone these days.

Comment Wow, end of an era. (Score 4, Interesting)

For more than just a couple of us here, I suspect, there was a time when "Sparc," "UNIX," "graphics," "Internet," and "science" were all nearly synonymous terms.

Simpler times. Boy did that hardware last and last and last in comparison to the hardware of today.

Well, I suppose it can finally no longer be said that the Sparcstation 10 I keep here just for old times' sake can still run "current Linux distributions." But it's still fun to pull it out for people, show them hundreds of megabytes of RAM, 1152x900 24-bit graphics, gigabytes of storage, multiple ethernet channels, and multiple processors, running Firefox happily, and tell them it dates to 1992, when high-end PCs were shipping with mayyybe 16-32MB RAM, a single 486 processor, 640x480x16 graphics, a few dozen megabytes of storage, and no networking.

It helps people to get a handle on how it was possible to develop the internet and do so much of the science that came out of that period—and why, even though I don't know every latest hot language, the late-'80s/early-'90s computer science program I went to (entirely UNIX-based, all homework done using the CLI, vi, and gcc, with an emphasis on theory, classic data structures, and variously networked/parallelized environments, on labs of Sparc and 88k hardware all on a massive campus network) seems to have prepared me for today's real-world needs better than the programs later grads went to, with their Dell boxes running Windows-based Java IDEs.

Comment As someone that works with data sets all the time, (Score 4, Informative)

here are my answers. Spreadsheets are used in several cases:

1) When you have a small-to-medium-sized dataset (100m data points) and want to do a particular set of calculations or draw a particular set of conclusions from it just once or twice—so that the time invested in writing code in R or something similar would exceed the time needed to just bung a few formulas into a spreadsheet and get your results. Once you get into analyses or processes that will be repeated many times, it makes more sense to write code.

2) Similar case, when you need to work with essentially tabular database data, but the operations you're performing (basic filtering, extracting records based on one or two criteria, just handing data from one person to the next) are either so simple or will be repeated so rarely that a MySQL database is overkill and just emailing a file back and forth is easier.

3) When you are working with data as part of a team, and certain members of the team who are specialists in some areas related to the data, or (for example) members who are doing your data collection, aren't particularly computationally expert. Spreadsheets are hard for laymen, but it's doable—a dozen or two hours of training and people can get a general, flexible grasp of spreadsheets and formulae. It takes a lot longer for someone to become basically proficient with R, MATLAB, MySQL, Python, etc., and you really want those specialists to just be able to do what they do to or with the data, rather than spending their time and effort on learning computational tools. Spreadsheets are general-purpose and have a relatively shallow learning curve compared to lots of other technologies, yet they enable fairly sophisticated computation to take place—if inefficiently at times. They're like a lowest common denominator of data science.

We use spreadsheets all the time in what we do, mostly as a transient form. The "heavy hitting" and "production" work takes place largely in MySQL and R, but there are constant temporary/intermediate moments in which data is dumped out as a CSV, touches a bunch of hands that are really not MySQL- or R-capable, and then is returned in updated form to where it normally lives.
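
A minimal sketch of that round trip, with hypothetical names throughout (analyst, surveydb, observations); mysql's batch output is tab-separated rather than comma-separated, but spreadsheets open it just the same:

    # dump a table as tab-separated text that any spreadsheet opens cleanly
    mysql -B -u analyst -p surveydb -e "SELECT * FROM observations" > observations.tsv
    # ...the file makes its rounds through the spreadsheet crowd...
    # reimport, skipping the header row; --replace updates rows on duplicate keys
    mysqlimport --local --replace --ignore-lines=1 -u analyst -p surveydb observations.tsv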

"And do you think (fop that I am) that I could be the Scarlet Pumpernickel?" -- Looney Tunes, The Scarlet Pumpernickel (1950, Chuck Jones)

Working...