Comment I was thinking of "high end" in terms of (Score 1) 152

what consumers had access to by walking into a retail computer dealership (there were many independent white box makers at the time) and saying "give me your best."

You're probably right about me underestimating the graphics, though it's hard to remember back that far. I'm thinking 800x600 was much more common. If you could get 1024x768, it was usually interlaced (i.e. "auto-headache"), and it was rare, if I remember correctly, to get it with 24-bit color; S3's first 16-bit-capable chips didn't come out until late 1991, though I could be off.

SCSI was possible, but almost unheard of as stock; you either had to buy an add-on card and deal with driver/compatibility questions, or use one of the ESDI-to-SCSI bridge boards or similar. Same thing with ethernet, token ring, or any other dedicated networking hardware and stack. Most systems shipped with a dial-up "faxmodem" at the time, and users were stuck using Winsock on Windows 3.1, which was nontrivial to get working. Most of the time there was no real "networking" support in the delivered hardware/software platform; faxmodems were largely used for dumb point-to-point connections using dial-up terminal emulator software.

And in the PC space, the higher-end you went, the less you were able to actually use the hardware for anything typical. Unless you were a corporate buyer, you bought your base platform as a whitebox, then added specialized hardware matched with specialized software in a kind of 1:1 correspondence: if you needed to perform task X, you'd buy hardware Y and software Z, and they'd essentially be useful only for task X, or maybe for tasks X1, X2, and X3, but certainly not much else. The same is even true of memory itself. Don't forget this is pre-Windows 95, when most everyone was using Win16 on DOS. We can discuss OS/2, etc., but that again starts to get into the realm of purpose-specific and exotic computing in the PC space.

There were, as I understand it, a few verrry exotic 486 multiprocessors produced, but I've never even heard of a manufacturer and make/model for these, only the rumor that it was possible, so I doubt they ever made it into sales channels of any kind. My suspicion (correct me if I'm wrong) is that they were engineered for particular clients and particular roles by just one or two organizations, and delivered in very small quantities; I'm not aware of any PC software in the 1992 timeframe that was even multiprocessor-aware, or of any standard to which it could have been coded. The Pentium processor wasn't introduced until '93, and the Pentium Pro with GTL+ and SMP capabilities didn't arrive until 1995. Even in 1995, most everything was either Win16 or 8- or 16-bit code backward compatible to the PC/XT or earlier, and would remain that way until around the Win98 era.

The UNIX platforms were standardized around SCSI, ethernet, big memory access, high-resolution graphics, and multiprocessing and presented an integrated environment in which a regular developer with a readily available compiler could take advantage of it all without particularly unusual or exotic (for that space) tactics.

Comment Wow, end of an era. (Score 4, Interesting) 152

For more than just a couple of us here, I suspect, there was a time when "Sparc," "UNIX," "graphics," "Internet," and "science" were all nearly synonymous terms.

Simpler times. Boy did that hardware last and last and last in comparison to the hardware of today.

Well, I suppose it can finally no longer be said that the Sparcstation 10 I keep here just for old times' sake can still run "current Linux distributions." But it's still fun to pull it out for people, show them hundreds of megabytes of RAM, 1152x900 24-bit graphics, gigabytes of storage, multiple ethernet channels, and multiple processors, running Firefox happily, and tell them it dates to 1992, when high-end PCs were shipping with mayyybe 16-32MB RAM, a single 486 processor, 640x480x16 graphics, a few dozen megabytes of storage, and no networking.

It helps people to get a handle on how it was possible to develop the internet and do so much of the science that came out of that period, and why, even though I don't know every latest hot language, the late '80s/early '90s computer science program that I went to (entirely UNIX-based, all homework done using the CLI, vi, and gcc, emphasis on theory, classic data structures, and variously networked/parallelized environments, with labs of Sparc and 88k hardware all on a massive campus network) seems to have prepared me for today's real-world needs better than the programs a lot of younger colleagues went to, with lots of Dell boxes running Windows-based Java IDEs.

Comment As someone that works with data sets all the time, (Score 4, Informative) 144

here are my answers. Spreadsheets are used in several cases:

1) When you have a small-to-medium-sized dataset (small enough to fit comfortably in a spreadsheet) and want to do a particular set of calculations or draw a particular set of conclusions from it just once or twice, so that the time you'd invest in writing code in R or something similar would exceed the time needed just to bung a few formulas into a spreadsheet and get your results. Once you get into analyses or processes that will be repeated many times, it makes more sense to write code (see the sketch after this list).

2) Similar case, when you need to work with essentially tabular database data, but the operations you're performing (basic filtering, extracting records based on one or two criteria, just handing data from one person to the next) are either so simple or will be repeated so rarely that a MySQL database is overkill and just emailing a file back and forth is easier.

3) When you are working with data as a part of a team, and certain members of the team who are specialists in some areas related to the data, or (for example) members of the team who are doing your data collection, aren't particularly computationally expert. Spreadsheets aren't trivial for laymen, but they're learnable: a dozen or two hours of training and people can get a general, flexible grasp of spreadsheets and formulae. It takes a lot longer for someone to become basically proficient with R, MATLAB, MySQL, Python, etc., and you really want those specialists to just be able to do what they do to or with the data, rather than focusing their time and effort on learning computational tools. Spreadsheets are general-purpose and have a relatively shallow learning curve relative to lots of other technologies, but they enable fairly sophisticated computation to take place, if inefficiently at times. They're like a lowest common denominator of data science.
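To make case 1 concrete, here's a minimal sketch of the "write code once it repeats" tipping point, using pandas. The file name and columns (sales.csv, region, units, unit_price) are invented for illustration; they're not anyone's actual data.

```python
# A hedged sketch of case 1: once an analysis will be rerun many times,
# a few lines of code replace the spreadsheet formulas.
# "sales.csv" and its columns (region, units, unit_price) are placeholders.
import pandas as pd

df = pd.read_csv("sales.csv")
df["revenue"] = df["units"] * df["unit_price"]      # the "column formula"
summary = df.groupby("region")["revenue"].sum()      # the "pivot table"
print(summary)
```

Once something like this exists, rerunning the analysis on next month's file is trivial, which is exactly the point at which the spreadsheet stops being the cheaper option.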

We use spreadsheets all the time in what we do, mostly as a transient form. The "heavy hitting" and "production" work takes place largely in MySQL and R, but there are constant temporary/intermediate moments in which data is dumped out as a CSV, touches a bunch of hands that are really not MySQL- or R-capable, and then is returned in updated form to where it normally lives.
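For what it's worth, that CSV round trip looks roughly like the sketch below. File and column handling here are placeholders, not our actual schema, and the "production" store could just as easily be R or MySQL on either end.

```python
# A rough sketch of the hand-off: dump working data to CSV for colleagues who
# live in spreadsheets, then pull the edited file back in for the heavy lifting.
# File names are placeholders.
import pandas as pd

working = pd.read_csv("export_from_mysql_or_r.csv")   # data exported from the "production" store
working.to_csv("for_review.csv", index=False)          # hand this file to the spreadsheet users

reviewed = pd.read_csv("for_review.csv")               # ...and read their edits back
assert list(reviewed.columns) == list(working.columns), "columns changed during review"
# from here, `reviewed` goes back into MySQL/R, where the data normally lives
```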

Comment Re:Can email service providers do more? (Score 1) 58

Regarding your number 2... Frequently get tampered with in transit? Really? I have, literally, never seen this....

You're lucky there. I see such tampering several times per day, and fixing the problem often takes a lot of time (and sotto voce swearing ;-).

The reason is that I deal with a lot of data that's "plain text", but is computer data of some sort, not a natural language like English (which is sort of stretching the meaning of "natural", but you know what I mean). Or it's in a human language, but not English, and the character encoding uses some 2-byte or longer characters.

The simplest example is computer source code. The tampering is often caused by the "punch-card mentality" coded into a lot of email software, which often doesn't allow lines longer than 80 (or 72) characters, and inserts line feeds to make everything fit. Many programming languages consider line feeds to mean something different than a space, usually "end of statement". Inserting a line feed in the middle of a statement thus changes the meaning, and very often introduces a syntax error.
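Here's a tiny illustration of that failure mode. Python is used only because it's convenient to demonstrate; the same thing happens to C, SQL, config files, and so on when a mailer hard-wraps them.

```python
# Simulating a mail handler with a "punch-card mentality": it hard-wraps a long
# line, and the inserted line feed turns one statement into a syntax error.
original = "total_cost = unit_price * quantity + shipping_fee\n"

# Pretend the mailer wraps after column 36, which lands right after the '+'.
wrapped = original[:36] + "\n" + original[36:]

compile(original, "<email>", "exec")        # syntactically fine as sent
try:
    compile(wrapped, "<email>", "exec")     # the line now ends mid-expression
except SyntaxError as err:
    print("wrapped version no longer parses:", err)
```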

Even nastier is the munging of other plain-text data representations that mix letters and numbers. Inserting spaces or a line feed in the middle of a token like "G2EF" usually destroys the meaning in a way that can't be corrected automatically at the receiving end. Usually the way to handle such tampering is to reply to the sender, saying "Can you send me that in quoted-printable or base-64 form?" And you try to teach everyone in the group that such data should always be encoded in a form that's immune to the idiocies of "smart" email handlers.
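As a sketch of that advice from the sender's side (the addresses and payload below are made up, and this shows only the encoding step, not an actual send): letting the mail library base64-encode the body means intermediate handlers see only short base64 lines they have no reason to "fix".

```python
# A hedged sketch: encode the payload so "smart" email handlers can't re-wrap it.
# In Python's email package, MIMEText with a utf-8 charset transfers the body
# as base64 by default. Addresses and payload here are placeholders.
from email.mime.text import MIMEText

payload = "record G2EF 1291 0.0042\nrecord H7QX 1292 0.0051\n"   # fragile if sent as raw text

msg = MIMEText(payload, "plain", "utf-8")     # body is base64-encoded automatically
msg["Subject"] = "data, encoded so mailers leave it alone"
msg["From"] = "sender@example.com"
msg["To"] = "receiver@example.com"

print(msg.as_string())   # note the Content-Transfer-Encoding: base64 header
```

The receiver's mail software decodes it transparently, and nothing in between has a long line to wrap.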

Text in UTF-8 form, especially Chinese and Japanese text, is especially prone to this sort of tampering, which often leaves the text garbled beyond recovery.

Anyway, there are lots of excuses for such tampering with email in ways that destroy the content. It's not always for nefarious reasons; it's just because the programmers only tested their email-handling code on English-language text. And because they're idiots who think that lines of text should never be longer than 80 (or 72) characters.

Comment Re:Redirecting 127.0.0.1 (Score 1) 188

When I noticed that the address was the address of my machine, I did a quick find(1), but couldn't find the IMDB files or the takedown letter. Do you think I should contact Universal Pictures and ask them to send me another copy of the letter, so I can figure out which file to take down?

Actually, I noticed that all of our home machines (we have several, including tablets and smart phones) seem to have the same address. I guess that's to be expected, since ISPs only give us a single address, so we all have to use that silly NAT protocol and try to make sense of the confusion that it always creates. Anyway, I did look around on all of them, and still couldn't find anything with "Universal Pictures" inside. I did find a few files that contain "IMDB", but they're in the browsers' cache directories, and I got rid of those by simply telling the browsers to clear their cache(s).

But somehow I don't think this has taken care of the problem. So who should I contact at Universal Pictures to make sure we get a copy of the letter and purge our machines of their files?

(And for the benefit of many /. readers and mods, maybe I should end this with: ;-) Nah....

Comment Re:Profits are important to allocate resources (Score 1) 93

What rate of return would convince you to put your money in an investment if you knew it was going to be 10 years before you received the first dollar back - and there was a 90%+ chance of failure to boot?

Funny thing; those numbers were used back in the 1980s, with interesting results. The topic wasn't drugs, though, but rather solid-state manufacturing, and very similar numbers were widely quoted in east Asia. At the time, it was generally estimated that to build a new solid-state facility would require several billion dollars, and would take around a decade to become profitable, due to the extreme difficulty of achieving the required low level of contaminants inside the equipment. Much of the decade would be spent making test runs, discovering that the output was useless because of some trace contaminant in one part of the process, and redesigning the setup to get past yet another failure. Success wasn't predictable; the 10-year estimate was just the minimum.

But people in east Asia (mostly Japan and Korea) argued publicly that the American companies that controlled most of the production at the time wouldn't be able to get funding for new factories, because American investors would refuse to invest so much money in something with no payoff for a decade. If Asian investors would step in and support the effort, in 10 years they could own the world's solid-state industry. Enough people with money (including government agencies) listened, made the gamble, and a decade later, they owned the industry.

It's probably just a matter of time before the American drug industry goes the same way. Would you invest in something with no payoff for a decade or more, and that wasn't even guaranteed to pay off then, because nobody had yet created the drugs that might be created? If you guess that few US (or EU) investors will do this, you're likely right.

In particular, the Republican US Congress is highly likely to continue its defunding of academic basic research, partly due to mistrust of investments that won't pay off during their current terms in office, and also due to a serious religion-based dislike of the biological sciences in general. Without the basic research, the only "new" drugs patented by industry will continue to be mostly small tweaks of existing drugs, which under US law qualify as new, patentable products.

Of course, this is all a bunch of tenuous guesses, based on past behavior of the players. That's what investment is usually like. It's entirely possible that they'll wise up, and not abandon the drug industry the way they abandoned the electronics industry. The US does actually have a few solid-state production facilities, after all, though they're now a small part of the market.

But, as the above poster said, would you be willing to gamble your investment money on the hope that US private drug makers will support the research that the US government is getting out of? Remember that, to corporate management, scientific research appears to have a record of 90% failure; i.e., 90% of funded research projects fail to produce a patentable and marketable product. This is the nature of research, which only discovers facts and theories, not products, and where the outcome of a study is unpredictable before the fact. (If it were predictable, it wouldn't be called "research", it'd be "development". ;-)
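To put rough numbers on the question quoted at the top (my own illustrative figures, not anything from the post): with a 90% failure rate and a decade of waiting, the rare winner has to pay for all the losers plus the forgone interest.

```python
# Purely illustrative arithmetic; the 8% "safe" alternative return is an assumption.
p_success = 0.10        # 90%+ chance of failure, per the question above
years = 10              # a decade before the first dollar comes back
r_alternative = 0.08    # hypothetical return you could earn elsewhere instead

# To break even in expectation, a winning bet must return this multiple of the stake:
required_multiple = (1 + r_alternative) ** years / p_success
print(f"each success needs to return roughly {required_multiple:.0f}x")   # ~22x
```

A required multiple in that range is the backdrop for everything above about who is, and isn't, willing to fund the wait.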
