Comment High-priced is one key. (Score 1) 469

Sure, the hardware was pretty cool, but the prices were just really, really high.

I remember a call to Sun where I asked about a SunOS license for a used 3/80 I was thinking of buying from the department. It was going to run me something like $3,000 for the OS, or $5,000 with the development environment, and that was quoted to me as the *academic* price; on top of that, no matter which delivery media I wanted, I would have had to buy additional hardware to read it, and so on. I mean, that's $5,000 to $8,000+ in today's cash for what was already an old, low-end Unix system at the time. I remember pleading with the person on the other end of the line to help me brainstorm and find other options, since I was a CS student and needed to be able to do my homework at home and all of that, but of course they just felt like I was tying up the line—I looked sort of ridiculous from their perspective.

And then they told me that it was really too old to be useful, that what I wanted was an IPC or an IPX, I forget which, and the prices for those were well into five digits, again at the *academic* price for a bare-bones configuration. And here I am, a broke CS undergrad already struggling to pay $4k/year in tuition at a state school. It was a total non-starter.

Meanwhile, the first purpose-specific Linux box that I assembled (as my Linux excitement grew and I knew I wanted to do a dedicated build) was a 386/40 with 8MB RAM and about 1GB of ESDI storage, along with a Tseng ET4000 VGA card. It was all used gear, again bought through surplus channels, but the thing was that it got me beyond what the 3/80 would have provided in terms of performance, was perfectly serviceable at the time as an X+development box, and cost me a total of like $200. That was just plausible.

I built a sync converter circuit to connect an old Tektronix 19" fixed-sync color monitor to the VGA port and felt like I had a real, honest-to-god Unix workstation for $300, with a very competent development environment and Emacs, NCSA Mosaic, etc., rather than having had to spend 10x that much for less in the end.

People talk about *BSD, but the driver support on *BSD wasn't as good even a few years later when I looked at it. Linux supported crap hardware in addition to great hardware, which sounds bad until you realize it means that any broke student could scrounge around in the boardbucket and put together a fairly decent Linux system for peanuts.

Comment To each his own (servers). (Score 1) 469

I didn't have access to that tier of internet service at home at the time—I had UUCP and a Fido tosser, both of which dialed out over a modem once a day for a multi-megabyte sync.

Sun 3/4, HP 9000, and some DEC boxes were all in use in the CS labs at my school in that period, though they were getting old. So I had ftp access there, but there was no removable storage on those machines, no direct way to reach them from my dial-up, and my account quota on the NFS server was pretty low, just enough to hold a bunch of C code for class and the binaries it produced, so space was pretty precious. For those reasons, it never occurred to me to ftp or gopher around at school looking for software to take home. Plus, lab access was limited, and you really wanted to spend your time there doing your homework problems and getting them to compile and run, not dinking around looking for freeware.

Minix I read about on Usenet and tracked down some binaries, IIRC. But it wasn't impressive. Every now and then I'd hear something about *BSD, but it really wasn't clear how to get my hands on it, and nobody could really tell me. The people "in the know" as far as I knew were in the CS department and they were using the commercial unices and knew those pretty well.

In the Fido and Usenet groups I was in, Linux was the thing that turned up often and easily. Maybe I was reading the wrong groups, who knows. There were a hell of a lot of them in those days, if you recall, and it was all pretty noisy.

But Linux seemed then and over the several years that followed to come up over and over again. Others—not so much. It is what it is.

Comment Single case anecdote. (Score 4, Interesting) 469

I had been trying to afford a Unix installation at home as a CS student. All I knew was the Unix vendors. I was not aware of the social structure of the Unix world, various distributions, etc. I was crawling university surplus lots and calling Sun and DEC on the phone to try to find a complete package that I could afford (hardware + license and media). Nothing was affordable.

I was also a heavy BBS and UUCP user at the time, over a dial-up line. One day, I found an upload described as a "free Unix." It was Linux.

I downloaded it, installed it on the 80386 hardware I was already using, and the rest is history. This was 1993.

So in my case at least, Linux became the OS of choice because it had traveled in ways that the other free Unices didn't. It was simply available where I was.

This isn't an explanation for why Linux ended up there instead of some other free *nix, of course, but by way of describing the social diffusion of the actual files: I saw Linux distros circulating as floppy-disk sets on BBSs and newsgroups for several years, with no hint of the others.

For someone with limited network access (by today's standards), this meant that Linux was the obvious choice.

As to why Linux was there and not the others—perhaps packaging and ease of installation had something to do with it? Without much effort, I recognized that the disks were floppy images and wrote out a floppy set. Booted from the first one, and followed my nose. There was no documentation required, and it Just Worked, at least as much as any bare-bones, home-grown CLI *nix clone could be said to Just Work.

I had supported hardware, as it turned out, but then Linux did tend to support the most common commodity hardware at the time.

My hunch is that Linux succeeded because it happened to have the right drivers (developed for what people had in their home PCs, rather than what a university lab might happen to have) and the right packaging (an end-user-oriented install that made getting it up and running a simple, step-by-step, floppy-by-floppy process), while the other free *nix systems were less able to go from nothing to a working system without help, and without additional hardware, for most home and tiny-lab users.

For comparison, I tried Minix around the same time (I can't remember if it was before or after) and struggled mightily just to get it installed, before questions of its capabilities were even at issue. I remember my first Linux install taking an hour or two, and I was able to get X up and running the same day; it took me much longer to download and write the disks than to install from them. Minix, by comparison, took about a week of evenings, and at the end, I was disappointed with the result.

Comment Re:Rely on the counterfactual. (Score 1) 211

Yes, in practice it's usually a mix of the two, so the principle is more an abstract model than an argument about real, concrete thresholding.

But the general idea is that if someone continues in the job they are in without being promoted for an extended period of time, they are likely not amongst the highest-merit individuals around for that particular job and responsibility list—because if they were, they'd have been promoted, or would have moved to another job elsewhere that offered the equivalent of a promotion.

Comment Rely on the counterfactual. (Score 5, Informative) 211

The best way to understand the principle is to imagine the counterfactual.

When does a person *not* get promoted any longer? When they are not actually that great at the position into which they have most recently been promoted. At that point, they do not demonstrate enough merit to earn the next obvious promotion.

So, the cadence goes:

Demonstrates mastery of title A, promoted to title B.
Demonstrates mastery of title B, promoted to title C.
Demonstrates mastery of title C, promoted to title D.

Does not manage to demonstrate mastery of title D = is not promoted, and stays at that level indefinitely as "merely adequate" or "maybe next year" or "still has a lot to learn."

That's the principle in a nutshell—when you're actually good at your job, you get promoted out of it. When you're average at your job, you stay there for a long time.

Comment You are describing engineering on public works (Score 1) 634

projects and in academic research, which are an already tiny, extremely competitive, and ever-smaller part of the general pool of engineering labor.

Most of what engineers do is in the broader consumer economy: engineering objects, systems, etc. that people who are already amongst the world's wealthy (i.e. consumers in the largest national economies) don't really need, in order to enrich still wealthier people who don't really need any more enrichment.

I may be a woman underneath it all, because despite being gainfully employed in a high-skill position that makes use of my Ph.D., I can't stand my job, which is all about making stuff with little direct bearing on daily life in order to make rich people even richer. Yet of course it is taken deadly seriously by everyone in the company, and there is a general disdain for and scoffing at "causes," like, say, preventing climate change, expanding human knowledge and capability, or helping to address the massive wealth inequality on the planet.

Comment He's right, but the conclusion may require nuance. (Score 2) 145

Here's the thing—we may not actually want every otherwise unmotivated late teen to be sitting dubiously through college courses just because it's either that or go back to their dorm and twiddle their thumbs. Some things:

- There is an oversupply of graduates these days in most fields and at most levels
- A dawdling, unmotivated student is not doing their highest-quality learning
- Even students that will eventually use what they learn may not do so for years
- In the meantime, what they learned is getting very rusty between learning and use

So with these things said, *how about* a model in which:

- People are not motivated to learn something until they need to
- Once they need to, they are happy to blast through it intensely
- And they will put it to use right away
- And their motivation comes from needs (for a raise, to be competitive, etc.)

I would think this would help to mitigate some of the particular supply/demand problems on all sides (for an education/for students/for graduates as employees).

The one caveat, and it's an important one, is that we do of course want people to be generally mature, thoughtful, capable, and culturally literate if they are going to be participating in society, and right now high schools are failing utterly at even touching these points.

So to address that need, let's just require a minimal level of "general" college-level education, say a one-year or two-year degree that has no "major" or "minor" selections and issues no grades, but certifies literacy about politics/citizenship, social science (particularly social problems), national culture, basic quantitative reasoning, and so on—enough to become a careful thinker and to better understand "how to learn stuff."

This general education certification would be required in order to:

- Vote
- Get a business license
- Sit on a corporate board

It would, however, be disconnected from particular vocational or other subject-oriented learning, which would be delivered via, say, MOOCs as well as face-to-face alternatives. And instead of a major in a single discipline, outcomes from MOOC courses could be used to calculate a nationally databased and relatively involved (many measures) "bar chart" for each student, one that tallied their experience and competence in particular subject areas, expressed quantitatively as a figure without an upper bound, added to with each additional course, and perhaps incorporating quantitative feedback about their performance from employers as well:

So instead of wanting someone with a 4-year degree and a "major" in computer science, employers could seek someone with their general education certification along with "at least a 1400 in OS design, a 650 in Java, and a 950 in medical organizations and systems," and so on.

Over the course of a lifetime, scores in any particular area could continue to increase, either by taking additional MOOCs to get more exposure, or by having employers report on accumulated skills and experience to the system.
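
Purely as a sketch of the bookkeeping I have in mind (the subject names, point values, and employer weighting below are hypothetical placeholders, not a real scoring scheme), it would amount to little more than an append-only per-subject tally:

    # Hypothetical ledger of per-subject education/experience scores.
    # Scores have no upper bound and only ever accumulate.
    from collections import defaultdict

    scores = defaultdict(float)

    def add_course(subject, points):
        """Credit completed coursework (e.g. a MOOC) toward a subject area."""
        scores[subject] += points

    def add_employer_feedback(subject, rating, weight=0.5):
        """Fold in quantitative employer feedback, weighted more lightly."""
        scores[subject] += rating * weight

    add_course("OS design", 1400)
    add_course("Java", 650)
    add_employer_feedback("OS design", 200)

    # An employer screening for "at least a 1400 in OS design" just checks:
    print(scores["OS design"] >= 1400)  # True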

So someone who took only a few courses in X in school but who, in the real world and on the job, became—over 20 years—the best X in the country would have this gradually reflected in their national education/experience scores as the years of experience and successes mounted.

Meanwhile, we'd also no longer have the weird mismatches that come when an employee has a degree in Y but actually works in Z, and then has to explain this in various ways to various parties. First of all, at the level of the 1-or-2-year general education, they would no longer get a "degree in" Y. That would be handled by MOOCs and represented in the various numbers that increased as a result of completing them.

But if someone did do an about-face and chose an entirely different subject or work area in life, this would also gradually be reflected in their education/experience scores. We'd know when someone who'd studied chemistry in their 20s finally became a "real biologist" because their scores in biological areas would begin to overtake their scores in chemical areas, and so on.

Of course, none of this is plausible, because social systems simply don't work this way. But (after a long digression) getting back to the point: it's not all that bad that a MOOC isn't the same as forcing someone in their late teens or early 20s to sit in a classroom and drag their feet through a BA/BS.

Comment A social scientist translating for them (Score 2, Informative) 442

What they're trying to say, using the usual feminist-sociology over-loquaciousness, is this:

For some on the planet, keeping it under 2 degrees will preserve a relatively familiar or at least acceptable quality of life.

For others on the planet, quality of life can only be preserved by keeping it under, say, 1.5 degrees, or even one degree.

The first group (those who can live with the higher threshold) are in the upper portions of the global economic scale, and it's an acceptable rise for them because they can also afford technologies and tools (to put it crudely: air conditioners, new home materials, new kinds of agricultural output, etc.) that make a 2 degree rise tolerable.

The second group (those who can't live with the 2 degree threshold and really need a lower one) will tend to be in the lower portions of the global economic scale, without access to the technologies and tools that make a 2 degree rise livable for those at the top of the scale.

Policymakers and scientists tend, by virtue of their privileged position, to be in the first group, and have thus set the 2 degree threshold with their own, best-case lifestyles in mind, rather than—say—those of the globe's largely impoverished equatorial populations, who already have little access to resources, tools, or technologies.

It's a good point: the effects are not uniform, and if 2 degrees is the upper bound for the people who are the globe's *most* comfortable, then it's probably a bad upper bound in general, because it will "cook" (even more than already occurs) those that are the *least* comfortable.

It was, however, expressed with poor language and clarity—a sin that social science commits far too often.

Still, their point is well taken.

Comment I'll join the chorus: Mac. (Score 1) 385

Get a nicely configured MBP and be done with it.

It's the most common platform in research and academic settings for individual use these days, which means that there is a social dimension to the available support (i.e. people around you can help with problems). Meanwhile, the platform is narrow enough and the OS and hardware tightly bound together enough that one-off bugs and edge cases are exceedingly rare (which is not the case for Linux).

And Apple has very reasonable quality control in both hardware and software.

Having done a Ph.D. and dealt with the pressures and complexities that come therewith, I'd say that the overriding concerns should be reducing the PITA factor, keeping downtimes short, eliminating unexpected behavior and gotchas to whatever extent possible, and buying in to the largest on-the-ground support network (i.e. installed customer base) that you can find with identical hardware/software.

All of these things point to Mac for academic research settings.

Comment You're right and wrong. (Score 2) 320

You're absolutely right about incentives and grant money.

How you tied this to the Nobel Prize is beyond me, so let's drop that.

The incentives are all about grant money and outside (the campus) capital. As a result, the science takes a back seat to market economics, market-ing (both of corporate partners and of academic institutions themselves, which increasingly operate in a competitive marketplace for enrollments), management concerns, investors, etc.

This incentive structure is increasingly becoming the norm well beyond U.S. shores.

So the problem isn't that science is increasingly wrong; it's that scientists are increasingly doing labor that may *involve* science but that is in fact product-oriented R&D, driven by short-term investment timelines and economic, investor-friendly optics, and whether any of it is good *science* is secondary or tertiary to whether it's profitable, directly or indirectly.

Let the scientists go back to doing science first and money-making (whether to support their own tenure lines or to support corporate profits) second, or, even better, third, fourth, or fifth, and you'll find that the ship rights itself.

Comment Mr. Moynihan should have read on the (Score 1) 375

problems of epistemology, including in science.

Note that there is no shortage of facts whose veracity depends on nuanced facets of context and condition, some of which are disputed.

For example, fact or not: "Linux is a difficult operating system to use, and is a better choice for geeks and hackers than for regular users."

Or how about:

"Android is an operating system written by Google."

Or how about:

"The Bermuda Triangle region has seen an unusually high number of ship and plane disappearances over the years, and may be a particularly dangerous place to travel."

Because unless Google's algorithms are very, very nuanced in their approach, each of these is going to be seen as carrying a high level of factuality based on the preponderance of content out there, particularly in high-authority sources.

Of course, if statements like the first and third are too complex for Google's rankings to evaluate and rank, and it can only work with very simple assertions on the order of "Milk is white" or "Obama is a Democrat," then it's going to do practically nothing (good or bad) at all for the rankings, since facts with this level of consensus are generally undisputed, even by those that promote falsehoods.

Comment This shifts the weakness in Google's rankings (Score 3, Interesting) 375

from gameability (in short, SPAM) to politics. Rather than punish above-board or non-predatory websites, it will punish both subversive and innovative thought that runs well ahead of social consensus. Sure, it will also eliminate willful misinformation, but it turns Google into an inherently conservative, rather than socially innovative, force.

Can't say I think it's better. Probably not any worse, but certainly not a panacea.

Comment Sociological problem: CYA (Score 5, Insightful) 158

Part of the problem is the CYA issue.

If you're writing the code, you sound like a laborer ("I have to..."). If it breaks, it's your fault and you're on the hook publicly.

If you present a third-party component in a meeting, you sound like a manager ("I propose that we..."). Once three or four other people in the meeting have concurred, if something breaks, it's the third party's fault. A ticket or two are initiated, it's someone else's problem, and everybody gets to cast blame somewhere beyond the walls of the company.

Rational behavior, regrettably.

Comment You're absolutely right. The desktop is over. (Score 1) 393

I have no idea why people are arguing with you about this. The evidence (not least from the desktop computing industry) is everywhere, with catastrophically declining sales over the long term, offset by increases in mobiles and tablets—which, incidentally, Linux has already won, though in large part by leaving the distro community behind.

Linux could actually conquer the desktop in the end—a few years down the road when desktop computing is a specialized, professionals-only computing space. The users of other desktop operating systems are slowly bleeding off to mobile and tablet.

But this can only happen, ironically, if distros and devs stop trying to conquer the desktop in the present. If they continue down the path they're on, the long-term desktop community, which would be a natural fit for the Linux of yore, will probably be on some other OS. (MacOS? Surely not Windows at this point.)
