Comment From the KDE 1.0b3 Announcement: (Score 2) 141

"Supported platforms: KDE was primarily developed under the GNU/Linux variant of the Unix operating system. However it is known to compile without, or with very few, problems on most Unix variants. At the moment we explicitely support GNU/Linux (Intel , Alpha, Sparc) and Solaris (Sparc) and we have success reports for..."

There are a lot of people here that clearly weren't deeply involved either in serious (non-home) computing or Linux during the era in which Linux was introduced and had its biggest impact.

Lots of "but GNU..." or "but all of these things aren't Linux-only..."

This commentary on KDE is an example. Sure, it eventually supported more platforms. But that doesn't change the fact that as an early OSS project, it was possible—that is to say, the developers that became involved were able to become involved in the first place—only because Linux had become available and broadly accessible.

Without Linux—if people wanting to do GUI development had been limited to DOS/Win3 or Mac OS on the low end, or SunOS/AIX/HPUX/etc. on the high end, in other words—KDE would simply not have happened. First, all of those systems already shipped with a vendor-supported, "good enough" DE (while Linux users were stuck with TWM/FVWM or commercial CDE ports), and second, GUI development on those platforms was either prohibitively complex and specialized or prohibitively expensive.

Saying that the developers at the start used whatever free OS they could find does not change the fact that the free OS that they did, in fact, find was Linux, and that's how many of them came into the flow. *BSD had been around for a very long time prior to Linux, yet the Unix world had remained the rarefied and very expensive Unix world, with very little of note going on at the middleware level—it was all vendors building systems and departments (academic or enterprise) implementing specific application flows. It was highly vertical and highly proprietary.

Linux enters the scene and in half a decade we have multiple entirely new integrated DEs for Unices, rapidly expanding driver support for almost all commodity hardware, and businesses and schools in every direction running Unix instead of DOS/MacOS. The barriers to entry in computing, information systems, and research design and development of all kinds went from extremely high to almost none, almost overnight—in one cohort of college students, essentially.

Linux opened Unix and networking up and turned them into the global ecosystems that they are today. Saying that this would have been *technically* possible without Linux is not at all good support for the claim that it would have been *likely* at the social (i.e. in actual society) level. Linux changed the game entirely and brought TCP/IP, OSS, and what was once called "high performance computing" (now it's just basic "computing" to raytrace a widget, compress a data stream, or manage a multi-gigabyte database) to the public. Before Linux, all of these were exotic and expensive and economies of scale not only didn't apply but in fact couldn't. Now, decades after Linux, they clearly seem very pedestrian to many and economies of scale mean that you can carry them around in your pocket.

It's an eye-opener to read this Slashdot discussion and see so many that don't actually understand or know this.

It makes me think that the time may be ripe for a historical work or historical wiki on Linux/OSS history and its relationship to the broader Internet and information society of the present.

Comment Exactly. (Score 4, Interesting) 141

You can tell whether or not someone was actually there by whether or not they mention things like "Minix" in a list of viable operating systems.

I was part of a project at the time that needed real networking and a real Unix development environment. We spent four months working to find an alternative, then shelled out for a series of early Sparc pizza boxes. SS2 boxes maybe? As I recall, we got four at nearly $15k each that ate up a huge chunk of our budget.

Two years later, we had liquidated them and were doing all of the same stuff on Linux with cheap 486 boxes and commodity hardware, and using the GNU userland and toolchain. People here talk about GNU as predating Linux while forgetting that prior to Linux, the only place to run it was on your freaking Sparcstation (or equivalent—but certainly not under Minix), which already came with a vendor-supported userland. GNU starts to be interesting exactly when Linux becomes viable.

All in all, the change was bizarrely cool and amazing. We were like kids in a candy store—computing was suddenly so cheap as to almost be free, rather than the single most expensive non-labor cost in a project.

Comment No. Linux has more relevance, (Score 2) 141

just far less visibility.

The Internet runs on Linux. The number of routers, firewalls/filters, and networking devices and network-connected appliances of all kinds that are Linux-based is staggering. Android is Linux. Every major commercial operating system has either copied ideas from Linux or borrowed its code outright. The supercomputing world is totally pwned by Linux in every way. The practical work of virtually all of science these days relies on Linux.

Linux is freaking HUGE for our world.

On the desktop, however, Linux has been neglected, because designing consumer UX is a very different skill set from the one most of the OSS developer world brings to bear. It's too bad—when KDE 1.0 was released, it was obvious to anyone looking that Linux was the future of desktop computing—and yet in many ways the Linux desktop is worse than it has ever been from a consumer usability standpoint.

But don't mistake "not visible on desktops at home or at work" for "not relevant."

Comment Git? When Linux hit the scene, (Score 5, Informative) 141

there _was_ no free operating system for industry-standard hardware, much less a Unix-like one, and the commercial offerings were all platform-specific.

If you wanted a real computer that could do real stuff (as opposed to a DOS box, which wasn't even network aware in any substantive way, and even in non-substantive ways required $$$ for bare-bones, single-function software tools that were cobbled together out of batch files and nonsense), you had to:

- Get your hands on dedicated Unix workstation hardware, which was often poorly documented/supported outside of a corporate sales account

- This meant either $tens of thousands for current workstation hardware or $thousands for last-cycle hardware if it was even available at all (university and government surplus lots were the primary suspects)

- Phone up the one or two providers that offered OSes for the system

- Shell out $many thousands for a license (and often $thousands more for media)

- In many cases, because non-current hardware was tied to non-current OSes no longer for sale, port the current tree yourself to the non-current hardware, on top of the $thousands you'd already spent for a license

In short, it was substantively impossible for—say—a small company, a startup, or a CS/CE student to get their hands on anything beyond a DOS box with Windows 3 on it. With money and time, they MIGHT get web BROWSING working on Windows 3—in unstable ways. Developing software was a nightmare on these DOS/Win3 boxes as well—compilers were expensive, proprietary, and often required runtimes that had to be licensed on a per-user basis (i.e. you spent $200 on the compiler that spoke a non-standard dialect, then if you wanted to sell what you created, you spent another $some amount per copy sold) and that had no hooks for anything network-ish, because there were no standards in the DOS ecosystem for that.

Linux changed everything. Suddenly, you could pick up commodity i386 hardware and actually do network stuff with it in Unix-y ways. Even in the early days when Linux was unstable, incomplete, and a bear to install/configure, it made things possible for small shops or independent developers/creators that had simply been prohibitive in every practical way just a year earlier.

As a result, the Unix networking ways—thanks in many ways directly to Linux—would eventually become the industry standard form of networking (TCP/IP over ethernet) that we take for granted today—but in no way was history certain to end up this way. We could just as well have been tossing the equivalent of glorified FidoNet payloads around today.

Without Linux, GNU, and BSD, it's no stretch to say that we may not have had an Internet today in any way that we'd recognize, and certainly Linux has been the most visible and most widely distributed amongst the three.

Much more than the work by Berners-Lee, Linus Torvalds invented the future that we live in.

Comment Mix. (Score 1) 264

Two Seagate 2TB drives, after which we switched loyalties, then two WD Green 2TB drives.

The Seagates both had spindle/motor problems of some kind—they didn't come back up one day after a shutdown for a hardware upgrade. The WD Greens both developed data integrity issues while spinning and ultimately suffered SMART-reported failures and lost data (we had backups). One was still partially readable, the other couldn't be mounted at all.

Is there some kind of curse surrounding 2TB drives?

Comment Not in my experience. (Score 5, Interesting) 264

Anecdotal and small sample size caveats aside, I've had 4 (of 15) mechanical drives fail in my small business over the last two years and 0 (of 8) SSDs over the same time period fail on me.

The oldest mechanical drive that failed was around 2 years old. The oldest SSD currently in service is over 4 years old.

More to the point, the SSDs are all in laptops, getting jostled, bumped around, used at odd angles, and subject to routine temperature fluctuations. The mechanical drives were all case-mounted, stationary, and with adequate cooling.

This isn't enough to base an industry report on, but certainly my experience doesn't bear out the common idea that SSDs are catastrophically unreliable in comparison to mechanical drives.

Comment Seriously, this. (Score 0, Flamebait) 297

The "freedom brigade" these days has gone around the bend. What, I can't fly unmanned payloads into a building? I can't drop heavy solid objects from the air over pedestrians? BIG BROTHER! BIG BROTHER!

What's next? "You mean I can't bludgeon you to death with my garden shovel? This is all Obama's fault, the damned communist!"

Comment Using DD-WRT (Kong latest "old" driver version) (Score 1) 104

on a Netgear R6300 and it has been very fast, great with signal quality, and the QoS features are working as expected.

Both the R6250 and R6300 have a dual-core 800MHz CPU, so they have the power to handle a decent QoS requirement without bogging down potential throughput too much. I'm satisfied, and it wasn't that expensive. If your situation isn't too terribly complex (many dozens of users and extensive QoS rules) then it might be a good choice.

The R7000 is even faster and supports external antennas, so I second that suggestion, but it's also twice the price of the R6250/R6300, which can be found on sale for $100-$125 brand new if you're a good comparison shopper and/or patient.

Comment I think you're missing the point (your "not into (Score 0) 245

FPS" comment at the end is evidence of this).

In the PC gaming world, getting it to run at the highest settings *is* the game. It's like the "bouncing ball" graphics demos on 8-bit systems in the 1980s. The actual software isn't useful or meant to occupy the user's attention for long. The challenge is in *getting it to run* and the joy is in *seeing what my super-cool computer is capable of* in processing and graphics rendering terms.

Running on last year's card/settings? Sorry, you don't get the game.

This is why I stopped being a PC gamer in the late '90s. All I wanted was a better Tetris. What I got was a better bouncing ball demo.

Comment It's early days yet. (Score 1) 180

There were a whole bunch of smartphones before the iPhone. Anyone remember them? I stumbled across my old Palm Centro the other day, which replaced a Treo 680. These devices were useful to some (I was one of them), but the cost/benefit calculation was finicky, and they didn't find widespread adoption.

Pop consensus was that smartphones were a niche market. Then, someone got one right (iPhone) and the whole industry took off. These days, people don't even realize they're using a "smartphone" (I can remember the early press using the term "supersmartphone") because it's just "my phone."

The same trajectory outlines the computing era in general—from 8-bit boxes that were fiddly and full of cables and user manuals and coding to the Windows era during the '90s—at first, it was a geek thing, and lots of people got in and then got out, deciding it wasn't useful. Then, suddenly, a few UX tweaks and it was ubiquitous and transparent and a market we couldn't imagine the world being without.

I suspect the same will happen with wearable tech.
