Comment I was thinking of "high end" in terms of (Score 1) 152

what consumers had access to by walking into a retail computer dealership (there were many independent white box makers at the time) and saying "give me your best."

You're probably right that I'm underestimating the graphics, though it's hard to remember back that far. I'm thinking 800x600 was much more common. If you could get 1024x768, it was usually interlaced (i.e. "auto-headache"), and it was rare, as I recall, to get it with 24-bit color—S3's first 16-bit-capable chips didn't come out until late 1991, if I remember correctly, though I could be off.

SCSI was possible, but almost unheard of as stock; you either had to buy an add-on card and deal with driver/compatibility questions or use one of the ESDI-SCSI bridge boards or similar. Same thing with ethernet, token ring, or any other dedicated networking hardware and stack. Most systems shipped with a dial-up "faxmodem" at the time, and users were stuck using Winsock on Windows 3.1. It was nontrivial to get it working. Most of the time, there was no real "network" or "networking" support in the delivered hardware/software platform; faxmodems were largely used for dumb point-to-point connections using dial-up terminal emulator software.

And in the PC space, the higher-end you went, the less you were able to actually use the hardware for anything typical. Unless you were a corporate buyer, you bought your base platform as a whitebox, then added specialized hardware matched with specialized software in a kind of 1:1 correspondence—if you needed to perform task X, you'd buy hardware Y and software Z, and they'd essentially be useful only for task X, or maybe for tasks X1, X2, and X3, but certainly not much else. The same is even true for memory itself. Don't forget this is pre-Windows 95, when most everyone was using Win16 on DOS. We can discuss OS/2, etc., but that again starts to get into the realm of purpose-specific and exotic computing in the PC space.

There were, as I understand, a few verrry exotic 486 multiprocessors produced, but I've never even heard of a manufacturer and make/model for these—only the rumor that it was possible—so I doubt they ever made it into sales channels of any kind. My suspicion (correct me if I'm wrong) is that they were engineered for particular clients and particular roles by just one or two organizations, and delivered in very small quantities; I'm not aware of any PC software in the 1992 timeframe that was even multiprocessor-aware, or any standard to which it could have been coded.

The Pentium processor wasn't introduced until '93, and the Pentium Pro with GTL+ and SMP capabilities didn't arrive until 1995. Even in 1995, most everything was either Win16 or 8- or 16-bit code backward compatible to the PC/XT or earlier, and would remain that way until around the Win98 era.

The UNIX platforms were standardized around SCSI, ethernet, big memory access, high-resolution graphics, and multiprocessing and presented an integrated environment in which a regular developer with a readily available compiler could take advantage of it all without particularly unusual or exotic (for that space) tactics.

Comment So funny, but yeah, totally true. (Score 1) 152

The 386 box that I installed Linux on my first time around was 4MB (4x1MB 30-pin SIMMs). 4MB! I mean, holy god, that's tiny. It seemed sooooo big compared to the 640KB of 8-bit PCs, and yet it's basically the same order of magnitude. Not even enough to load a single JPG snapshot from a camera phone these days.

Comment Wow, end of an era. (Score 4, Interesting) 152

For more than just a couple of us here, I suspect, there was a time when "Sparc," "UNIX," "graphics," "Internet," and "science" were all nearly synonymous terms.

Simpler times. Boy did that hardware last and last and last in comparison to the hardware of today.

Well, I suppose it can finally no longer be said that the Sparcstation 10 I keep here just for old times' sake can still run "current Linux distributions." But it's still fun to pull it out for people, show them hundreds of megabytes of RAM, 1152x900 24-bit graphics, gigabytes of storage, multiple ethernet channels, and multiple processors, running Firefox happily, and tell them it dates to 1992, when high-end PCs were shipping with mayyybe 16-32MB RAM, a single 486 processor, 640x480x16 graphics, a few dozen megabytes of storage, and no networking.

It helps people to get a handle on how it was possible to develop the internet and do so much of the science that came out of that period—and why, even though I don't know every latest hot language, the late '80s/early '90s computer science program that I went to (entirely UNIX-based, all homework done using the CLI, vi, and gcc, emphasis on theory, classic data structures, and variously networked/parallelized environments, with labs of Sparc and 88k hardware all on a massive campus network) seems to have prepared me for today's real-world needs better than the programs others went to, with lots of Dell boxes running Windows-based Java IDEs.

Comment As someone that works with data sets all the time, (Score 4, Informative) 143

here are my answers. Spreadsheets are used in several cases:

1) When you have a small-to-medium-sized dataset (100m data points) and want to do a particular set of calculations or draw a particular set of conclusions from it just once or twice—so that the time needed just to bung a few formulas into a spreadsheet and get your results is less than the time it would take to write code in R or something similar. Once you get into analyses or processes that will be repeated many times, it makes more sense to write code.

2) Similar case, when you need to work with essentially tabular database data, but the operations you're performing (basic filtering, extracting records based on one or two criteria, just handing data from one person to the next) are either so simple or will be repeated so rarely that a MySQL database is overkill and just emailing a file back and forth is easier.

3) When you are working with data as a part of a team, and certain members of the team who are specialists in some areas related to the data, or (for example) members of the team who are doing your data collection, aren't particularly computationally expert. Spreadsheets are hard for laymen, but it's doable—a dozen or two hours of training and people can get a general, flexible grasp of spreadsheets and formulae. It takes a lot longer for someone to become basically proficient with R, MATLAB, MySQL, Python, etc., and you really want those specialists to just be able to do what they do to or with the data, rather than focusing their time and effort on learning computational tools. Spreadsheets are general purpose and have a relatively shallow learning curve relative to lots of other technologies, but they enable fairly sophisticated computation to take place—if inefficiently at times. They're like a lowest common denominator of data science.

We use spreadsheets all the time in what we do, mostly as a transient form. The "heavy hitting" and "production" data work takes place largely in MySQL and R, but there are constant temporary/intermediate moments in which data is dumped out as a CSV, touches a bunch of hands that are really not MySQL- or R-capable, and then is returned in updated form to where it normally lives.
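Just to make that round-trip concrete, here is a minimal shell sketch, assuming a hypothetical "measurements" table in a hypothetical "labdb" database (all names invented for illustration):

    # Dump a table as tab-separated text for collaborators who live in
    # spreadsheets (--batch makes mysql emit plain TSV with a header row):
    mysql --batch -e 'SELECT * FROM measurements' labdb > measurements.tsv

    # ...the file gets emailed around, edited in a spreadsheet, exported again...

    # Load the edited copy into a staging table for review, skipping the header:
    mysql --local-infile=1 -e "LOAD DATA LOCAL INFILE 'measurements_edited.tsv' INTO TABLE measurements_staging IGNORE 1 LINES" labdb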

Comment How can openness lead to closedness? (Score 3, Insightful) 250

Because the number one thing openness generates is chaos and multiple competing claims about reality. Say, many Linux distributions, each claiming to be great, and in fact, many variants of Linux distributions, often with many versions and many wrinkles, and many variations of packages, libraries, and so on.

If you want to build or customize things, openness is great. If you just want to pick something up, use it, and move on, a huge amount of confusion, overhead, and pain is involved in trying to pick the "right" version (particularly if you're unfamiliar with openness and wrongheadedly looking for the "real" version, as many early Linux dabblers were) and get it to work quickly and easily.

There is thus a huge amount of value added by anyone that quells the chaos—even in a tiny sphere or product—and that can quickly, clearly, and succinctly explain to users just what their version does, without ambiguity either within itself as an instance or over time. The nature of the beast—this value is the result of "closing the openness," if you will—means that it can't be opened back up, or the value will be lost.

End users want operating systems and devices that are not open systems with unclear edges that bleed into the ecosystem, but rather a single, coherent object or product that they can acquire, use in predictable and stable ways, and then lay down once again. They want systems and devices about which books can be written (and bought, and referred to months down the road) without quickly becoming obsolete, and with minimal risk that this book or that add-on they purchase will fail to work because they'd misconstrued the incredibly subtle differences and variations in product naming, versioning, and so on.

In short, massive openness is incredibly generative and creative, but it leaves in place a systems/software/hardware version of the "last mile problem" for computing. Having a fabulous network is one thing, but consumers just want one wire coming into the house; they want it to work, they want it to be predictable and compatible with what they have, and they want to know just where it is and what its properties, limits, and costs are. They are not interested in becoming engineers; the technology they use is only useful to them as a single, tiny, and manageable facet of the larger ecosystem that is their life.

This "last mile problem" cannot be solved with openness in hardware or software any more than the last mile problem for wired providers can be solved by opening up all of urban geography to any comers and saying "lay all the cable you want, anywhere you want, to and from any building or system!" First off, it would result in a mess of wires (not un-analagous to what we see across much of free software's development space) and next because most consumers wouldn't be able to make heads or tails of it, much less make a choice, and they'd probably resent the complexity in their backyard and try to do away with it.

Openness leads to closedness because, to the extent that openness dominates the development and engineering space, the critical need to carry whatever is developed to the average consumer grows in precisely the same measure—and it is closedness that meets that need.

Comment You have to build it before they'll come. (Score 2) 654

When I lived in NYC, I would have paid 3x what I was paying to use public transit. I could get to anywhere in the city, or at least within ~2-3 blocks of anywhere, quickly, easily, and efficiently—a car would have slowed me down considerably.

On the other hand, living in southern California, I wouldn't have switched to public transit if I'd been offered a $300-a-month bonus to do so. It would have meant hours every day walking around on foot in a very pedestrian-unfriendly city.

Two things are required:

1) Very good coverage of the geographical areas of interest, with frequent runs, to minimize time loss walking and waiting
2) Pedestrian-friendly environs when walking and waiting

If you need to spell times and connections out on your route map—i.e. if you need more than just a diagram of where the stations are—then you probably won't see public transit use increase no matter what you do, because your public transit system just won't get the job done.

A good, workable public transit system that doesn't negatively impact lifestyles and livelihood can tell its users everything that needs to be told with a simple, pocketable list of stations (visual or textual) and a poster on the wall of each station listing what's connected there, without any reference to time.

Comment Seriously, batch renaming using a GUI? (Score 1) 267

UGH.

Just invest a couple of hours learning about bash and do it on the command line; it will be time VERY well invested, as it will save you a LOT of trouble down the road, and you will be very surprised at the sheer number of things you can automate by typing commands and just how powerfully and flexibly you can automate them.
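As a taste of what that buys you (a sketch only; the filenames and prefix here are made up), a preview-first batch rename is a few lines of bash:

    # Rename every .jpg in the current directory to add a "trip_" prefix.
    # The echo prints each mv command instead of running it; drop the echo
    # once the printed output looks right.
    for f in *.jpg; do
        echo mv -- "$f" "trip_$f"
    done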

Comment Just for fun, I'll point out (Score 1) 217

that this level of data collection enables Google Now to serve me very well with updates that I appreciate (traffic and delays along routes that I regularly travel at about this time, events near places I'm likely to be, and so on).

I understand that some prefer that their data not be shared with Google or others. But let's not swing too far the other direction and assume that nobody finds cloud services to be valuable. I, for one, like them very much and am happy to provide Google with as much data about me as possible, in exchange for which it makes my life much, much easier and makes my labor much, much more valuable to me.

It would be one thing if there was no choice about these features, but in fact there is—you are free to disable them and not use them.

Comment Sorry, but you miss the key word. "Dependent." (Score 1) 273

This is a "contractor" that has one "contract" with one company, and possesses none of the infrastructure or expertise generally associated with operating an actual "business," meaning that they are, in fact, *dependent* on this *particular* contract and its stability for most or all of their income and livelihood, without the ability to easily find or draw others, particularly if they are to avoid putting this primary and essential (for them) contract at risk.

In fact, there is a word for this relationship already: employment.

What is being suggested here could just as easily be rephrased as "it's time for a new kind of relationship, 'employee without benefits that is not subject to federal labor laws.'"

Comment Also continuous lipstick application. (Score 2) 288

I've worked for two companies where "agile" methodology applied company-wide meant point releases every one or two weeks and minor UI changes with every point release to "get better with each version." This floated mgmt's boat and kept the UX/UI people busy and excited, but it was a nightmare for customer support and (evidently, by extension) for customers who could never quite feel as though they'd "learned" to use the software.

Every time they logged in, they struggled to figure out how to repeat the workflows they'd finally gotten ahold of the previous time. Of course, the widgets, labels, views, etc. tended to change between logins. Kind of like a maze with moving walls.

I argued for UI changes to be batched for major versions, but this supposedly wasn't "agile."

Comment This is what I do now, too. (Score 1) 184

I state up front that I work on my own terms. I have talent to offer and can solve problems that others often can't, but I place a premium on flexibility and on my own health and family. I am incredibly productive, more than many other employees, but I do not offer *maximum productivity*, i.e. "as much as I am humanly able to produce." Even if it seems that I have more to offer (i.e. I leave at 6:30 when everyone else is still working and Skyping me at 11:30 pm, I travel only a couple of times per year and decline to travel 20 times per year, etc.), I am not willing to give this "more" to the organization—it is for my family and my own personal growth.

And both of the phrases I used are things I've been told—"We have doubts about how serious you are; we're interested in someone that's more serious about their career" and "We don't doubt that you're highly skilled and productive, your resume and recommendations are stellar, but we're in a competitive industry and we need highly competitive people, and we're not sure you've got that competitive fire in your belly—that you're really going to be one hundred percent invested in the company and its growth."

I have two friends that have been on the serial startup carousel as founders. Both burned out and moved in other directions because they felt it was impossible to actually have a life, be a human being, and get growth and operating capital support from investors. Each startup became their entire lives each time until positive exit, and at some point each said, "I'm not doing this again, I'm losing my own sense of identity and my family."

And if you take that kind of statement out into the public sphere, I'd bet that what others would say is, "Well, they weren't really made to be entrepreneurs, then; they were destined to burn out because it's not the lifestyle for them."

Which is precisely my point—and it sounds like you've seen it, too—there's a prevailing "wisdom" that "real" career builders or "real" entrepreneurs are a particular "type"—the type that gives every . last . drop . of . blood to the company. The rest? They're just not "cut out for it"—they should "do something else."

Of course, if you're not "cut out" for the job market or for entrepreneurship, it's not quite clear what "else" you ought to be doing to earn a living. There are only so many jobs at nonprofits and in government agencies.

It would be better if society were to take a step back and assume the opposite—that everyone is basically loyal, driven, and productive, but that, in general, a healthy person cannot exist without healthy hours, life balance, and relationships, and that if someone is the "type" to be working from 4:00 am until midnight every day of the week, and double that on holidays to pick up the slack, they are probably in need of counseling or personal development rather than a raise and a promotion. But I suppose that's not how the market works.
