
Comment Re:Even better news for China (Score 3, Insightful) 97

It doesn't matter if those countries get 100 bucks if 99 of them end up going back to the cost of manufacturing.

Part of "cost of manufacturing" is paying workers. There and here. So it does matter. When my $100 goes there instead of here, our economy takes a hit. Tiny, sure, but when it's thousands or tens of thousands or hundreds of thousands of "whatever", then it's no longer a tiny hit.

Comment I was thinking of "high end" in terms of (Score 1) 152

what consumers had access to by walking into a retail computer dealership (there were many independent white box makers at the time) and saying "give me your best."

You're probably right about me underestimating the graphics, though it's hard to remember back that far. I'm thinking 800x600 was much more common. If you could get 1024x768, it was usually interlaced (i.e. "auto-headache"), and, if I remember correctly, it was rare to be able to get it with 24-bit color; S3's first 16-bit-capable chips didn't come out until late 1991, though I could be off on that.

SCSI was possible, but almost unheard of as stock; you either had to buy an add-on card and deal with driver/compatibility questions, or use one of the ESDI-to-SCSI bridge boards or similar. Same thing with Ethernet, Token Ring, or any other dedicated networking hardware and stack. Most systems shipped with a dial-up "faxmodem" at the time, and users were stuck using Winsock on Windows 3.1, which was nontrivial to get working. Most of the time there was no real networking hardware or networking support in the delivered hardware/software platform; faxmodems were largely used for dumb point-to-point connections using dial-up terminal emulator software.

And in the PC space, the higher-end you went, the less you were able to actually use the hardware for anything typical. Unless you were a corporate buyer, you bought your base platform as a whitebox, then added specialized hardware matched with specialized software in a kind of 1:1 correspondence: if you needed to perform task X, you'd buy hardware Y and software Z, and they'd essentially be useful only for task X, or maybe for tasks X1, X2, and X3, but certainly not much else. The same is even true for memory itself. Don't forget this is pre-Windows 95, when most everyone was using Win16 on DOS. We can discuss OS/2, etc., but that again starts to get into the realm of purpose-specific and exotic computing in the PC space.

There were, as I understand, a few verrry exotic 486 multiprocessor machines produced, but I've never even heard of a manufacturer and make/model for these, only the rumor that it was possible, so I doubt they ever made it into sales channels of any kind. My suspicion (correct me if I'm wrong) is that they were engineered for particular clients and particular roles by just one or two organizations, and delivered in very small quantities; I'm not aware of any PC software in the 1992 timeframe that was even multiprocessor-aware, or any standard to which it could have been coded. The Pentium processor wasn't introduced until '93, and the Pentium Pro with GTL+ and SMP capabilities didn't arrive until 1995. Even in 1995, most everything was either Win16 or 8- or 16-bit code backward compatible with the PC/XT or earlier, and would remain that way until around the Win98 era.

The UNIX platforms were standardized around SCSI, ethernet, big memory access, high-resolution graphics, and multiprocessing and presented an integrated environment in which a regular developer with a readily available compiler could take advantage of it all without particularly unusual or exotic (for that space) tactics.

Comment Wow, end of an era. (Score 4, Interesting) 152

For more than just a couple of us here, I suspect, there was a time when "Sparc," "UNIX," "graphics," "Internet," and "science" were all nearly synonymous terms.

Simpler times. Boy did that hardware last and last and last in comparison to the hardware of today.

Well, I suppose it can finally no longer be said that the Sparcstation 10 I keep here just for old times' sake can still run "current Linux distributions." But it's still fun to pull it out for people, show them hundreds of megabytes of RAM, 1152x900 24-bit graphics, gigabytes of storage, multiple ethernet channels, and multiple processors, running Firefox happily, and tell them it dates to 1992, when high-end PCs were shipping with mayyybe 16-32 MB of RAM, a single 486 processor, 640x480x16 graphics, a few dozen megabytes of storage, and no networking.

It helps people to get a handle on how it was possible to develop the internet and do so much of the science that came out of that period. It also explains why, even though I don't know every latest hot language, the late '80s/early '90s computer science program that I went to (entirely UNIX-based, all homework done using the CLI, vi, and gcc, with an emphasis on theory, classic data structures, and variously networked/parallelized environments, and labs of Sparc and 88k hardware all on a massive campus network) seems to have prepared me for today's real-world needs better than the programs others went through, with lots of Dell boxes running Windows-based Java IDEs.

Comment Re:i haven't bought a car in a while... (Score 1) 252

I think the point is price.

Cabs are expensive, but most of the expense is paying the driver. Once you get rid of the worker, it gets a lot cheaper.
Also, with centralized control, routes can be optimized so the taxis are always driving and carrying passengers.
It's not just slightly cheaper than driving; it can potentially cost an order of magnitude less, and be faster.
Where I live, public transport (combining bus and cab rides) is easily 5-6 times less expensive than driving, even including the labor cost of the drivers.
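
As a rough illustration of what "centralized control" could mean at its simplest, here is a hypothetical sketch of greedy dispatch: always send the nearest idle car to each new request. The coordinates and function names are made up for illustration; a real fleet optimizer would be far more sophisticated, but even this keeps cars moving instead of sitting at a rank.

import math

# Hypothetical greedy dispatcher: pair each pickup with the closest idle taxi.
def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def assign(requests, idle_taxis):
    """Return (taxi, pickup) pairs, assigning the nearest idle taxi to each request."""
    assignments = []
    taxis = list(idle_taxis)
    for pickup in requests:
        if not taxis:
            break  # no idle cars left; remaining requests wait
        nearest = min(taxis, key=lambda t: dist(t, pickup))
        taxis.remove(nearest)
        assignments.append((nearest, pickup))
    return assignments

# Example: two requests, three idle cars at made-up coordinates.
print(assign([(1, 1), (9, 9)], [(0, 0), (10, 10), (5, 5)]))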

As for keeping the cars clean, and accountability: we are already used to being identified everywhere. The cars can have cameras, and can even require an ID for you to ride them (they won't be taking cash, after all). There would be abuse, but it would be close to trivial to punish that kind of thing.

If there's vomit in the car, the car should be able to detect that, and go to a cleaning station. In the event that you do get an unsuitable car, you can just reject it. You could even look at a stream of the inside of the car before it gets to your home.

Also, think about the carpooling possibilities. While people don't like sharing space with strangers, price can change some minds.

Comment As someone that works with data sets all the time, (Score 4, Informative) 144

here are my answers. Spreadsheets are used in several cases:

1) When you have a small-to-medium-sized dataset (100m data points) and want to do a particular set of calculations or draw a particular set of conclusions from it just once or twice, so that the time invested in writing code in R or something similar would exceed the time needed to just bung a few formulas into a spreadsheet and get your results. Once you get into analyses or processes that will be repeated many times, it makes more sense to write code.

2) Similar case, when you need to work with essentially tabular database data, but the operations you're performing (basic filtering, extracting records based on one or two criteria, just handing data from one person to the next) are either so simple or will be repeated so rarely that a MySQL database is overkill and just emailing a file back and forth is easier.

3) When you are working with data as part of a team, and certain members of the team who are specialists in some areas related to the data, or (for example) members who are doing your data collection, aren't particularly computationally expert. Spreadsheets are hard for laymen, but they're learnable: a dozen or two hours of training and people can get a general, flexible grasp of spreadsheets and formulae. It takes a lot longer for someone to become basically proficient with R, MATLAB, MySQL, Python, etc., and you really want those specialists to just be able to do what they do to or with the data, rather than spending their time and effort on learning computational tools. Spreadsheets are general purpose and have a relatively shallow learning curve compared to lots of other technologies, yet they enable fairly sophisticated computation to take place, if inefficiently at times. They're the lowest common denominator of data science.

We use spreadsheets all the time in what we do, mostly as a transient form. The "heavy hitting" and "production" work takes place largely in MySQL and R, but there are constant temporary/intermediate moments in which data is dumped out as a CSV, touches a bunch of hands that are really not MySQL- or R-capable, and then is returned in updated form to where it normally lives. A rough sketch of that round trip is below.
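
Here's a minimal sketch of that dump-edit-reimport cycle using pandas and SQLAlchemy. The connection string, table names, and file names are hypothetical placeholders, not our actual setup; the point is just how short the glue code is.

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection; swap in your own credentials and database.
engine = create_engine("mysql+pymysql://user:password@dbhost/research")

# Dump the working table to a CSV that non-programmers can open in a spreadsheet.
df = pd.read_sql("SELECT * FROM samples", engine)
df.to_csv("samples_for_review.csv", index=False)

# ... the file makes the rounds by email, annotations and corrections get added ...

# Pull the hand-edited file back in and load it into a staging table in MySQL.
reviewed = pd.read_csv("samples_reviewed.csv")
reviewed.to_sql("samples_staging", engine, if_exists="replace", index=False)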

Comment Re:Misleading and Hyperbolic Title/Comparison (Score 1) 130

How do you get shell access on your average Mac without physical access? SSH isn't enabled by default, as has been pointed out. In fact, it's been a real PITA to get the versions of OS X I've configured to play nicely on the network for command-line use. I doubt one user in a thousand has done it -- Slashdot Mac users not being particularly representative of average Mac users, of course. My Macs have SSH available, but the port isn't open to the Intertubes outside of my LAN, so it doesn't concern me very much.
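
For anyone curious whether their own box is actually exposed, here's a quick, hypothetical check (the host names are placeholders): see whether anything even answers on TCP port 22 when probed from outside your LAN.

import socket

def ssh_reachable(host, port=22, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(ssh_reachable("192.168.1.20"))                      # from inside the LAN
print(ssh_reachable("my-public-hostname.example.org"))    # run from outside the LAN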

So this essentially resolves to a "you have to be there" exploit.

Comment Random data point (Score 1) 174

Bees are all over the place at my home (basically at the center of a small town in rural Montana). We have quite a few planters full of flowers on our largish deck (about 1,000 sq ft), and it is not uncommon to go out there and see a very large number of bees going about their business. They are nearly zero threat. Well, unless you sit on one. :) We try not to do that.

There are no obvious hives anywhere nearby, and they seem to come and go from all points of the compass.

Sortof-kinda related, there are local honey merchants, and the honey is just lovely.
