Yes. Switching computers on my KVM switch (scroll lock + console number)
I'd have to question that (not that she has to do it or the reasons she was told, but the supposed reality that elders can't read normal text as well as caps). One of the pieces of research that was done here in the UK in the 1950s resulted in motorway road signs being in mixed case rather than all caps - it caused howls of anguish from old-timers resistant to change - but the thing is, words in lower case have more of a shape. For instance, "Manchester" can be resolved as the word "Manchester" much faster than "MANCHESTER": it was found you could read the mixed-case word before you could even resolve all the letters, because you could recognise the shape of the word, given that lower-case text has more features like ascenders and descenders. Hence all UK road signs ever since have been in mixed case.
Drones are subject to the same rules that RC aircraft are subject to.
It is, however, extremely hard to enforce. RC users are generally pretty responsible - they've probably spent many hours building their aircraft, and during that time the dangers these machines can pose have sunk in; usually they've also joined a local club to help them learn to fly their new, expensive aircraft, and the club will coach them on operating it safely.
Drone users not so much. Many of the ready-to-fly drones require pretty much zero skill to operate, so people can take off and cause mischief pretty much straight away.
Or put simply: extraordinary claims require extraordinary evidence. We have an extraordinary claim here - highly extraordinary - but the evidence falls very very far short of being even just ordinary, let alone extraordinary.
I didn't know that, though based on what I do know about the jpeg format, it makes a kind of sense that this would be possible. Thanks for posting this, great nugget of information!
what consumers had access to by walking into a retail computer dealership (there were many independent white box makers at the time) and saying "give me your best."
You're probably right about me underestimating the graphics, though it's hard to remember back that far. I'm thinking 800x600 was much more common. If you could get 1024x768, it was usually interlaced (i.e. "auto-headache"), and it was rare, if I remember correctly, to be able to get it with 24-bit color - S3's first 16-bit-capable chips didn't come out until late 1991, though I could be off.
SCSI was possible, but almost unheard of as stock; you either had to buy an add-on card and deal with driver/compatibility questions, or use one of the ESDI-to-SCSI bridge boards or similar. Same thing with Ethernet, Token Ring, or any other dedicated networking hardware and stack. Most systems shipped with a dial-up "faxmodem" at the time, and users were stuck using Winsock on Windows 3.1 - it was nontrivial to get working. Most of the time, there was no real "networking" support in the delivered hardware/software platform; faxmodems were largely used for dumb point-to-point connections using dial-up terminal emulator software.
And in the PC space, the higher-end you went, the less you were able to actually use the hardware for anything typical. Unless you were a corporate buyer, you bought your base platform as a whitebox, then added specialized hardware matched with specialized software in a kind of 1:1 correspondence - if you needed to perform task X, you'd buy hardware Y and software Z, and they'd essentially be useful only for task X, or maybe for tasks X1, X2, and X3, but certainly not much else - and the same is even true for memory itself. Don't forget this is pre-Windows 95, when most everyone was using Win16 on DOS. We can discuss OS/2, etc., but that again starts to get into the realm of purpose-specific and exotic computing in the PC space. There were, as I understand, a few verrry exotic 486 multiprocessors produced, but I've never even heard of a manufacturer and make/model for these - only the rumor that it was possible - so I doubt they ever made it into sales channels of any kind. My suspicion (correct me if I'm wrong) is that they were engineered for particular clients and particular roles by just one or two organizations, and delivered in very small quantities; I'm not aware of any PC software in the 1992 timeframe that was even multiprocessor-aware, or any standard to which it could have been coded. The Pentium processor wasn't introduced until '93, and the Pentium Pro with GTL+ and SMP capabilities didn't arrive until 1995. Even in 1995, most everything was either Win16 or 8- or 16-bit code backward-compatible to the PC/XT or earlier, and would remain that way until around the Win98 era.
The UNIX platforms were standardized around SCSI, ethernet, big memory access, high-resolution graphics, and multiprocessing and presented an integrated environment in which a regular developer with a readily available compiler could take advantage of it all without particularly unusual or exotic (for that space) tactics.
The 386 box that I installed Linux on my first time around was 4MB (4x1MB 30-pin SIMMs). 4MB! I mean, holy god, that's tiny. It seemed sooooo big compared to the 640KB of DOS-era PCs, and yet it's basically the same order of magnitude. Not even enough to load a single JPEG snapshot from a camera phone these days.
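The comparison is easy to check with quick arithmetic - a sketch, where the camera-phone JPEG size is an assumed typical value, not a figure from the discussion above:

```python
# RAM of the 386 Linux box: 4 x 1 MB 30-pin SIMMs
ram_386 = 4 * 1024  # KB
ram_dos = 640       # KB, conventional-memory limit of DOS-era PCs

# Ratio is only ~6.4x: within a single order of magnitude
ratio = ram_386 / ram_dos
print(f"4 MB is only {ratio:.1f}x the old 640 KB limit")

# An assumed ~5 MB camera-phone JPEG wouldn't even fit in that RAM
jpeg_kb = 5 * 1024
print(f"JPEG fits in 4 MB of RAM? {jpeg_kb <= ram_386}")
# → 4 MB is only 6.4x the old 640 KB limit
# → JPEG fits in 4 MB of RAM? False
```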
used that abbreviation that it just doesn't roll off the fingers any longer.
For more than just a couple of us here, I suspect, there was a time when "Sparc," "UNIX," "graphics," "Internet," and "science" were all nearly synonymous terms.
Simpler times. Boy did that hardware last and last and last in comparison to the hardware of today.
Well, I suppose it can finally no longer be said that the Sparcstation 10 I keep here just for old times' sake can still run "current Linux distributions." But it's still fun to pull it out for people, show them hundreds of megabytes of RAM, 1152x900 24-bit graphics, gigabytes of storage, multiple ethernet channels, and multiple processors, running Firefox happily, and tell them it dates to 1992, when high-end PCs were shipping with mayyybe 16-32MB RAM, a single 486 processor, 640x480x16 graphics, a few dozen megabytes of storage, and no networking.
It helps people to get a handle on how it was possible to develop the internet and do so much of the science that came out of that period. It also helps explain why, even though I don't know every latest hot language, the late-'80s/early-'90s computer science program I went to (entirely UNIX-based, all homework done using the CLI, vi, and gcc, with an emphasis on theory, classic data structures, and variously networked/parallelized environments, and labs of Sparc and 88k hardware all on a massive campus network) seems to have prepared me for today's real-world needs better than the programs others went to, with lots of Dell boxes running Windows-based Java IDEs.
100m data points *or less*
here are my answers. Spreadsheets are used in several cases:
1) When you have a small-to-medium-sized dataset (100m data points or fewer) and want to do a particular set of calculations or draw a particular set of conclusions from it just once or twice - so that the time invested in writing code in R or something similar would exceed the time needed to just bung a few formulas into a spreadsheet and get your results. Once you get into analyses or processes that will be repeated many times, it makes more sense to write code.
2) A similar case: when you need to work with essentially tabular database data, but the operations you're performing (basic filtering, extracting records based on one or two criteria, just handing data from one person to the next) are either so simple or so rarely repeated that a MySQL database is overkill and just emailing a file back and forth is easier.
3) When you are working with data as part of a team, and certain members of the team who are specialists in some areas related to the data, or (for example) members who are doing your data collection, aren't particularly computationally expert. Spreadsheets are hard for laymen, but they're learnable - a dozen or two hours of training and people can get a general, flexible grasp of spreadsheets and formulae. It takes a lot longer for someone to become even basically proficient with R, MATLAB, MySQL, Python, etc., and you really want those specialists to just be able to do what they do to or with the data, rather than spending their time and effort learning computational tools. Spreadsheets are general-purpose and have a relatively shallow learning curve compared to lots of other technologies, yet they enable fairly sophisticated computation - if inefficiently at times. They're like a lowest common denominator of data science.
We use spreadsheets all the time in what we do, mostly as a transient form. The "heavy hitting" and "production" data work takes place largely in MySQL and R, but there are constant temporary/intermediate moments in which data is dumped out as a CSV, touches a bunch of hands that are really not MySQL- or R-capable, and then is returned in updated form to where it normally lives.
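The "simple filtering" step in that CSV round-trip is easy to sketch. A minimal example using Python's standard csv module - the column names and filter criterion are made up for illustration, not taken from any real workflow described above:

```python
import csv
import io

# Stand-in for a database dump: a small CSV with a status column
# (hypothetical columns, for illustration only).
dump = io.StringIO(
    "id,status,value\n"
    "1,pending,10\n"
    "2,done,20\n"
    "3,pending,30\n"
)

# Filter on a single criterion, producing a file you could hand to
# a teammate who isn't MySQL- or R-capable.
out = io.StringIO()
reader = csv.DictReader(dump)
writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
writer.writeheader()
kept = [row for row in reader if row["status"] == "pending"]
writer.writerows(kept)

print(f"kept {len(kept)} of 3 rows")  # → kept 2 of 3 rows
```

In practice the `io.StringIO` objects would be real files opened with `open(..., newline="")`; everything else is the same.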
Oh, wait, you didn't need to pass a test for that.
I'm just trying to think how that would have been possible. I think back then there was a medical exception you could plead for. I didn't. I passed the 20 WPM test fair and square and got K6BP as a vanity call, long before there was any way to get that call without passing a 20 WPM test.
Unfortunately, ARRL did fight to keep those code speeds in place, and to keep code requirements, for the last several decades that I know of, and probably continuously since 1936. Of course there was all of the regulation around incentive licensing, where code speeds were given a primary role. Just a few years ago, they sent Rod Stafford to the final IARU meeting on the code issue with one mission: preventing an international vote for removal of S25.5. They lost.
I am not blaming this on ARRL staff and officers. Many of them have privately told me of their support, including some directors and their First VP, now SK. It's the membership that has been the problem.
I am having a lot of trouble believing the government agency and NGO thing, as well. I talked with some corporate emergency managers as part of my opposition to the encryption proceeding (we won that too, by the way, and I dragged an unwilling ARRL, who had said they would not comment, into the fight). Big hospitals, etc.
What I got from the corporate folks was that their management was resistant to using Radio Amateurs regardless of what the law was. Not that they were chomping at the bit waiting to be able to carry HIPAA-protected emergency information via encrypted Amateur radio. Indeed, if you read the encryption proceeding, public agencies and corporations hardly commented at all. That point was made very clearly in FCC's statement - the agencies that were theorized by Amateurs to want encryption didn't show any interest in the proceeding.
So, I am having trouble believing that the federal agency and NGO thing is real because of that.
The Technician Element 3 test wasn't more difficult than the Novice Element 1 and 2 together, so Technician became the lowest license class when applicants stopped having to take Element 1.
The change to 13 WPM was in 1936, and was specifically to reduce the number of Amateur applicants. It was 10 WPM before that. ARRL asked for 12.5 WPM in their filing, FCC rounded the number because they felt it would be difficult to set 12.5 on the Instructograph and other equipment available for code practice at the time.
It was meant to keep otherwise-worthy hams out of the hobby. And then we let that requirement keep going for 60 years.
The Indianapolis cop episode was back in 2009. It wasn't the first time we'd had intruders, and it won't be the last - and if you have to reach back that far for an example, the situation can't be that bad. It had nothing to do with code rules or NGOs getting their operators licensed.
A satphone is less expensive than a trained HF operator. Iridium costs $30 per month and $0.89 per minute to call another Iridium phone. That's the over-the-counter rate; government agencies get a better rate than that. And the phone costs $1100 - again, retail, not the government rate - less than an HF rig with antenna and tower will cost any public agency to install.
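Those figures turn into a rough cost comparison easily - a sketch using the retail prices quoted above, where the annual airtime is an assumed usage level, not a quoted figure:

```python
# Retail Iridium figures quoted above
phone_cost = 1100.00   # one-time handset cost, USD
monthly_fee = 30.00    # USD per month
per_minute = 0.89      # USD per minute, Iridium-to-Iridium

# Assumption: 10 hours of emergency traffic per year (hypothetical)
minutes_per_year = 10 * 60

first_year = phone_cost + 12 * monthly_fee + minutes_per_year * per_minute
print(f"First-year satphone cost: ${first_year:,.2f}")
# → First-year satphone cost: $1,994.00
```

Even under that generous usage assumption, the first-year total stays well under what an HF installation with antenna and tower would run a public agency, and the government rates would be lower still.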
You think it's a big deal to lobby against paid operators because there will be objections? How difficult do you think it was to reform the code regulations? Don't you think there were lots of opposing comments?
And you don't care about young people getting into Amateur Radio. That's non-survival thinking.
Fortunately, when the real hams go to get something done, folks like you aren't hard to fight, because you don't really do much other than whine and send in the occasional FCC comment. Do you know I even spoke in Iceland when I was lobbying against the code rules? Their IARU vote had the same power as that of the U.S., and half of the hams in the country came to see me. That's how you make real change.