
Comment Will it be entertaining? (Score 4, Interesting) 57

I think in the early days, these races might be entertaining.

I can imagine that eventually some kind of optimum strategy may evolve and all the teams use it, and then the cars will all do the same thing and the race will be boring. But in the early days, with people trying different strategies, stuff might happen that is interesting to watch.

I remember back at my first job, we found some kind of game where you wrote a program to control a robot tank in the game, and the whole game was to have matches between people's programs. The programming language was simple and there were APIs for things like "throw out a radar ping", "turn tank", "rotate turret", "fire gun", "check to see if tank is damaged", etc. There were many different strategies available: you could write a tank that never checked if it was being damaged, but just drove around crazily all the time to be hard to lock onto; you could write a tank that, when it got a ping, would try to lock onto that tank and follow it and keep shooting it until it was dead; you could try to write a balanced tank that would check if it was damaged and evade if so, try to figure out where other tanks were and just send shots in that general direction, etc. We had great fun with it for a while, and then one of the developers (not me, sadly) wrote a tank program that was dramatically more effective than all the others. The fun died away when it became "watch Rich's tank destroy your tank and all the others".
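
From memory, the game's API looked something like the sketch below. Every name here (radar_ping, aim_turret, and so on) is invented, since I can't recall the actual game, but the "lock on and chase" strategy sketches out roughly like this:

```python
import math

# A toy reconstruction of the kind of bot API described above. The real
# game is long forgotten, so every name here (radar_ping, aim_turret,
# fire, turn) is invented for illustration.

class ChaserBot:
    """The 'lock on and chase' strategy: remember the last radar contact
    and keep the turret trained on it."""
    def __init__(self):
        self.target = None              # last known (x, y) of an enemy

    def step(self, api):
        contact = api.radar_ping()      # (x, y) of nearest enemy, or None
        if contact is not None:
            self.target = contact
        if self.target is not None:
            tx, ty = self.target
            api.aim_turret(math.atan2(ty - api.y, tx - api.x))
            api.fire()
        else:
            api.turn(0.3)               # no contact yet: wander evasively
```

Each strategy was just a different step() body: the "drive around crazily" bot ignores the radar entirely, the "balanced" bot checks damage first and evades before shooting.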

The question is whether Rich's program was actually optimum in some sense (did the best possible according to the simple simulation rules) or whether we could have beaten it if we had been more clever. I'm not sure. I wish I had copies of the source code to all the bots from back then, now that I have a lot more experience in software development and I might get more out of the game.

This was years ago and I couldn't tell you what game it was exactly, but there are plenty of programming games around.

Comment pokey at the jewelry store (Score 2) 65

pokey at the jewelry store

There. It's my favorite Pokey strip. It's also the only Pokey strip I like. I don't really get the love for Pokey... I don't get the love for Zippy the Pinhead either.

I really do like this one. The increasing absurdity of the situation unfolds with IMHO perfect comic timing.

Comment Re:bitwise math (Score 1) 605

I actually used that trick once.

There is a company near me that develops games for consoles. Their web page had a "challenge" for people to solve, and I wanted my solution to be extra-good since I had no experience working in the game business. The challenge was to implement a C library that would store data in queues; there were API calls to create a new queue, to destroy a queue, to enqueue data, and to dequeue data. The kicker was that your program would be run with only 2048 bytes of storage. I used the XOR trick to save memory, and my program exceeded the minimum specs (needed to be able to create at least X queues with at least Y characters, I don't remember what the actual numbers were for X and Y). I also wrote a Python program that tested my code by enqueuing and dequeuing random data. (Letting that run overnight I found a corner case I had missed, and I fixed that before I submitted my program.)
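
I no longer have the source, but the core trick sketches out like this, with array indices standing in for C pointers (the real thing was C in a 2048-byte arena; XorQueue and its layout here are invented for illustration). Each node stores a single link field holding prev XOR next, halving the per-node link overhead:

```python
# XOR-linked-list queue sketch. Each slot stores [value, link] where
# link = prev_index ^ next_index, so one field serves as both pointers.

NIL = 0                                  # index 0 is reserved as "null"

class XorQueue:
    def __init__(self, capacity):
        self.slots = [[None, NIL] for _ in range(capacity + 1)]
        self.free = list(range(1, capacity + 1))   # free-slot indices
        self.head = self.tail = NIL

    def enqueue(self, value):
        i = self.free.pop()
        self.slots[i] = [value, self.tail ^ NIL]   # neighbors: old tail, NIL
        if self.tail != NIL:
            self.slots[self.tail][1] ^= NIL ^ i    # swap NIL neighbor for i
        else:
            self.head = i
        self.tail = i

    def dequeue(self):
        if self.head == NIL:
            raise IndexError("queue is empty")
        i = self.head
        value, link = self.slots[i]
        nxt = link ^ NIL                           # head's prev is NIL
        if nxt != NIL:
            self.slots[nxt][1] ^= i ^ NIL          # unlink i from successor
        else:
            self.tail = NIL
        self.head = nxt
        self.free.append(i)
        return value
```

You still get traversal from either end with only one link per node, which is exactly what made it attractive under a hard memory limit.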

I submitted my program and the Python test program, and right away got a terse form letter "Thanks but no thanks". My guess is that I never made it past their HR department... I should hope that if one of their developers had looked over my solution, with the XOR linked lists and the Python test harness and all, I would have at least had a phone screen.

Comment Re:The crucial prompt: term? (Score 1) 605

The terminals at my college were set up with 2400 bits per second serial lines. I used to daydream about getting a job in the computer center so I could use the official terminals, which were rumored to be set up at 9600 BPS. Ooh.

But toward the end of my time at college, I saved my pennies and bought a 1200 BPS modem (not a Hayes, some wacky thing, but only $200!), and set up some kind of dumb terminal. IIRC for a while I used an actual dumb terminal that my older brother had used before me when he was at the same college (a "Southwest Technical Products" terminal with an 82-column display!) but in the end I just used a PC running a DOS terminal emulator. Oh man... I was able to do my homework at home! By then I lived off campus so this was huge for me. It was only half the speed of campus terminals, but it still drew characters faster than I could read them so it was almost the same! (Trivia: I can read faster than a 300 baud modem can stream text. I think almost everyone can.)
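
The arithmetic backs that up, assuming standard 8N1 serial framing (start bit + 8 data bits + stop bit = 10 bits per character) and roughly 6 characters per word:

```python
# Back-of-envelope check on the reading-speed claim.
BITS_PER_CHAR = 10       # 8N1 framing: start + 8 data + stop
CHARS_PER_WORD = 6       # ~5 letters plus a space

def words_per_minute(baud):
    chars_per_sec = baud / BITS_PER_CHAR
    return chars_per_sec * 60 / CHARS_PER_WORD

for baud in (300, 1200, 2400, 9600):
    print(f"{baud:5} baud ~ {words_per_minute(baud):6.0f} wpm")
```

300 baud works out to about 300 words per minute, right around a brisk reading pace, so 1200 baud already leaves any reader behind.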

Around that time I switched to using DOS PC-Write for my word processing needs, as I had a high-quality dot matrix printer at home. Formerly I used vi to write documents using nroff and printed on a line printer at one of the campus terminal labs, or printed on the "laser phototypesetter" in the main computer lab if I wanted it to look extra-nice (and I felt like spending extra money).

Comment The crucial prompt: term? (Score 4, Interesting) 605

Where I went to college, there were dumb terminals hooked up to serial lines in various locations around campus. Students would take turns using them. (They're all gone now... everyone has their own computer, and it's all WiFi or Ethernet.)

When you logged in to any campus computer, the very first thing it would do was print a cryptic prompt: term? [vt100]

This was your one opportunity to correctly enter a terse code that described the terminal you happened to be using. Terminals were not cheap, and nobody was going to throw away old ones when new ones were bought, so the campus had a mix of terminal types. It would have been nice if there had been a universal standard way to interrogate a terminal to find out its type (some reserved escape sequence) but there wasn't, so it was up to you to enter it correctly.

So every terminal had a little slip of paper on it saying something like: TERMINAL TYPE: vt100

There was always a default, which you would get if you just hit the Enter key. I cheated in the above examples and put vt100 but I think the default was something else; VT-100 terminals were not actually common (I think I only ever saw one!). I no longer remember what was common, just whatever they happened to buy a lot of.

If you got it right, then the system used termcap to look up the capabilities of your terminal, and it would know how to use the cursor-movement features of your terminal. In short, you could run programs like vi and emacs. If you got it wrong and then tried to run vi or emacs, your screen would quickly turn into horrible hash. What on one terminal would move the cursor around might be meaningless on another terminal, or might have some different effect. (Imagine if the "move cursor to X,Y" command on one terminal was "clear to end of line from position X,Y" on another brand of terminal. That sort of wackiness.)
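
To make the "hash" concrete, here are the cursor-addressing sequences for a VT100 and an ADM-3A, as I remember them from the manuals:

```python
# Why a wrong terminal type hashes the screen: "move the cursor" is a
# completely different byte sequence on each terminal.

ESC = "\x1b"

def vt100_goto(row, col):
    """ANSI/VT100 cursor position: ESC [ row ; col H (1-based)."""
    return f"{ESC}[{row};{col}H"

def adm3a_goto(row, col):
    """ADM-3A cursor address: ESC = then row+32, col+32 (0-based)."""
    return ESC + "=" + chr(row + 32) + chr(col + 32)

# Send the VT100 sequence to an ADM-3A and it prints "[5;10H" as
# literal text in the middle of your screen -- hash.
print(repr(vt100_goto(5, 10)))
print(repr(adm3a_goto(5, 10)))
```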

So the two bits of lore that every computer-using student at my college needed to know: how to correctly enter the terminal type, and how to fix it if you entered it incorrectly. (Best to just stop what you were trying to do and logout!)

But here's the punchline of the above lore:

Computer geeks like me used the terminals all the time. People who had to do statistics work also used them a lot, but some students rarely used them. For some students, the only times they used a terminal was once per quarter, to sign up for classes for the new quarter.

When I started at college, this was easy. You got a paper printed class catalog booklet, you would look up the course numbers of the courses you wanted to take, and from any terminal you would login to a special account. A program would run, reading standard input and writing standard output, and it would prompt you to enter your student ID number and the course numbers. After you entered each number, you would be prompted: Is this correct? yes/no and you would answer. Simple. I don't think it even bothered to prompt for terminal type, and even if it did, it didn't use it for anything.

But then some computer science grad students went ahead and improved the system. They added browsable menus. You could use the arrow keys to browse through, drill down, find your course and pick it. You didn't need a paper catalog of course numbers! But now you actually needed to enter the terminal type correctly. All the students who rarely used the terminals had no clue what term? [vt100] meant, and usually just hit Enter, and then they were hosed.

I'm sure now it's all web forms: no need to print paper booklets, and nobody has any serious problems using it. Not all the old ways were better.

P.S. The campus had a couple of ADM-3A terminals, and I used them from time to time if nothing better was available. They had no dedicated cursor arrow keys, but had arrows printed on H J K L pointing left, down, up, and right respectively; so you would hit Ctrl+H for left arrow and so on. As a vi user, I was pretty okay with that.

But there was one terminal that, annoyingly, had no Escape key! I thought it was the ADM-3A but Wikipedia shows an Esc key on the keyboard layout so it must have been something else. You had to hit Ctrl+[ to get an Esc, very annoying for vi. IIRC I rebound the back-tick character with :map! so I could hit that instead of typing Ctrl+[.

Comment Why DEL is 0x7F (Score 5, Informative) 605

The "control characters" have their own special position in ASCII, as the codes below the space character: 0x00 through 0x1F.

Yet, for some reason, there is one more sort-of control character outside that range: DEL, which is 0x7F. This bit of lore is actually from before my time, but I know why.

People used to punch input tapes on paper-punch machines. What could you do if you mis-punched? There's no good way to fill in holes you didn't mean to punch, but you could go back and punch more holes. ASCII is a 7-bit standard and DEL is all 7 bits set. So, if you hit the wrong key on the punch, you could hit DEL and it would punch out all the remaining holes in that column, turning the character into 0x7F (DEL), and the paper tape reader would simply ignore any DEL characters it saw.
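
The overpunch trick fits in one line of arithmetic: OR any 7-bit character with 0x7F and every bit ends up set.

```python
# Punching every hole in a 7-bit column turns any already-punched
# character into DEL (0x7f), and the tape reader skips DELs.
DEL = 0x7F

def overpunch(ch):
    """Punch the remaining holes in a mis-punched column."""
    return chr(ord(ch) | DEL)            # all 7 bits end up set

def read_tape(tape):
    return "".join(c for c in tape if ord(c) != DEL)

# mis-punch 'X', strike it out, continue with 'Y'
tape = "HELLO " + overpunch("X") + "Y"
print(read_tape(tape))                   # -> "HELLO Y"
```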

Oh, I guess anyone who can use Wikipedia didn't need me to find this out.

P.S. I didn't actually know why the caret notation for DEL is ^?, but Wikipedia explains that as well. Neat!
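
For anyone who doesn't want to look it up: caret notation is just XOR with 0x40. That maps '@' through '_' down onto the control range 0x00-0x1F, and '?' (0x3F) up onto 0x7F, which is why DEL prints as ^?.

```python
# Caret notation in one formula: ^X stands for X's code XOR 0x40.
def caret(code):
    return "^" + chr(code ^ 0x40)

print(caret(0x1B))   # ESC -> "^["  (which is why Ctrl+[ is Escape)
print(caret(0x08))   # BS  -> "^H"
print(caret(0x7F))   # DEL -> "^?"
```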

Comment Re:A success but not a game-changer (Score 1) 406

the "apple VR watch" will become the one product everyone on Earth needs.

If they truly pull that off, and meanwhile the Android Wear watches stagnate, then they will have achieved the first non-sucky VR watch, history will repeat itself, and they will once again start making piles of money and locking a new set of customers into the Apple walled garden.


We'll see.

Comment A success but not a game-changer (Score 1) 406

The history of Apple features multiple products that were hugely successful because they were game-changers.

The first Mac was a breakthrough in GUI, with easy-to-use and consistent apps. Expensive as it was, it was the first mass-market GUI solution and gained first-mover advantage.

The iPhone was the first non-sucky smartphone, and gained first-mover advantage. It just dominated its market segment for a long time, and it took a free OS (Android) to beat it in market share.

The iPad repeated the iPhone success story: first non-sucky product in its market, first-mover advantage, took a free competitor to beat it.

Each of the above made staggering sums of money for Apple because they were game-changers.

Apple likes making lots of money, so Apple is looking for another game-changer. And it's pretty clear that the iWatch is not another game-changer. It's a "nice to have" product, which will sell well to people who are already on board the iOS platform, but it won't significantly attract new customers.

And unlike the game-changers listed above, when it first shipped it was already facing competition. The iPhone was so much better than other smartphones that it basically didn't have competition when it shipped, but the Android watches already available when the iWatch shipped were roughly as good. (Apple is very good at fit and finish, so the iWatch was arguably better aesthetically, but it had no huge edge in features.)

We can argue over whether the iWatch was a "success" or a "failure" but it hasn't been a huge roaring game-changing success like some previous Apple products.

I'm not sure if there are any game-changing products left that Apple even could invent. Everything I can think of, there is already some sort of product on the market, and those existing products don't suck, so I don't see how Apple can once again just show everyone how it's done and grab a whole bunch of market share. And recent products from Apple don't give me confidence that Apple as an organization is still innovating at that level.

Comment Re:Easy answer (Score 1) 489

because Microsoft ripped it off from Apple

It was very common in the DOS days for applications to use control keys where the key was mnemonic somehow. There are many thousands of DOS apps that used Ctrl+S for Save, Ctrl+P for Print, and so on.

As for cut/copy/paste, yeah, pretty sure Apple did it first and then Microsoft copied it. And...? Are you opposed to this somehow?

I'm really glad that cars were invented over a century ago. If they were invented now, Apple cars would have totally different controls from Microsoft cars and so on, rather than having the pedals and such in a relatively standardized configuration.

In case my car metaphor wasn't clear enough, I disapprove of innovating on fundamental UI elements like "how to cut/copy/paste". I'm glad there is a de-facto standard not owned by Apple or anyone else.

Comment Too much magic in modern UI (Score 4, Informative) 489

Our screens are way bigger than they were back in the old days, so we have plenty of room for things like menus and toolbars. Yet the trend in modern UI design is to make things magical and non-discoverable.

Just yesterday I helped my father with a problem: the menus and toolbar from Thunderbird were gone. I was on the phone with him for a while. The task was to find the one magic part of the Thunderbird window where he could right-click and find the context menu with the checkboxes for hiding/displaying the main menu and toolbar. Thank goodness I have him running MATE so every window has a title bar... "Find the blue bar at the top that says 'Inbox - Mozilla Thunderbird'. Now right-click in the dark grey area underneath that, to the right of the tab that says 'Inbox'..." "It didn't work." I'll spare you the back-and-forth; he had multiple tabs and was clicking in a tab to the right of "Inbox". Once I got him over to the correct magic spot, he found the context menu and restored his menu and toolbar. (The stupid hamburger menu is part of the toolbar, and hides with the toolbar... which means it's possible to hide all the menus! And my dad somehow did so by accident!)

The original UI spec for the Macintosh required menus all the time for every app, and the menus had to be in the same place. And I learned very quickly that I could browse the menu, find the command I wanted, and the keyboard shortcut was documented right there in the menu. Hidden menus are far too magical, and if you are going to have them, the very least you should do is to make every context menu have the ability to unhide them, rather than requiring the mouse pointer to be hovering over a particular magical few pixels of your screen.

I also remember the 45 minutes it took to help my dad un-mute YouTube videos. First I had him use the MATE sound preferences dialog to test his speakers, which just took a couple of minutes. Then I had to walk him through moving the mouse pointer over the YouTube video window to make the controls un-hide... (he wasn't full-screen, why do the controls hide when there is plenty of screen real estate available?) Then he had to move the mouse pointer to touch the audio control (and a slider pops out when you get it right) and click to un-mute... and when it's un-muted it says "MUTE". Because when it's un-muted the button becomes the "MUTE" button, and when it's muted the same button becomes the "Un-mute" button. The old-school solution would be a checkbox labelled "MUTE" that's checked when it's muted; the newer way would be a GUI toggle that slides left for un-mute and slides right for mute. There's plenty of screen real estate for either of these.

I know, I know, on mobile devices these magical hiding tricks are not so pointless because screens are smaller. But desktops are not mobile devices and trying to treat them the same is a bad idea.

My dad is not stupid and I don't want to sound like I'm making fun of him. I'm just annoyed over the modern trend in UI design where everything is so magical that it's tricky and weird.

Comment Re:Easy answer (Score 3, Insightful) 489

IBM had "Common User Access" (CUA), and Microsoft had "Consistent User Interface" (CUI) guidelines, which were roughly comparable to Apple's.

IBM's standard could only have come from IBM. Save was F12, Save As was Shift+F12, and Print was (IIRC) Ctrl+Shift+F12. Cut/Copy/Paste? Shift+Del/Ctrl+Insert/Shift+Insert. Arrgh.

When Microsoft was trying to be a corporate partner of IBM, they followed the above standard for a while... and then they rebelled and implemented Ctrl+S for Save, Ctrl+P for Print, and Ctrl+X/Ctrl+C/Ctrl+V for Cut/Copy/Paste. And left the CUA ones working because why not. I haven't checked but I imagine the CUA ones still work today; it's not like the UI designers are falling all over themselves wanting to use Ctrl+Shift+F12 or Shift+Del for anything.

In the world of UIs today, there's way too much frosting and not nearly enough cake.

I agree completely.

Comment SF won't address the root of the problem (Score 4, Insightful) 386

If [San Francisco] allowed more new housing to be built, along with improving public transportation to accommodate greater demand, these problems would diminish.

I believe the problem can be summed up succinctly:

Many people in San Francisco don't want any new buildings; they say the existing buildings are part of the charm of SF and they worry about sprawl. Some of them even have the idea that building new stuff causes housing costs to go up due to "gentrification".

Many people in San Francisco don't want the cost of housing to go up; they decry the trends where only wealthy people (many of them young technical workers at hot companies like Google) can live in SF, and they complain that the city would be more interesting with more starving artists, poets, musicians, etc. (And many hate the private bus systems offered by companies like Google.)

Take both of the above together, and the people of San Francisco are never going to be happy. Not allowing more building capacity means prices will go up, prices going up means that artists and poets can't afford to live in the city. Protesting against the "Google Buses" does nothing to help any problems and just annoys people.

Comment Will the National Film Registry finally get it? (Score 1) 304

Both the original Star Wars and Empire Strikes Back were officially added to the National Film Registry when it was created in the late 1980s. However, Lucasfilm never delivered an original, unmodified copy of either movie. George Lucas tried to give the NFR a copy of the "improved" edition, but they refused it; their mandate is to preserve original versions of historic movies.

The article notes that ironically enough, George Lucas argued against colorizing old black-and-white movies, yet he has refused to follow his own arguments with respect to his own movies.

I hope that Disney will deliver a suitable cleaned-up archival copy of the original, completely unmodified movie to the NFR.

P.S. I personally would be happy to have a version that has some hidden wire removal and other very minor cleanups. Probably the perfect way to do it is to release a new slightly-polished cleaned-up original, with bonus disc content of the original, cleaned up but utterly unchanged. Watching the movie over and over on a 4K screen, you will spot wires and other glitches to some extent... but there should be a version where they are perfectly preserved. It's a movie that was made in the 1970s. It was an advance in the state of the art of special effects, but it wasn't perfect and couldn't have been perfect. Sometimes it's instructive or fun to watch things and study how they were made.

Comment Re:I thought they originals were destroyed... (Score 2) 304

When George Lucas announced the "improved" versions of the classic Star Wars movies, he famously claimed that it would be impossible to recreate the original release versions. He said something like he had accidentally "taped over" the originals (for you younglings, that's a video tape analogy).

As this article commented bitingly, it would have been embarrassing for Lucas if the original version had outsold the "improved" version on home video release. So it was sure convenient for him that it was totally impossible to re-create the original version.

The article quotes someone named Bill Hunt saying this: "Even if it's true that Lucas and his staff destroyed all of the original negatives, it's unlikely in the extreme that they also destroyed all of the interpositives, all of the separation masters, and all of the release prints. In fact, we know that they didn't." And lo and behold, once George Lucas sold the rights, it turned out to be possible to recreate the original version, and now there's a 4K cleaned-up version.

Comment Re:50 million island people to be displaced by 201 (Score 1) 333

I was rather more hoping for a summary than a direct link to the 2007 report.

If I were a global warming scientist, I would already have read through those hundreds of pages. As a non-scientist, with things I need to do, I somewhat rely on news stories, like this one:

One of the central issues is believed to be why the IPCC failed to account for the "pause" in global warming, which they admit they did not predict in their computer models. Since 1997, world average temperatures have not shown any statistically significant increase.
The summary also shows that scientists have now discovered that between 950 and 1250 AD, before the Industrial Revolution, parts of the world were as warm for decades at a time as they are now.

Despite a 2012 draft stating that the world is at its warmest for 1,300 years, the latest document states: "Surface temperature reconstructions show multi-decadal intervals during the Medieval Climate Anomaly (950-1250) that were in some regions as warm as in the late 20th Century."

And then I read through the PDFs at this site:

The tone is rather tendentious (especially the second PDF) but I find the arguments compelling. As I understand it, the CAGW theory is that feedbacks will cause the warming to "run away" precipitously once we reach a crucial tipping point, but the PDFs have graphs showing the Earth once had a significantly higher CO2 concentration than currently without turning into another Venus. The annual news stories about "the previous year was the warmest on record" don't seem to mention error bars, and when I tracked some down I was astonished to see that the margin for the "warmest" claim was a small fraction of the uncertainty interval. And in my original post, now modded down to 0 score, I provided the link to an article with graphs comparing the predicted temperature increases with what was actually recorded.

I have seen proposals for a carbon tax that was intended to take trillions of dollars out of the economy. (The authors of the proposal viewed this as a feature: trillions of tax dollars of additional revenue for the US government! I personally don't think you can get something for nothing, so I worry about the harm that would occur if that level of tax was levied.) I think that this level of tax should require a high level of confidence, and I personally am not at that level yet.

Thank you for responding politely. You haven't convinced me and I likely haven't convinced you, but I hope you at least believe that I'm genuinely skeptical and not just trolling or trying to flame people about this.
