Comment I give it a "Sigh..." (Score 1) 429

I saw it about a week ago. Overall, my biggest impression was one of missed potential.

(Note, here I'm talking primarily about the story and the world-building, not about the cinematography.)

The overall structure was a weakness from the start. Sam Flynn turns out to be yet another Prince Harry character: the heir to the throne who goofs around and avoids his inherited position until he's handed a confrontation that forces him to prove himself, at which point he rises to the occasion as a True Prince. We've seen this before; it's the usual aristocratic nonsense: worth is not achieved, but inherited and then revealed.
      Contrast the original: Kevin Flynn was an honest working hacker who was forced to go rogue when he was screwed over by a yuppie coworker. Kevin's triumph was to prove himself as a creator. He set out with the aim of showing that he and not Ed Dillinger was the author of Space Paranoids; and in the end, he accomplished that goal, but in a way that -- through his creative "User power" -- changed the Programs' world for the better.
      Sam isn't a creator. He sets out with no particular goals of his own; he is handed all his goals by his inheritance. Kevin Flynn was a creative adult seeking justice; Sam Flynn is an irresponsible rich boy growing up. And that's a story that's been played out far too many times.

One of Legacy's few big world-building ideas is the emergence of the Isos: Programs evolved from the System itself, rather than being created in the image of a User. This could have been huge. But instead it is presented merely to give Sam's love interest a tragic backstory. The war is over; the Isos lost; here's the last surviving princess of a dead race. Give her a hug.
      The political vision of the System in Tron is more complex. There are old powers in the System that defy the MCP's regime at personal risk to themselves: Dumont at the I/O Tower. The MCP's assimilation of the whole System into itself is not complete; it can be resisted. In Legacy, CLU's genocide of the Isos is over and done with ... and nobody even bothers to say, "Sam, you dickhead, if you'd logged in yesterday, you could have stopped the fucking Holocaust."

Another new world-building idea is the possibility that a Program could use the laser terminal to escape into the real world: that the laser wasn't limited to objects that originated in the real world (oranges or Kevin Flynns), but could also play back a Program into human form. Thus Quorra's escape; thus CLU's threat to invade our world with armies of Programs.
      Well, Tron's MCP didn't need armies to take over the world. The MCP could just hack the Pentagon. In Tron, the deep entanglement of the real world and the System is made clear: the MCP can threaten Dillinger not with armies materializing in ENCOM's laser bay, but with the legal and political forces native to our world.
      Ironically enough, the 1982 vision has more in common with today's Internet-enabled reality than the 2010 version. As far as we know, the System in Legacy isn't even on the Net: it's a dusty minicomputer sitting in the basement of Flynn's Arcade with barely enough connectivity to reach Alan Bradley's pager.
      Ultimately, CLU is much less of a real-world threat than the MCP. The MCP had taken over the System that ENCOM used to do its business, and was extending tentacles into banks, major governments, and who knows what else. CLU's domain is that one minicomputer; the big threat would be shut off if Alan or Sam had just unplugged the laser terminal.

Both of the above problems point at a bigger problem with Legacy: it ultimately doesn't take Programs and the System seriously as an independent sort of intelligent existence; it treats them as a mere imitation of our world.
      Quorra longs to see the sun; CLU wants to get out into our world to "perfect" it; the Programs have nightclubs and sports arenas imitating human ones. The way it's presented in Legacy, the best thing that could happen to a Program is to get out of the confining, artificial System into the authentic, sun-blessed, material world.
      That notion is alien to the original. Tron, Yori, and Dumont may revere the Users but they don't want to become Users. They want to free their own world and live in it pursuing their own purposes -- not escape into the human world. They aren't imitation humans who want to grow up to be Real Boys like Pinocchio -- they're Programs, and they know what their purpose is in life: it's to fulfill the goals their Users set up for them.
      (Extra bonus for real sf nerds: Tron's Programs may have something in common with C. J. Cherryh's azi: confidence of purpose. As Grant would put it, self-doubt is for born-men. Azi do not wish they were born-men; azi take refuge in the certainty that born-men lack.)

And speaking of lost story potential, how about Rinzler? Anyone who's seen the original knows that Rinzler is a hacked-up copy of Tron from his very first appearance, thanks to the "T" insignia on his chest. Kevin Flynn mentions it once in passing, and at the end it's clear that Rinzler is "rebooting" back into Tron. But Rinzler hasn't had enough character development for us to care: he's a literally faceless killing machine. And as killing machines go, he's got less character than Darth Maul, and that's saying something.

All in all, Legacy came across to me as too circumscribed a world, and Sam Flynn as too much of a True Prince cardboard character. Movie-wise, I wanted to see more of the Isos and a lot less of Dr. Frank-N-Furter.

Comment Your digital camera knows your location? (Score 3, Informative) 263

Your digital camera may embed metadata into photographs with the camera's serial number or your location.

Record your location? Sure, if it's a smartphone with GPS. For standalone cameras, GPS is not exactly a common feature. There are about two models of pocket digital camera on the market that have GPS, and not very many SLRs with it either ... go look. Those that have it make no secret of it; it's actually a big marketing point for people who want to record where they've been taking pictures.

As for smartphone models, I don't know about the Apple or Windows offerings, but Android's camera app exposes it as an option right on the main screen, next to the flash and focus settings ... and I'm pretty sure it defaults to off. People turn this on because they actively want it.

Rather than scaring people about what their devices might be recording, it would be a lot more useful to tell people how to find out what tags are on their photos. For instance, the Linux command-line program "exiftags" will report this kind of information (picked from a random image file I had lying around on my laptop):

Camera-Specific Properties:

Equipment Make: OLYMPUS OPTICAL CO.,LTD
Camera Model: C2500L
Camera Software: Adobe Photoshop CS Macintosh
Maximum Lens Aperture: f/2.6

Image-Specific Properties:

Image Orientation: Top, Left-Hand
Horizontal Resolution: 173 dpi
Vertical Resolution: 173 dpi
Image Created: 2004:02:27 18:52:21
Exposure Time: 1/5 sec
F-Number: f/6.9
Exposure Program: Manual
ISO Speed Rating: 100
Exposure Bias: 0 EV
Metering Mode: Center Weighted Average
Flash: No Flash
Focal Length: 20.70 mm
Color Space Information: Uncalibrated
Image Width: 736
Image Height: 767
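For the curious: tools like exiftags find all of this in the Exif (APP1) segment that cameras write near the front of the JPEG file. Here's a rough sketch of where that segment lives -- a toy scanner for the JPEG marker layout, not a real Exif parser, and the sample bytes at the bottom are synthetic, made up purely for illustration:

```python
import struct

def has_exif(jpeg_bytes):
    """Scan a JPEG's segment headers for an APP1 (Exif) segment.

    Walks the marker/length structure at the front of the file and
    returns True if an APP1 segment whose payload starts with the
    Exif signature is found.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":          # no SOI marker: not a JPEG
        return False
    pos = 2
    while pos + 4 <= len(jpeg_bytes):
        if jpeg_bytes[pos] != 0xFF:            # lost sync with the marker stream
            return False
        marker = jpeg_bytes[pos + 1]
        if marker == 0xDA:                     # SOS: entropy-coded image data follows
            return False
        (length,) = struct.unpack(">H", jpeg_bytes[pos + 2:pos + 4])
        if marker == 0xE1 and jpeg_bytes[pos + 4:pos + 10] == b"Exif\x00\x00":
            return True
        pos += 2 + length                      # length field covers its own two bytes
    return False

# A tiny synthetic "JPEG": SOI, then one APP1 segment carrying the Exif signature.
fake = b"\xff\xd8" + b"\xff\xe1" + struct.pack(">H", 8) + b"Exif\x00\x00"
print(has_exif(fake))       # True
print(has_exif(b"\x89PNG")) # False -- not a JPEG at all
```

The real metadata (GPS coordinates, serial numbers, and so on) lives in the TIFF directory inside that payload; exiftags does the rest of the decoding for you.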

Comment Re:What World Does He Live On? (Score 1) 1153

The problem isn't that math isn't important. The problem is that the math being taught isn't important.

Yes. Exactly.

Fuck calculus. You don't need it unless you're going into one of a few specific fields. But there are whole swaths of math that most folks completely miss, that are directly applicable to everyday life:

Probability and statistics. No, not for understanding the census, nor for gambling -- rather, for understanding what's meant by words like "evidence". Bayesian probability can be taught to anyone who can understand percentages and division, and it can be straightforwardly applied to reasoning about the everyday world.

Proof and logic. The notion of logical proof has been around since Aristotle, but symbolic logic is much newer. Nonetheless, the notion of logical validity of an argument, of conclusions following from premises, is directly applicable to all sorts of real-world decision-making. Logic is also an obvious point to dovetail math into the humanities, via the analysis of written arguments.

Abstract algebra. Not the proofs, nor the deep abstractions, but rather the notions of properties such as commutativity, associativity, etc. and the idea that these can be applied to any sorts of operations, not just "mathematical" ones. Does it matter if you mix the eggs in before the butter? Do you need to do X separately to A, B, and C, or can you put A+B+C together and then do X all at once? The notion that some situations or problems have the same structure as others is itself pretty powerful. (And lends itself to comparison with the literary idea of analogy.)
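The Bayesian point in particular is easy to make concrete. A sketch with made-up numbers -- the classic rare-condition test, nothing more than percentages and division:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' theorem: update a prior belief after seeing evidence."""
    p_evidence = (prior * p_evidence_given_h
                  + (1 - prior) * p_evidence_given_not_h)
    return prior * p_evidence_given_h / p_evidence

# Hypothetical numbers: a condition with a 1% base rate, tested with a
# 99%-sensitive test that also yields 5% false positives.
p = posterior(0.01, 0.99, 0.05)
print(round(p, 3))  # 0.167 -- a positive result is still about 5-to-1 against
```

That's the "evidence" lesson in one line of arithmetic: a positive result from an accurate test on a rare condition still leaves the condition unlikely, and anyone who can divide can follow why.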

XBox (Games)

Anatomy of an Achievement 157

Whether they annoy you or fulfill your nerdy collection habit, achievements have spread across the gaming landscape and are here to stay. The Xbox Engineering blog recently posted a glimpse into the creation of the Xbox 360 achievement system, discussing how achievements work at a software level, and even showing a brief snippet of code. They also mention some of the decisions they struggled with while creating them: "We are proud of the consistency you find across all games. You have one friends list, every game supports voice chat, etc. But we also like to give game designers room to come up with new and interesting ways to entertain. That trade-off was at the heart of the original decision we made to not give any indication that a new achievement had been awarded. Some people argued that gamers wouldn't want toast popping up in the heat of battle and that game designers would want to use their own visual style to present achievements. Others argued for consistency and for reducing the work required of game developers. In the end we added the notification popup and its happy beep, which turned out to be the right decision, but for a long time it was anything but obvious."

Comment I used to work at a college ... (Score 4, Funny) 285

... a small one. Here's what our policy to prevent piracy would have been:

Please don't pirate stuff too much. If we get notices saying that you're pirating stuff and asking you to quit, we'll call you in to the office and give them to you. If we get court orders telling us to give them your name, we'll probably have to do that, since we can't afford lawyers much.

If you really have to pirate stuff, please at least try to leech it off of your friends on the LAN rather than flooding our dinky little Internet uplink. Because if you do that, we'll probably end up blocking your IP address for a while so that email and our Debian updates can get in.

And while you're at it, here's the address of the porn server that some freshman set up. Get your porn over there, please don't mirror all of abbywinters.com over our connection.

Comment Re:I don't know what the complaint is about? (Score 3, Insightful) 773

Check out the huge regex at the bottom of the RFC 5322-compliant validator from CPAN.

Honestly, this sort of thing is an example of what happens when regex is the only parsing tool someone knows. Regex becomes unwieldy when you put too much of it in one place -- but that's because regex is unwieldy, not because the problem of parsing email addresses is fundamentally hard. Parsing email addresses is a case for a modular parser such as Parsec (or any of its ports and imitators) ... which will give you the added advantage of useful error messages on invalid input, instead of just a match failure.

Moreover, isn't it kind of silly to point at an example of someone already having written the code to do something as a way of saying that doing it is difficult? In code, once it's already been done once, correctly, it doesn't need to be done again. If you think CPAN's huge regex (or any other implementation) is correct, and you've tested it to your satisfaction, you don't need to reimplement it; just use it.
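To illustrate the modular-parser point (a hand-rolled recursive-descent sketch over a deliberately tiny grammar -- nowhere near full RFC 5322, no quoted strings or comments -- just enough to show the shape and the error messages):

```python
class ParseError(Exception):
    pass

def parse_email(s):
    """Parse a simplified email grammar: dot-separated atoms, an '@',
    then dot-separated atoms again. Returns (local_part, domain)."""
    pos = 0

    def atom():
        # One run of atom characters (letters, digits, a few specials).
        nonlocal pos
        start = pos
        while pos < len(s) and (s[pos].isalnum() or s[pos] in "!#$%&'*+-/=?^_`{|}~"):
            pos += 1
        if pos == start:
            raise ParseError(f"expected an atom at position {start}")
        return s[start:pos]

    def dotted():
        # One or more atoms joined by dots.
        nonlocal pos
        pieces = [atom()]
        while pos < len(s) and s[pos] == ".":
            pos += 1
            pieces.append(atom())
        return ".".join(pieces)

    local = dotted()
    if pos >= len(s) or s[pos] != "@":
        raise ParseError(f"expected '@' at position {pos}")
    pos += 1
    domain = dotted()
    if pos != len(s):
        raise ParseError(f"unexpected character at position {pos}")
    return local, domain

print(parse_email("kevin.flynn@encom.example"))
```

Each grammar rule is its own small function, each can be tested on its own, and a bad input gets you "expected '@' at position 7" instead of a bare match failure. That's the modularity the one-shot regex gives up.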

Comment The Gandhicam Project (Score 1) 1123

For folks who want to record the cops (or anyone else) and be sure that the footage will get to the world instead of being destroyed when they steal your camera phone: check out the Gandhicam project. This is an app for your Android phone that lets you take pictures or video and automatically send them to the net, either by HTTP upload or by email.

This doesn't stop them from filing criminal charges afterward, but that's why you donate to the ACLU and the EFF.

Comment Re:Seven years for eight hours work (Score 2, Interesting) 380

Really, the only way that SCO was going to recover was with a court victory, and while the probability of that wasn't 0, it was as damn near to it as possible for practical applications.

There are people who believe things out of spite. Remember when the SCO case got started? There were plenty of folks -- chiefly in the "open-source haters" end of the trade press, but I met a few in industry, too -- who dearly wanted to see the "upstart" Linux smacked down hard.

It may be hard to believe it now that Linux is everywhere in the industry -- from the datacenter to the cell phone, from the Oracle database server to the displays on the backs of airplane seats -- but just a few years ago, plenty of people would have called you an "open-source zealot" if you said that it was worth using anywhere at all in business. And lots of traditional business people really wanted to see Linux dry up and blow away. Plenty of those people would have put hope, and a few bucks, behind the SCO suit.

Comment The only way to learn is to use it. (Score 2, Insightful) 237

Having studied eight foreign languages (French, Spanish, German, Latin, Ancient Greek, Russian, Japanese, and Finnish) in my life, and after talking this theory over with friends who have attained fluency in some really different languages (e.g. Spanish and Bahasa Melayu), I feel safe in stating this here in pretty strong terms:

The only way to learn a language is to use it.

The only sort of "classroom" language class that works worth a damn is an immersion class, in which during the class period you do not speak any language other than the one you're studying. Even classroom instructions ("Open your book to page 23") are in the language, once you've learned numerals.

The worst language classes I've taken have been ones in which the foreign language being studied is treated as a matter of abstract grammar and vocabulary to be memorized, not used ... and in which the teacher spends most of their time telling anecdotes in English about their experiences in the culture in question. I took two years of Russian in high school and a year of it in college -- and forgot more Russian than I learned in that last year, since the teacher spent the class time telling stories (in English!) about run-ins with the KGB, instead of helping us practice speaking and reading Russian.

As regards Chinese: I've never studied Chinese, but I have studied Japanese including kanji, albeit only to the extent of a couple hundred kanji. The above applies fully to kanji, and I expect it applies to hanzi (Chinese characters) as well -- in order to learn them, you have to use them. Write them. Come up with silly sentences and write those. Don't just use flash cards and memorization; come up with things that you want to say in Chinese (even if just to be silly) and say those things with hanzi.

The other half of the equation, of course, is to get someone who is fluent to respond to your crude childish attempts at speaking and writing. That's the point of a good language class: you get to make the sort of errors that a little kid makes, and they correct you. That method of language acquisition works for little kids, and it works for adults too if they're willing to be childish for a while.

Comment Re:Sanitization is a worrying term to use. (Score 1) 534

Unfortunately there are a few cases you can't do that. No way to use a prepared statement for an "IN" clause, for instance.

You could always use the technique the DBMS gives you for encapsulating complex query logic and passing parameters into it ...

... stored procedures!

Cue screams of terror; cue fist-waving and incoherent irate noises. These will be heard from people who never cared to actually learn how their DBMS works, or who think of the DBMS as some foreign thing that their application throws data at once in a while.

(Your application does not use a DBMS. Rather, a DBMS is one of your application's components. If your app contains a DBMS but you don't know what it's doing, you are in the same camp as "programmers" who do not know how their language's object model works, or who are a little rusty on whether a loop with a false loop condition executes zero times or once. That is: you are not a programmer; you are a burger-flipper who sometimes farts out bad code.)

Stored procedures are not the chapter of your DBMS manual that was put in there to pad it out so it qualified for the better binding at the publisher. Nor were they put in there just to entertain your DBA. Stored procedures let you simplify the interface between your DBMS and the rest of your application, and this can radically improve your overall security.
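(As an aside on the parent's IN-clause complaint: short of stored procedures, most client libraries let you keep the values out of the SQL text by generating one placeholder per value. A sketch using Python's stdlib sqlite3 -- the users table and the fetch_users_in helper are invented for illustration:

```python
import sqlite3

def fetch_users_in(conn, ids):
    """Run a parameterized IN query by generating one placeholder per value.

    Only the placeholder string is built dynamically; the values themselves
    still travel as bound parameters, never spliced into the SQL text.
    """
    placeholders = ",".join("?" for _ in ids)
    sql = f"SELECT name FROM users WHERE id IN ({placeholders}) ORDER BY id"
    return [row[0] for row in conn.execute(sql, list(ids))]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "flynn"), (2, "bradley"), (3, "dillinger")])
print(fetch_users_in(conn, [1, 3]))  # ['flynn', 'dillinger']
```

Stored procedures go one better by moving that query logic behind a named interface inside the DBMS, but the syntax varies too much by vendor to sketch portably here.)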

Comment Re:Privacy (Score 3, Informative) 101

After Google's CEO's comments that privacy is only wanted by wrongdoers

Except, of course, that he never said that. He was asked in an interview whether users should consider Google as a "trusted friend" -- and he said no. He said that if you're doing something that you don't want anyone to know about, doing it on Google is a bad idea ... since Google is just as subject to U.S. law, including the USA PATRIOT Act, as any other company is.

He didn't say that only wrongdoers want privacy and that everyone should trust Google. He said that if you want perfect privacy, you can't get it from Google, because the law doesn't allow it. That's pretty much the opposite!

Comment The hard part of programming ... (Score 1) 799

People spend a lot of effort blathering about which programming language is best to use for teaching. But the hard part of programming is not the programming language. It's the logical thinking skills; the abstract concepts like function and algorithm and data structure and type; the reasoned approach to breaking a problem down and seeing algorithms and patterns; the ability to learn new tools such as a utility or an API and put them together usefully.

These things transcend language. Yes, you will probably use different algorithms or data structures in Python on a Linux box than in C on a microcontroller, but you will use largely the same sort of thinking skills. You will approach writing code differently in Lisp than in Java, but in both you will be combining known parts in a new structure to accomplish a task.

And it is these abstract skills -- especially the skill of abstracting, of recognizing and using patterns -- which separate those who learn to program well from those who do not. (And this is different again from being a successful professional programmer, which entails a quite different set of skills.)

Comment Re:Concurrency? (Score 1) 173

A monad is not a specific data structure (like an array) or a language feature (like a loop or conditional). It is not something that is put together by the compiler (like a compiled function) and it is not magic, either.

"Being a monad" is a mathematical property, like "being commutative" or "being symmetric". It has to do with values following a particular pattern, having operations defined on them that interact in a particular way. (These are called "monad laws", and they're the same sort of thing as the "identities" you get in high school algebra.)

Any sort of thing that follows the monad laws is a monad. In Haskell, we say that any type that follows the monad laws can be made a member of typeclass Monad. Casually, we say that "lists are a monad" and "IO actions are a monad".

What does it mean to be a monad? Roughly, it means that you can chain operations together. You can map a function over a list, or filter the list's elements to get a new list, or more generally "fold" a list by recursively applying an operation to collect a result (much as the factorial function repeatedly applies multiplication). Also you can join a list of lists together into a single big list.

It turns out that monads can represent the idea of "sequence" quite well. Lists are, after all, a sort of sequence. But so are IO actions -- all those "non-functional" things a program has to do, like read a file from disk, or write its contents to the terminal. IO actions form a sequence in time: if you want to write the file's contents to the terminal, you have to read them from the disk first.

The way that this works, under the hood, is to express the "later" actions as having the "earlier" actions' results as function arguments. Since a function's arguments have to be evaluated before the function can be applied, the disk gets read before the terminal-writing happens.

So monads are not a loophole that lets you cheat and do non-functional things inside a functional language. Rather, they are a model that lets you describe IO actions as just another sort of functional value. The only bit of "magic" necessary is that the main function of a program has the IO type.
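The laws themselves are easy to check in any language, not just Haskell. A sketch treating Python lists as a monad, where unit wraps a value and bind is just concatMap (the function names are mine, chosen to match the usual presentation of the laws):

```python
def unit(x):
    """Wrap a plain value in the list monad."""
    return [x]

def bind(xs, f):
    """Chain a list through a function that returns lists (concatMap)."""
    return [y for x in xs for y in f(x)]

# Two "monadic" functions: each takes a plain value, returns a list.
halves = lambda n: [n, n // 2]
words  = lambda n: [str(n)] * n

# Left identity: binding unit(a) to f is just f(a).
assert bind(unit(6), halves) == halves(6)

# Right identity: binding to unit changes nothing.
assert bind([1, 2, 3], unit) == [1, 2, 3]

# Associativity: how you group chained binds doesn't matter.
lhs = bind(bind([4, 8], halves), words)
rhs = bind([4, 8], lambda n: bind(halves(n), words))
assert lhs == rhs

print("lists obey the monad laws")
```

Swap in a different unit and bind that satisfy the same three identities and you have a different monad; the IO case works the same way, except that what gets chained is "actions to perform" rather than list elements.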
