Comment Re:Adios *shoots his finger pistol* (Score 1) 480

Uhh, I used to read books when I was a very young child and had no problem understanding them. Magic Tree House, Encyclopedia Brown, Goosebumps, Harry Potter, etc. By the time I was a teenager, I moved on to more complex and challenging stories, like watching Neon Genesis Evangelion and Monster.

It's more likely that complex and challenging stories can be told in any production style, especially in totally different cultures that may not have the same biases we have towards that production style.

Comment Re:Missing (Score 1) 480

Picard wasn't shit at school. The entire TNG cast was remarkably high-cultured, with special reverence for Shakespearean literature, Sherlock Holmes and P.I. narratives on the holodeck, and classical music. The science/engineering talent on the Enterprise was pretty much limited to Geordi, Data, and Wesley (plus the mostly unnamed underlings), but everyone was well educated and spent a lot of time refining their tastes.

Voyager was supposed to be a science vessel, so it makes more sense that most of the crew on it would lean more toward that kind of smarts. But unfortunately Hollywood doesn't (usually) know how to write real science, so Voyager ended up with a huge amount of meaningless gibberish that we just have to assume represents real intelligence on the parts of the characters being portrayed. This is probably what puts some people off the most.

Comment Re:Missing (Score 1) 480

They never had fact checkers in Star Trek, and Voyager was especially bad. I'm not sure why, but I remember an interview with Mulgrew where she talked about all the nonsense tech words and how long it took her to get used to the tongue-twisters the writers were giving her. It was worse than in any other Star Trek for some reason.

But then one doesn't watch Star Trek for science. Star Trek is about the possible social implications of certain futurist ideas like post-scarcity, faster-than-light space exploration, and time travel. And if (or when) any of these ideas become part of our reality, I can guarantee it will not be like anybody imagined. Certainly not Star Trek, but not even as imagined in good hard Sci-Fi like [insert your favorite here].

Comment Re:X-Files vs. Bab-5 - ouch! (Score 1) 480

The reason serialized TV wasn't done at the time is exactly the reasons TWX wrote. Babylon 5 may have done something really cool and interesting, but it was ahead of its time and suffered because of that. Now we can have much more serialized TV in part because of pioneer shows like B5 (and also ST:TNG with its many two-parters, ST:DS9 with its huge arcs, and especially shows like Lost that captured a larger audience).

Also important is that the dynamic of watching TV has changed greatly. It started with the VCR, but managing your tape collection is generally too difficult for the mass market. Then there was online piracy, which allowed people to easily and cheaply catch up on entire shows that were only in syndication (and as we all know, syndicated shows tend to only show "best of" old episodes). And while piracy didn't exactly make any money for the story arc shows, it primed the market for new shows that only became watchable with the mass-market availability of the DVR. Now, we finally have streaming services to make it very easy to watch huge story arcs and support them financially, so the format has definitely become more mainstream.

TV used to be very different than it is now and for very good reasons. Don't forget that in the past the typical manner of watching TV was to see what was on at a particular time, and if you missed an episode then too bad.

Comment Re:Middle Eastern Terrorists and NEST (Score 3, Insightful) 228

Let's not underestimate the real power of data. Look at targeted advertising. It was really creepy a few years ago, wasn't it? Back when Target notified a teenage girl's family that she was pregnant (with helpful "she might like this" emails) before she told them? Ever wonder why that stuff doesn't happen so much anymore? It's because the advertising agencies know that it's super creepy so now something like 90% of ads are intentionally random. But they still get the 10% right.

You suggest the thermostat temperature alone may pique the interest of various surveillance agencies. I know you think you're joking, but this may be the one point of data they need to make an otherwise suspicious individual statistically significant. And don't make the mistake of thinking human beings are the ones suggesting what data is suspicious in what ways. The key to the entire data mining explosion is that when you have enough data about everything, you can set up an algorithm to figure out the statistical connections. Maybe it's really only suspicious if the thermostat is set 2 degrees higher on Tuesday from 3am to 4:45am. And 99% of the time that happens, it's because of a specific crime in progress.
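That kind of statistical flagging can be sketched in a few lines. This is a toy illustration with invented thermostat readings and an arbitrary threshold, not any agency's actual method:

```python
from statistics import mean, stdev

# Hypothetical thermostat settings (degrees F) recorded for the same
# hour-of-week across many past weeks; all numbers invented for the example.
history = [68, 68, 67, 69, 68, 68, 67, 68, 69, 68]

def is_suspicious(reading, baseline, threshold=3.0):
    """Flag a reading whose z-score against the baseline exceeds the threshold."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > threshold

print(is_suspicious(68, history))  # typical setting -> False
print(is_suspicious(74, history))  # several degrees high at 3am -> True
```

Nobody decided in advance that 74 at 3am is suspicious; the baseline data did. Scale this from one sensor to thousands of data points per person and you get the kind of automatic pattern-finding described above.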

We live in an age where we have been mostly liberated from the tyranny of humans trying to make those kinds of connections. Finally, with enough data about an individual, the computer knows what you're doing. The danger, of course, is still that humans will use that knowledge toward the wrong ends. First and foremost is the likelihood that human agents will abuse their power. Second is the likelihood that they will willfully misinterpret the results. And third is that they will almost certainly use the data to enforce existing rules rather than to analyze the actual social impact.

We have good reason to fear the invasion of our privacy. We have better reason to fear that anything else will truly understand what we are doing and why. We have the greatest reason to fear that this power will belong not to robot overlords but to people still bound by our legacy of rules instituted before this power existed.

Comment Re:Sigh... (Score 1) 640

So, computers stop being computers, and instead just become part of the embedded hardware?

That's the heart of my argument. If a tool works right when it's installed, it shouldn't need to be replaced unless you want it to work differently. Doctors shouldn't have to learn version N+1 of the medical software just because the one they were using doesn't work on the new version of Windows that has to be installed because the old one is full of security holes that will never be patched. But of course if you need a different data protocol, different policies, or new features...well, upgrade the software for those reasons. I'm not saying the software can't ever be updated; just that it shouldn't have to be updated on a schedule regardless of whether you need the new features. And a Wi-Fi interface is a terrible example, because if there wasn't any Wi-Fi when the thing was set up, it's already going to be wired anyway.

Linux comes in versions only because that makes sense for deployment. Red Hat makes an implicit contract that if your stuff works in version 4, it won't break as long as you stay with version 4. But RHEL 4 has a lot of different packages that might not be exactly compatible with the RHEL 5 equivalents. As far as I know, though, if you strip away a lot of those things you end up with some long-lived stable tools and APIs that only improve in performance and security with updates. It doesn't need to be Linux either; it could easily be BSD. But ultimately all of this comes from my initial assumption that the medical computer is to be used only for the custom medical software which can be developed for any OS the developer chooses.

You're right about the real world, of course. The 90s were a different time and weren't often very user-friendly outside of Windows and Mac OS. Hardware drivers for all the necessary peripherals shouldn't have to change all the time but unfortunately they do. But I have a hard time believing that any software updates 5 years down the line won't require hardware updates. I may be idealistic, but I'm not that idealistic.

Comment Re:This is the solution how? (Score 2) 66

South Africa is not America, and computer illiteracy won't stop poor black kids from learning how to use what sounds like basically an e-reader. And in this case, it sounds like they got the content right:

Content from Wikipedia, the BBC, the complete works of Shakespeare and Khan Academy is all cached locally for teachers to reference during lessons and pupils to use for self-directed study and research.

Comment Re:And this surprises anyone? (Score 1) 80

Every human being (within a range of effectiveness correlated to social intelligence) does collect and analyze every interaction they have with every other human being. We don't have the same kind of large-scale stable memory that computers do, so only the results of the analysis are remembered, which is why first impressions matter so much. All of this happens in the background of our minds, rather than through some kind of conscious cataloging process. In other words, our wetware includes special architecture to solve this exact problem in a memory-constrained environment.
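That "remember the summary, not the data" strategy is easy to sketch. Here's a toy running-impression update in constant memory; the blending weight and the interaction scores are invented for illustration, not a model of actual cognition:

```python
def update_impression(impression, new_interaction, weight=0.1):
    """Blend each new interaction into one stored number, then discard
    the raw data: constant memory, no full interaction log kept."""
    if impression is None:
        return new_interaction  # the first impression is taken at face value
    return (1 - weight) * impression + weight * new_interaction

# One bad first meeting (0.2) followed by several good ones (0.9),
# on an invented 0 = bad, 1 = good scale.
impression = None
for score in [0.2, 0.9, 0.9, 0.9, 0.9]:
    impression = update_impression(impression, score)

print(round(impression, 3))  # -> 0.441: the bad first impression still dominates
```

The summary costs one number instead of a full history, and the first data point anchors everything that follows, which is roughly the trade-off being described.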

Don't underestimate the human brain, and don't underestimate how hard it is to replicate some of its built-in functions. This is hard stuff; we only got this smart either by the design of a much smarter being or by MILLIONS of years of evolution (or some combination of both). Sure, we know that our computers have a lot of processing power and access to ALL THE DATA, but actually writing the algorithm is a big achievement. A big achievement for a number of hard-working people.

Technology only marches forward through the hard work of the great many individuals that make up our humanity. Don't forget or belittle that.

Comment Re:Sigh... (Score 2) 640

Using Windows in health care was a really stupid idea in my opinion. Not your stupid idea, mind you. A stupid idea on the part of all the software developers who chose to target it. What you really need is a good and secure core OS with very few features, which you can upgrade forever without breaking compatibility. Then you need packages on top of that core to provide all the user-facing features like the desktop environment, which shouldn't ever need to be updated (since they should be relying on the core OS for security). All the healthcare-specific applications shouldn't ever need to be rebuilt or updated (except for security updates). None of this 10-year support window requiring a large expensive rollout of new software when it runs out. No need to waste developer time on updating existing applications for new APIs when you could be developing the next great thing instead. So why isn't the whole healthcare infrastructure built on Linux?

Comment Re:It was the best Windows (Score 1) 640

Average consumers probably weren't ready in 2000 for an NT-based operating system. Not without the compatibility stuff they introduced in XP. Backwards compatibility has been the only thing making Windows relevant for a very long time, but unfortunately maintaining it tends to keep them from actually making Windows work better.
