Comment Re:X-Files vs. Bab-5 - ouch! (Score 1) 480

Serialized TV wasn't done at the time for exactly the reasons TWX wrote. Babylon 5 may have done something really cool and interesting, but it was ahead of its time and suffered for it. Now we can have much more serialized TV in part because of pioneer shows like B5 (and also ST:TNG with its many two-parters, ST:DS9 with its huge arcs, and especially shows like Lost that captured a larger audience).

Also important is that the dynamic of watching TV has changed greatly. It started with the VCR, but managing a tape collection was too difficult for the mass market. Then came online piracy, which let people easily and cheaply catch up on entire shows that were only in syndication (and as we all know, syndicated runs tend to show only the "best of" old episodes). And while piracy didn't exactly make any money for the story-arc shows, it primed the market for new shows that only became watchable with the mass-market availability of the DVR. Now we finally have streaming services that make it very easy to watch huge story arcs and support them financially, so the format has definitely become more mainstream.

TV used to be very different than it is now, and for good reason. Don't forget that in the past the typical way to watch TV was to see whatever was on at a particular time, and if you missed an episode, too bad.

Comment Re:Middle Eastern Terrorists and NEST (Score 3, Insightful) 228

Let's not underestimate the real power of data. Look at targeted advertising. It was really creepy a few years ago, wasn't it? Back when Target notified a teenage girl's family that she was pregnant (with helpful "she might like this" emails) before she told them? Ever wonder why that stuff doesn't happen so much anymore? It's because the advertising agencies know it's super creepy, so now something like 90% of ads are intentionally random. But they still get the 10% right.

You suggest the thermostat temperature alone may pique the interest of various surveillance agencies. I know you think you're joking, but that may be the one data point they need to make an otherwise suspicious individual statistically significant. And don't make the mistake of thinking human beings are the ones deciding which data is suspicious in which ways. The key to the entire data-mining explosion is that when you have enough data about everything, you can set up an algorithm to figure out the statistical connections for itself. Maybe it's really only suspicious if the thermostat is set 2 degrees higher on Tuesdays from 3:00 to 4:45 a.m., and 99% of the time that happens, it's because a specific crime is in progress.
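
To make that concrete, here's a toy sketch of what "let the algorithm find the statistical connections" means. This isn't anybody's real pipeline; the feature names and records are made up, and a real system would use far more data and a far smarter model than a simple lift calculation:

    # Hypothetical labeled records: (features, outcome-of-interest) pairs.
    records = [
        ({"day": "tue", "thermo_delta": 2, "window": "03:00-04:45"}, True),
        ({"day": "tue", "thermo_delta": 0, "window": "03:00-04:45"}, False),
        ({"day": "sat", "thermo_delta": 2, "window": "12:00-13:00"}, False),
        ({"day": "tue", "thermo_delta": 2, "window": "03:00-04:45"}, True),
        # ...millions more rows in any real data set
    ]

    def lift(pattern):
        """How much more likely the outcome is when the pattern matches,
        compared to the overall base rate. Values well above 1.0 are the
        'statistical connections' nobody had to think up by hand."""
        matches = [outcome for feats, outcome in records
                   if all(feats.get(k) == v for k, v in pattern.items())]
        base_rate = sum(outcome for _, outcome in records) / len(records)
        if not matches or base_rate == 0:
            return 0.0
        return (sum(matches) / len(matches)) / base_rate

    print(lift({"day": "tue", "thermo_delta": 2}))   # 2.0 with the toy data above

Scale that up to every data point about everyone, and it's the algorithm, not an analyst, that decides the Tuesday-at-3-a.m. thermostat bump is the interesting pattern.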

We live in an age where we have been mostly liberated from the tyranny of humans trying to make those kinds of connections. Finally, with enough data about an individual, the computer knows what you're doing. The danger, of course, is still that humans will use that knowledge toward the wrong ends. First and foremost is the likelihood that human agents will abuse their power. Second is the likelihood that they will willfully misinterpret the results. And third is that they will almost certainly use the data to enforce existing rules rather than to analyze the actual social impact.

We have good reason to fear the invasion of our privacy. We have better reason to fear that something other than ourselves will truly understand what we are doing and why. And we have the greatest reason to fear that this power will belong not to robot overlords but to people still bound by our legacy of rules instituted before this power existed.

Comment Re:Sigh... (Score 1) 640

So, computers stop being computers, and instead just become part of the embedded hardware?

That's the heart of my argument. If a tool works right when it's installed, it shouldn't need to be replaced unless you want it to work differently. Doctors shouldn't have to learn version N+1 of their medical software just because the one they were using doesn't run on the new version of Windows, which has to be installed because the old one is full of security holes that will never be patched. But of course if you need a different data protocol, different policies, or new features...well, upgrade the software for those reasons. I'm not saying the software can't ever be updated; just that it shouldn't have to be updated on a schedule regardless of whether you need the new features. And a Wi-Fi interface is a terrible example, because if there wasn't any Wi-Fi when the thing was set up, it's already wired anyway.

Linux comes in versions only because that makes sense for deployment. Red Hat makes an implicit contract that if your stuff works in version 4, it won't break as long as you stay with version 4. But RHEL 4 has a lot of different packages that might not be exactly compatible with the RHEL 5 equivalents. As far as I know, though, if you strip away a lot of those things you end up with some long-lived stable tools and APIs that only improve in performance and security with updates. It doesn't need to be Linux either; it could easily be BSD. But ultimately all of this comes from my initial assumption that the medical computer is to be used only for the custom medical software which can be developed for any OS the developer chooses.
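
For what it's worth, here's a toy way to state that contract. It's only a sketch using semantic-versioning-style numbers, not a claim about how Red Hat actually tracks compatibility:

    def compatible(validated_on: str, candidate: str) -> bool:
        """True if moving from `validated_on` to `candidate` stays within the
        same major version, which by the contract should not break anything."""
        return candidate.split(".")[0] == validated_on.split(".")[0]

    print(compatible("4.6", "4.9"))   # True: same major version, safe to update
    print(compatible("4.6", "5.0"))   # False: new major version, revalidate first

Stay inside the major version and you get the security and performance fixes without retesting the whole stack; cross it and you're back to a big rollout.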

You're right about the real world, of course. The '90s were a different time, and computing outside of Windows and Mac OS often wasn't very user-friendly. Hardware drivers for all the necessary peripherals shouldn't have to change all the time, but unfortunately they do. And I have a hard time believing that any software update 5 years down the line won't require hardware updates. I may be idealistic, but I'm not that idealistic.

Comment Re:This is the solution how? (Score 2) 66

South Africa is not America, and computer illiteracy won't stop poor black kids from learning how to use what sounds like basically an e-reader. And in this case, it sounds like they got the content right:

Content from Wikipedia, the BBC, the complete works of Shakespeare and Khan Academy is all cached locally for teachers to reference during lessons and pupils to use for self-directed study and research.

Comment Re:And this surprises anyone? (Score 1) 80

Every human being (within a range of effectiveness correlated to social intelligence) does collect and analyze every interaction they have with every other human being. We don't have the same kind of large-scale stable memory that computers do, so only the results of the analysis are remembered, which is why first impressions matter so much. All of this happens in the background of our minds rather than through some kind of conscious cataloging process. In other words, our wetware includes special architecture to solve this exact problem in a memory-constrained environment.
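
As a loose analogy (and only an analogy, not a claim about how neurons actually do it), it's the same trade a streaming algorithm makes: keep one running summary per person, fold each interaction into it, and throw the raw interaction away. The names and numbers below are made up:

    impressions = {}   # person -> one retained number; the raw interactions are gone

    def observe(person, interaction_quality, weight=0.2):
        """Fold a single interaction (0.0 = awful, 1.0 = great) into the stored
        summary, then discard it. The first observation sets the baseline,
        which is roughly why first impressions matter so much."""
        if person not in impressions:
            impressions[person] = interaction_quality
        else:
            impressions[person] = (1 - weight) * impressions[person] + weight * interaction_quality

    observe("alice", 0.9)          # great first meeting
    observe("alice", 0.2)          # one bad day barely moves the summary
    print(impressions["alice"])    # 0.76: still mostly the first impression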

Don't underestimate the human brain, and don't underestimate how hard it is to replicate some of its built-in functions. This is hard stuff; we only got this smart either by the design of a much smarter being or by MILLIONS of years of evolution (or some combination of both). Sure, we know that our computers have a lot of processing power and access to ALL THE DATA, but actually writing the algorithm is a big achievement. A big achievement for a number of hard-working people.

Technology only marches forward through the hard work of the great many individuals that make up our humanity. Don't forget or belittle that.

Comment Re:Sigh... (Score 2) 640

Using Windows in health care was a really stupid idea in my opinion. Not your stupid idea, mind you. A stupid idea on the part of all the software developers who chose to target it. What you really need is a good and secure core OS with very few features, which you can upgrade forever without breaking compatibility. Then you need packages on top of that core to provide all the user-facing features like the desktop environment, which shouldn't ever need to be updated (since they should be relying on the core OS for security). All the healthcare-specific applications shouldn't ever need to be rebuilt or updated (except for security updates). None of this 10-year support window requiring a large expensive rollout of new software when it runs out. No need to waste developer time on updating existing applications for new APIs when you could be developing the next great thing instead. So why isn't the whole healthcare infrastructure built on Linux?
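
To sketch what I mean (hypothetical names, not any real distro's packaging scheme): the core publishes a frozen set of stable interfaces, the desktop and everything else lives in layers above it, and an application is accepted only if it depends on nothing outside that stable core:

    # Everything here is invented for illustration; the point is the layering.
    STABLE_CORE = {"posix-files", "posix-sockets", "tls", "libc"}

    def can_deploy(app):
        """An app that touches only the stable core keeps working while the
        kernel and the desktop layer take their own security updates."""
        extras = set(app["depends"]) - STABLE_CORE
        if extras:
            print(f"{app['name']} depends on unstable layers: {sorted(extras)}")
            return False
        return True

    print(can_deploy({"name": "ehr-client", "depends": ["posix-sockets", "tls"]}))   # True
    print(can_deploy({"name": "ehr-client", "depends": ["desktop-widgets-4.2"]}))    # False

Only the thin core needs the forever-maintenance commitment; everything above it can stay put until someone actually wants new behavior.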

Comment Re:It was the best Windows (Score 1) 640

Average consumers probably weren't ready in 2000 for an NT-based operating system. Not without the compatibility stuff they introduced in XP. Backwards compatibility has been the only thing making Windows relevant for a very long time, but unfortunately maintaining it tends to keep them from actually making Windows work better.

Comment Re:Rethink our emphasis on intelligence?! (Score 2) 249

Just because they aren't teaching it well doesn't mean that's not what they were trying to teach. They've been trying and failing to teach people to be smart for as long as education has been available to disinterested children. For some reason, in however many millennia, we haven't figured out how to teach knowledge to anyone who isn't there of their own volition to learn it.

Comment So...everyone's wrong (Score 1) 249

The charter people say the traditional schools are wrong for only teaching intelligence, because reasons. The traditional people say the charter school is wrong for focusing more on personality, because reasons. So everyone is wrong. Who's right?

If only we had some kind of methodology for figuring out whether an idea is wrong or right. I think I'd call it...science.

Comment Re:economy doing well? (Score 1) 174

Times are different now. Before the 20th century, goods were expensive and time was cheap. If you've ever had (or heard of) old-world home-cooked food, you know that a lot of it was very labor-intensive but mostly made of grains or broths, often padded out with vegetables. Basically, the sort of thing rural people had easy access to. They found it hard to afford much meat, but they had plenty of spare time to fill and could afford a place to live. Where is our money going now? Most of it is paid in rent or mortgage on our homes. So our increasing production has made a great big difference: more people can't afford a place to live, but at least those homeless people have cheap access to everything else they need.

We live in a society where the very ambitious have the motivation to create new service industries, like the whole startup thing going on in Silicon Valley. But people without that ambition - the people who have always lived by contributing to existing power structures like their families or their local communities - are at the mercy of a shrinking job market. My argument is that times are different now than they were in the 1800s. A society can only come up with new service jobs for its people so quickly. We should be looking at reducing the work week for the same pay in order to maintain enough jobs for the less ambitious. Otherwise we are looking at a very costly welfare problem. We've already got bottom-rung employers like Wal-Mart and McDonald's expecting their workers to work full-time and also take government welfare just to survive.
