
Comment Handgun Safety (Score 2) 1013

How about a handgun like the sword in the movie Blade: if you grip it without disabling the booby-trap mechanism, blades swing out, disabling the person attempting to use the weapon.

In all seriousness, though, making guns safe is not all that difficult. I have a TT pistol made in Yugoslavia sometime in the early 1960s; in order to be sold in the US, a safety switch blocking the trigger had to be added. The switch wasn't really necessary, though. First, the gun is single-action: you have to cock the hammer in order to fire it. The hammer has a half-cock position, which does two things: it blocks the trigger (basically your safe mode--you can't fire the gun), and it keeps the hammer off the firing pin, so that a dropped gun won't fire accidentally. On top of that, it has a magazine safety--remove the magazine and the trigger is blocked. This is particularly useful because many people assume that a gun without a magazine is unloaded, when there may still be a round in the chamber; with this pistol, no magazine means no firing. If the hammer is cocked and there is a round in the chamber, you can drop the magazine to prevent the gun from firing, then pull back the slide and eject the round. The hammer can also be manually decocked, which is very dangerous if the gun is loaded, but doable if for some reason you had to disarm it without ejecting all the ammo.
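Half-jokingly, the interlocks described above boil down to a few boolean conditions. A toy sketch (the function and state names are my own invention, a simplification rather than a spec for any real firearm):

```python
# Toy model of the mechanical interlocks described above.
# States and names are illustrative only.

def can_fire(hammer: str, magazine_inserted: bool, round_chambered: bool) -> bool:
    """hammer is one of 'down', 'half-cock', 'cocked'."""
    if hammer != "cocked":       # half-cock blocks the trigger; hammer-down can't fire
        return False
    if not magazine_inserted:    # magazine safety: no magazine = no firing
        return False
    return round_chambered       # nothing to fire without a chambered round

# Dropping the magazine makes even a cocked, chambered pistol safe to clear:
assert can_fire("cocked", True, True) is True
assert can_fire("cocked", False, True) is False   # magazine safety engaged
assert can_fire("half-cock", True, True) is False # half-cock blocks the trigger
```

The point being: it's plain combinational logic, and 1960s machinists implemented it in steel without a single transistor.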

My point here is that this gun, which is at least 50 years old, is actually very safe to handle and operate. I don't really think we need fancy technology and shooter-identification systems. Hell, the M1911 features a grip safety, so you cannot pull the trigger unless you're firmly gripping the gun. To make guns safe, you just have to avoid doing anything extremely stupid. Don't keep a gun loaded when you don't have to. Adding safety features and technology won't prevent violent crimes--the shooter in the recent mass shooting was using a rifle that he purchased himself and was firing intentionally, so no safety feature would have made a difference. People make a big deal about how the shooter used an AR-15, an "assault weapon," but in reality it was just a generic semi-automatic rifle. Any hunting or sport rifle could do the same, so to prevent shootings you'd basically have to ban all firearms of all kinds, and even then shooters would still get and use them. I doubt a suicidal or insane shooter would care much about breaking a firearm ban if he already intended to commit mass murder. Even with a bolt-action rifle, he could have done the same or greater damage (bolt action means increased accuracy and better aiming).

Comment About Those Linux Consoles... (Score 2) 272

I'm sure this is common knowledge to many of us, but Linux platforms (including game platforms) are not really all that uncommon. Many of the posts I'm reading here--the general tone of the discussion--seem to treat a Linux console as something unusual or extraordinary.

OK, we all know that gaming has existed in some form on Linux since the beginning. In fact, I'm a little impressed by the number of computer games that have been commercially released for Linux in the past two decades, not to mention games that have been cloned, ported, or otherwise created in open source fashion. We've had commercial video card support forever, and decent APIs to work with... but what about platforms?

We've had platforms too. In fact, my first Linux console was the GP2X, which I purchased upon release in 2005 (7 years ago!). Granted, it wasn't that great of a platform, but it was something. I played Cave Story on it from start to finish, and it was the best gaming experience I had had since I was an adolescent.

However, if you really want to talk about Linux gaming platforms, look no further than Android. We have scores of Android device models in the wild (probably hundreds by now), and they come with all the hardware and software support you could ask for. In fact, I was a little surprised by just how many games--most of them commercial--have been written natively for Android, and they're not even all casual. I would take issue with anyone who doesn't consider Android one of the main gaming platforms today.

So, a Linux gaming console is really not that crazy of an idea. As other people have pointed out, it really doesn't matter that much what OS your console runs... games are not particularly OS-oriented applications. I'm all for free software--I use the stuff all the time, but I still play games on my PS3. Sure, I can't tinker with my PS3 games much or the platform they run on, but if developing open source games were really my thing, Linux is right here on my PC ready and waiting.

Comment Re:Copycat suicides (Score 1) 566

You do realize that you're failing to disagree with me, right? I said that the rate of suicide is low enough to be carried by the gene pool, and your statement that "if it doesn't have any effect on the gene pool, there is no evolution going on" doesn't contradict mine. Death in any form affects a gene pool, however slightly. Populations in which a small percentage die from a particular disease may develop immunity to it--not because the disease was going to wipe out the population, but because the individuals that more successfully resisted the disease were overall healthier and produced more offspring, thus affecting the gene pool. Likewise, individuals with a strong tendency toward suicide edge themselves out of the gene pool, whereas those whose genomes make them more resistant to suicide have a better chance of propagating their genes. This process happens over many generations. Evolution functions in a way that mitigates suicide, which was my original statement, and I stand by it. Yes, suicide still happens--it's just limited. You're the one with no grasp of evolution if you insist it's an all-or-nothing situation (either everyone commits suicide or nobody does).

Now suppose that for some reason dying increased the chance of reproduction--for example, male spiders that are eaten after mating. To reproduce means to die, but it also means a better chance at offspring, so evolution has made the behavior ubiquitous in such species (in some species nearly all males are eaten after mating). Among human beings, the rate of suicide is actually very low--we're talking a fraction of a percent. Why isn't it more common? Why not 10%? 50%? 80%? Evolution. Learn how it works.
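The "low but nonzero rate" claim above is exactly what a toy mutation-selection model predicts. A sketch (the fitness cost s and mutation rate u are arbitrary illustrative numbers, not real human genetics):

```python
# Toy mutation-selection balance: carriers of a costly heritable trait
# reproduce at relative fitness (1 - s); mutation reintroduces the trait
# at rate u per generation. Parameters are illustrative only.

def next_freq(p: float, s: float, u: float) -> float:
    mean_fitness = p * (1 - s) + (1 - p)
    p = p * (1 - s) / mean_fitness   # selection removes carriers
    return p + (1 - p) * u           # mutation reintroduces the trait

p, s, u = 0.10, 0.20, 0.0005
for _ in range(500):
    p = next_freq(p, s, u)

# The frequency settles near u/s = 0.0025: low, but never zero.
```

Selection alone would drive the trait to zero; mutation keeps feeding it back in, so the equilibrium is a small fraction of a percent--limited, not eliminated, which is the whole point.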

Comment Stupid Question (Score 1) 67

But isn't it technically possible for people to set up a free DNS or functionally equivalent service of their own, without any government or private regulations, and without necessarily charging [exorbitant] fees to use it? Everything else related to the web is open source...
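For what it's worth, the DNS wire format (RFC 1035) is open and simple enough that anyone can implement it. A minimal sketch of hand-building a standard query in Python (the hostname and query ID are arbitrary illustrations; a real alternative DNS system would also need resolvers and a registry):

```python
import struct

def build_query(name: str, qtype: int = 1, qid: int = 0x1234) -> bytes:
    """Build an RFC 1035 DNS query message (QTYPE 1 = A record)."""
    # Header: id, flags (RD=1), QDCOUNT=1, AN/NS/AR counts = 0
    header = struct.pack(">HHHHHH", qid, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by a zero byte
    qname = b"".join(bytes([len(l)]) + l.encode("ascii") for l in name.split("."))
    qname += b"\x00"
    question = qname + struct.pack(">HH", qtype, 1)  # QTYPE, QCLASS=IN (1)
    return header + question

query = build_query("example.com")
# You could send this over UDP port 53 to any server you run yourself, e.g.:
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(query, (host, 53))
```

So the protocol itself poses no barrier; the hard part of an alternative root is social, not technical--getting everyone to point their resolvers at it.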

Comment Re:Copycat suicides (Score 1) 566

If there were any survival value to being "resistant to suicide," then why do we still have people killing themselves?

It's because the rate of suicide is low enough to be carried by the gene pool--it's not high enough to threaten the population as a whole. Yes, evolution does stipulate that life be able to survive somehow--suicide is no exception to this rule. All I'm saying is that evolutionary theory gives us an actual reason to believe that mass suicide will not happen. Sure, there are always some people killing themselves, but I think the notion of copycat suicides going epidemic because of online videos is a load of nonsense and contrary to what we know about biology.

Personally I think you're over-philosophizing this.

Comment Re:Copycat suicides (Score 1) 566

In all seriousness, though, while people do kill themselves sometimes, evolution does play a role here. People who are resistant to suicide are more likely to reproduce and spread their genes, so the problem of suicide is not likely to ever be a serious problem on a mass social scale, or even a threat to the survival of our species.

Comment Re:Microsoft Pledges to Sell More Macs for Apple (Score 1) 809

Reminds me of the OLPC XO-1. It was a Linux-based platform that used Open Firmware, and the device shipped with the firmware totally locked down. There was no way to access the firmware or load another OS unless you applied to OLPC for a special developer key that [looked like it] was generated from your machine's serial number by a process only they knew. Otherwise you could only boot disk images signed by them.

To me this was really a nuisance, and I promptly disabled the firmware security and forgot about it, but arguments in favor of this sort of thing are something along the lines of "prevents hacking" or "prevents you from accidentally screwing it up."

Heck, even your average PC today probably has some BIOS settings that need to be fiddled with before you can successfully boot a new operating system, so it seems like standard fare.

Comment The Real Problem (Score 2, Insightful) 274

I was born in 1986, when personal computers were on the rise. Yes, personal computers existed long before that, but back then many people still did not have them; they were more expensive and less interesting to nontechnical people. It wasn't until years later that every member of society absolutely had to have a computer and an e-mail address, and I know people who held out for a decade after that.

Ever since I was in kindergarten, the prevailing [ignorant] viewpoint in society was that computers just magically made people smarter and improved your child's education a billion percent. Every school I attended had to have computers, and they always bragged about how many they had. Idiots in the administration talked about how technology was revolutionizing education and how students were being prepared for the future by being taught computer skills. By the time I was in high school, they made sure every single classroom had at least one computer in it, sometimes two or three or five. Nothing relevant about computers was actually taught.

Ultimately, it was pointless. We didn't use the computers in effective or creative ways; both teachers and students ignored them and stuck to the textbooks. All we used the computers for was browsing the web in our free time and playing Counter-Strike. Some kids used the school's network infrastructure to upload porn and warez. I went to college with a bunch of computer-illiterate people who grew up with computers, and now I work with a bunch of computer-illiterate people. My mom went to adult education classes to become "computer literate," and they taught her that computer expertise meant knowing how to use Google search and Microsoft Office.

Computer technology and software have been a hobby of mine since a young age, and they ended up becoming my trade, so I think I'm qualified to speak a little about computing technology. I've said this many times in the past: computers don't improve education. They just don't. The money that schools waste on computer equipment could be put to so many better uses. Throwing computers at education is just a mutated form of our cultural tendency to throw money at problems--it's stupid and it doesn't work.

Can computers be used in education? Sure, they can, but if and only if it makes sense. Right now in my studies I have to use a computer constantly to look up reference material--it's a huge time saver and makes my field of study dramatically easier than it once was. The reason why computing helps is because the computer is a tool that provides a function necessary for the completion of my work. It's not because I learn better with computers than without them, or that computers solve all of my problems in school; rather, sometimes you have a particular need for them, and in many cases you don't.

I support OLPC because it connects people to the Internet who weren't connected before, which basically gives them access to limitless reading material should they choose to utilize it. 99% of people do not take advantage of the knowledge that can be read on the Internet. That's fine. If even a small group of intelligent children can find a way to benefit from having access to a computer, then those people might learn something that they can use to improve their own lives and the lives of people around them.

Comment Re:Bogus article (Score 4, Interesting) 218

It's probably not the real reason (they just want to keep the materials for themselves, obviously, which is the smart thing to do), but in politics it's often advantageous to use your opponent's rhetoric--they risk looking bad if they disagree with something they themselves said earlier.

Comment Haters Gonna Hate (Score 1) 203

I recently bought a TouchPad, so I use webOS. I quickly discovered that webOS is not just a toy operating system like I thought it might be; it really works, and I actually use it. There is vibrant user development for it, it's ridiculously easy to hack and customize, and I have full root access to the Linux base. It only took me minutes to unlock everything and install the power utilities I wanted. The UI is just as good as enthusiasts say, with an unparalleled window management and multitasking experience. The included system applications work very well (e-mail client, chat client, calendar, contacts), and Synergy integration was painless to set up and works without a hitch; it took about a minute to be fully synced with my Google account, with no need to customize settings in the e-mail client or anything like that.

webOS seems to shine on devices with larger screens that can spend more power keeping apps and services running (as of yet it lacks push messaging). That makes it an ideal fit for netbooks, laptops, nettops, and touchscreen desktops, and a nice fit for a tablet. On handsets, Android seems to have a clear advantage in the mobility arena, but the fact that webOS and Android can fill different niches is vital to webOS.

I definitely plan to continue using webOS, and I fully support HP establishing it as an open source project as well as pledging continued support for the system. webOS is alive, and it still has plenty of places to go before its time ends.

Comment Nothing Wrong with Microsoft (Score 1) 506

I was pretty much weaned on FOSS from the time I was a young teenager. For a while I was stuck using both Windows and Linux due to hardware support issues, but things changed rapidly, so by the time I was in college I hadn't really used Windows on any of my personal computers until Windows 7 came out (occasionally I'd dual-boot XP just to play with Windows). I was and still am a free software ideologue and all-around social/economic activist.

When I finally got my first real IT job, boy was I in for a surprise. In school, I and everyone I knew used Linux; we also used Linux on school servers and in computer labs (we were the comp sci students, so we generally had our own computing facilities separate from the main student body). At work I discovered, naturally, that all systems ran Windows. A few execs had MacBooks because they thought they were fancy and needed an upper-class image.

Company management wasn't even against using Linux. Frankly, they would have done anything to cut costs, including using free software (some of their commercial software was pirated anyway). The problem was that we just couldn't deploy Linux: over the years the company had developed a software infrastructure so heavily based on Microsoft products that we literally couldn't function without them. I used Linux whenever I could, mainly in computer maintenance, backup, diagnostics, and repair, where Linux live CDs/USBs perform spectacularly well (if you're handy with the CLI tools).

In the end, I got used to administering a Windows environment. In many ways it's an awful thing to have to deal with, but at the end of the day you get the job done. There were many times when Windows would fail for inexplicable reasons, and you either a) had to be a programming genius to isolate and repair the problem, or b) could just do a system restore and forget about it. The worst case meant a factory restore. Windows is severely lacking in facilities for system maintenance and repair; you basically set it up and pray that nothing goes wrong, and when it breaks you look for a workaround. This was totally different from administering Linux at home, where I was unaccustomed to system failures of any kind.

That's the thing about software and businesses--the software doesn't have to be the pinnacle of computing; it just has strong institutional backing, so there's always someone else working on it and you don't have to. Software doesn't work? Call the vendor. For every problem there exists an official, textbook solution that you can follow without really having to apply yourself. This is bound to drive computing purists nuts, but a job's a job. If a company is willing to pay you full wages for handling their Windows infrastructure, it's better to just shut up and take the money. Secure employment working on programs you dislike beats having no job and only programs you like. You don't have to like the programs--it's called work for a reason.

Comment Drop It (Score 0) 438

Carriers were too eager to get the iPhone, so they naturally found themselves in a disadvantageous position. What they should do is stop carrying the iPhone; if enough carriers drop it, Apple will start losing sales, get desperate, and be forced to offer the phone at a lower cost.

Comment Re:The sounds of the shattering glass ceiling! (Score 1) 146

Your comment about Islam is one of the stupidest things I've ever read. Before Islam came to Arabia, the Arabs were making no scientific advances and their civilization was in a state of wretchedness. Tribes were constantly going to war with one another; they viewed women as property (as did Europeans), and it was common for fathers to bury their infant daughters alive because they felt female children were too much of a burden.

Islam arrived with a specific platform of demanding respect toward women and expanded rights for them: forbidding the killing of female children and asserting that women are not property but have legal rights, including the right to receive inheritance and own property. One of Islam's main appeals is its high regard for women, which not only asks that men respect women but also asks that women respect themselves and dress and behave modestly for the good of society. It was not until the advent of Islam that the Arabs were known for a successful and progressive civilization, which is when all the scientific advances you are attributing to them occurred. In Muslim-majority countries where women's rights are being infringed (e.g., Saudi Arabia), Muslim women campaign for their rights not on a platform of fighting Islam; rather, they demand their Islamic rights from an oppressive government that pays lip service to Islam without living up to it.

I find it hilarious how women are only valued for their bodies and their sexuality in the West. Just look at your comment: "Women also wear better looking clothes, smell nicer, and have a penchant for adding things to the physical environment that make the workplace more pleasant." We're having a serious discussion about women in the workplace and already you're sexualizing them. Women go to work so they can work, not so creeps like you can admire their clothing, smell them, and feel pleasure from them being there.
