And in other news, police in the US are making inquiries after millions of dollars of pancake mix were stolen from a warehouse in Illinois. Inquiries are ongoing, but there are suggestions that the crimes are related.
"What has been your experience on NVIDIA drivers with Linux?"
Better than my experience with ATI drivers with Linux. If I wasn't so dependent on ATI being superior to NVIDIA under Windows in terms of compatibility, I'd ditch ATI altogether and go NVIDIA all the way.
But then, the way Windows 8 is going, I may end up ditching Windows altogether. I only have it for games.
This is all well and good, but what about PCs as gaming machines?
Every year for the past ten years, some hack has popped up to claim that this year will be the year of the mobile revolution, where gaming laptops will become cost-effective enough to warrant buying one and keeping it as a replacement for a desktop gaming PC. It never happened. Why?
1. PC gamers tend to like to tinker with their PCs - not just with the OS, but with the hardware. You can't realistically overclock a mobile GPU, and you can't upgrade properly (save for the CPU, memory and hard drives) without investing huge sums of money. I built my gaming PC in 2007. It's seen a few hardware updates in the past couple of years, not least my ATI HD 4890 and my Blu-ray drive - neither of which would have been possible upgrades in a laptop.
2. You cannot get the GPU performance of a high-end graphics card in a laptop. High-end video cards simply require too much ventilation and too much power to be realistic in a laptop. There are "high-end" laptops with reasonable GPU performance, but who pays out €2000 for a gaming machine?
3. Purchasing a laptop with comparable performance to a specific desktop machine will usually be at least 50% more expensive.
PC gaming is perhaps the most prominent example of why laptops still don't fulfil all the needs of the PC user. Laptops only offer advantages to those who are often on the move or who like to move around their home or office, and even then only for simple tasks such as word processing or web browsing. For anyone who is dependent on CPU or GPU performance, nothing beats a desktop for cost-effectiveness. I have a dual-boot laptop (XP and Ubuntu 10.04) and a tri-boot desktop (XP, Vista, Ubuntu 9.10), and the desktop gets a surprising amount of use because the laptop is simply too underpowered for many tasks. In fact, I even have to do some translations on the desktop because the CAT software is fairly CPU-intensive with larger projects - who'd have thought that? The only advantage my laptop offers is mobility, and that means it will be replaced by a netbook as soon as it kicks the bucket.
I notice that indie developers tend to have much more down-to-earth and grounded opinions on matters in the world of gaming, including on the subject of DRM. This is because these developers are often truly passionate gamers themselves and can see from the gamer's perspective how DRM looks and will be received. They recognise that DRM can only be damaging to a game in the long term (just look at Spore's absolutely appalling secondary sales) and that it does very little to combat piracy.
Major publishers such as Activision, EA and Ubisoft, however, take a more financial look at the pros and cons of DRM. For them, DRM is not a moral issue. If they decide not to include DRM, it is to achieve better sales or, more recently, better PR. Has anyone noticed how much good coverage a game gets if it is reported to ship without DRM? For example, the fuss that EA made when they announced that The Sims 3 would come without any kind of DRM beyond a standard disc check? Sins of a Solar Empire? Good Old Games? Prince of Persia? It's like the organic food craze that came about as a result of the media frenzy over genetically modified foods.
Unfortunately, a number of less-than-honest companies have been exploiting this knowledge. 2K Games (shame on them!) recently announced that BioShock 2 would not be using SecuROM to activate the game. That turned out to be deceit by omission, as the game actually required activation through Games for Windows Live. Worse still, it later turned out to be an outright lie, as SecuROM still requires the game to connect to the internet to check the date.
My view is that DRM has no future in gaming, except perhaps in rentals. It has already died its slow death in the music industry, which was the first industry to make heavy use of DRM. There are two types of gamer - those who collect and those who do not. DRM-contaminated games are worthless to both, as any gamer will eventually want either to sell their game or to keep it, and DRM makes both impossible. There's a whole craze about Steam at the moment because people have bought into the bullshit that it's the "future" of gaming, but just wait - the problems with blocked and stolen accounts, censorship, violation of free trade agreements and the excessive traffic that Valve has to put up with will eventually kill it.
I sincerely cannot imagine this system lasting long. If UbiSoft have even remotely anticipated the number of gamers that will be playing Settlers 7 and Assassin's Creed 2, they'll know that this will place an extreme load on the servers. We're not just talking about one-time activation. We're talking a constant stream of packets. The traffic will be horrendous.
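To get a feel for the scale, here's a back-of-envelope sketch of the server-side load from a "constant stream of packets". The player count, packet size and heartbeat interval below are made-up illustrative figures, not anything UbiSoft has published:

```python
# Hypothetical traffic estimate for an always-online DRM check.
# All figures are illustrative assumptions, not real UbiSoft numbers.

def drm_bandwidth_bits_per_sec(concurrent_players, packet_bytes, interval_sec):
    """Inbound bandwidth if every player sends one heartbeat per interval."""
    return concurrent_players * packet_bytes * 8 / interval_sec

# e.g. one million concurrent players, a 512-byte heartbeat every second:
bps = drm_bandwidth_bits_per_sec(1_000_000, 512, 1.0)
print(f"{bps / 1e9:.1f} Gbit/s")  # about 4.1 Gbit/s of heartbeats alone
```

And that's before counting the TCP/IP overhead, retransmissions and the actual game data - hence "horrendous".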
Of course, there are legal considerations as well. Of all the companies that have made use of Digital Restrictions Management, most have 'promised' to release a patch that neutralises the DRM some day but absolutely NONE have enshrined this in their EULA or any binding agreement. That's right. Zilch, zero, nada. Strange, innit?
In any case, I do not buy any games contaminated with DRM. These will be no exception.
The population of the earth is 6.8 billion. There are just under 4 billion IPv4 addresses available. That means that, theoretically speaking, the Internet is doomed to failure because there aren't enough IPv4 addresses to go around.
About 80% of the world's population live in poverty. They can't afford a bite to eat, let alone a PC with internet access. That leaves us with roughly 1.4 billion people.

Of those 1.4 billion, around 25% are children with no internet access of their own. With 20% of the population being elderly (60+), let's assume that half are in care. So, minus 35%, that leaves us with roughly 900 million people. I'm not going to include technophobes or those incapable of using a PC for physical or mental reasons, nor am I going to go into the complexities of dynamic IP allocation, which applies to the vast majority of the lay population. A library or school, for example, despite having perhaps 100 computers, will only have one global fixed IP address. The local 192.168.*.* addresses obviously don't count as being usable. Let's also assume that the 180 million websites out there each have their own IP (I know this is not the case - many webspace providers simply allocate one fixed IP to several sites on their server).

That means theoretically that there would be enough IPs for everyone to have at least four of their own. So the question is: WHO THE FUCK HAS BEEN HOGGING ALL MY IP ADDRESSES?
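For the pedants, the back-of-envelope chain can be recomputed from the stated percentages (a rough sketch; the population and poverty figures are the rant's own assumptions, and reserved IPv4 blocks are ignored):

```python
# Rough IPv4-addresses-per-person arithmetic, using the figures above.
IPV4_ADDRESSES = 2 ** 32          # total address space (ignoring reserved blocks)

population = 6.8e9                # world population, as stated
with_access = population * 0.20   # the ~20% not living in poverty
likely_users = with_access * (1 - 0.35)  # minus 25% children, ~10% elderly in care

per_person = IPV4_ADDRESSES / likely_users
print(round(per_person, 1))       # roughly 4.9 addresses each
```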
And, in other news...
The Pope was today sued by God for GPL violations of the Bible. The complaint submitted by God claimed that all material published by the Holy Father was required to be released under the GNU General Public Licence because it was a derivative work of the scriptures.
In any case, we have exactly the same problem in Germany. We do indeed have an 18 rating for games here (there's 0, 6, 12, 16 and 18). The problem is that a hell of a lot of games that would have received an MA 15+ in Australia usually get an 18 in Germany or are completely refused classification. If they're refused classification, there's a good chance the title will be "indexed" - placed on a list of media that the government considers harmful to young people. I think there's only been one occasion in the past five years where classification was refused but the game was not placed on the index - Clive Barker's Jericho. After that, the USK relented and gave it an 18 rating. Games that have been placed on the index include Carmageddon, El Matador, Shellshock, Dark Forces, Little Britain, Quake 1-4 etc. Castrated versions of these games are sometimes released.
The problem is as follows - in my experience, the decision to place a game on the index in Germany makes it a hot property. If USK classification is refused, there is a rush to buy the game before it is indexed, regardless of its quality. Indexing makes a title highly desirable and increases its popularity in Germany. A lot of teenagers, too, seek out games on the basis of their "cool" factor - usually precisely because the game is indexed. I know of at least 20-30 kids in my neighbourhood who do this. I've seen kids with Call of Duty 5 uncut (which I already have, original, TYVM), Manhunt 2, Dead Rising and more. I caught a 10-year-old playing Dead Rising on his 360 a while back and asked my friend (his old man) what he was doing playing it. He had no idea, but it didn't happen again. I still don't know where he got it from, as we only have one games store around here that deals in indexed games.
Fact is, banning a title doesn't prevent it getting into the hands of children - on the contrary, it makes the game more desirable to children and increases its popularity. On the PC, it causes the titles to be pirated more frequently, so the games are more widespread but the publisher loses money. I suspect the situation is the same in Australia - a game refused classification is more than likely a hot property for kids.
It is very true that the idea of a two-party system being by definition "democratic" is propaganda. Many countries do indeed have two or three mainstream parties that are more or less identical to one another. Sure, they curse and criticise each other, but ultimately a regime change is usually little more than a case of "meet the new boss, same as the old boss". Parties offering genuine alternatives are often pushed to the fringes of politics and branded "radical" or "extremist" (for the record, I'd like to mention that there are truly radical parties that would be dangerous - Nazism and the more extreme forms of communism are, in my eyes, dangerous).
The fact of the matter is that in most capitalist countries, including America, the UK, France, Germany etc., the government is not truly at the reins. Governments allow themselves to be controlled by major economic entities (which in some cases even have direct influence over them). Effectively, the market "governs", with the government existing merely to fulfil the wishes of the market.
I think it's worth really thinking about how people really choose their browsers. Firefox, as good as it may be, primarily owes its popularity to the quality of Internet Explorer, or lack thereof. Firefox was, at the time, a much more lightweight and considerably more secure browser than Internet Exploder, and there were few other alternatives. Mozilla hadn't been updated in ages and was considered bloatware, Netscape was all but dead, Opera was also highly bloated. Not to mention it provided Linux users with a decent, expandable and up-to-date browser (I'd used Epiphany before FF). This provided the incentive to switch and this is the reason why FF enjoys such a high market share.
Nowadays, though, people see little reason to switch. Clueless IE users won't switch whatever you tell them, and users of Firefox, Opera, Iceweasel, Epiphany et al. are for the most part quite content with their browsers.
Just imagine getting infected with bacteria of this kind:
"Good morning, Mr. Phelps. Your illness, should you decide to accept it, will be a nasty flu bug. This bacterium will self-destruct in ten seconds."
It's certainly disgraceful that this game should be released in this state, given that around half of PC gamers use ATI cards. That said, I find it a little unfair to single out this game or even EA for the problems. Yes, most EA titles are highly buggy upon release (I was fuming about the problems with ATI cards and The Sims 3), but I don't think many other publishers do a much better job. Most games released over the past 10 years on the PC have been highly buggy. Just offhand I can name The Sims 3, The Golden Compass, LA Rush, GTA 4, GTA San Andreas, Kane and Lynch.
Developers and publishers will tell you that this tendency towards more bugginess is a result of the more complex development procedure arising from the games themselves becoming more and more complex. This is, of course, utter tripe. They will also tell you that the wide variety of PC configurations makes it impossible to cater for all. While there may be some truth to this, there is absolutely no excuse for bugs arising from highly common hardware or logic errors in the game.
The reason that console games are less buggy than their PC counterparts is simple - money. Tight deadlines and budget constraints mean that developers and publishers have little incentive to test a PC game properly, because they know they will be able to release a patch afterwards to address the inevitable uproar. The consequences of releasing a buggy game on consoles are much more severe. On consoles like the PS2, GameCube and Xbox, it would have meant a full-scale recall. And not every PS3 or Xbox 360 gamer has an internet connection. Remember the farce that was Need for Speed Undercover on PS3...?
The rules were never the problem - their enforcement was.
You could easily argue that vandalism makes these rules necessary, but vandalism has been a plague on Wikipedia ever since it started. Its anarchic nature was a necessary evil given the highly open nature of the contribution system. Groups such as the vandalism watchers were a natural development over time and, by and large, the system worked fine. You could compare Wikipedia to the Encyclopaedia Britannica. Where we have laws of the state that govern precisely how we may and may not act in public, the EB has strict submission regulations. Where we have customs, traditions and common decency, Wikipedia has its own set of rules. People by and large followed them, with the exception of an active minority, and this minority was often dealt with by a dedicated team.
Where creative spirit once reigned, we now have a set of cast-iron rules which, while nothing particularly bad in itself, leaves a dreadful amount to be desired. It is very rare these days that one of your contributions will remain in place for more than an hour without some editor almost robotically adhering to the rules, sometimes with dreadfully hilarious results: [citation needed] tags being placed after some of the most blatantly obvious statements, or contributions being removed with accusations of vandalism or bias (by someone who is themselves biased). Another frequent problem is bots, innocuously going about their monitoring tasks and indifferently erasing hours of creative work just because an entry didn't meet the bots' strict criteria. Some decent articles are deleted because Articles for Deletion is filled with obsessive deletionists who have very strange ideas of notability. All this makes people feel that there is no point in contributing if their work is in danger of being irrevocably deleted.
Rules are there to be applied with common sense, not followed religiously as if they were scripture.
Until very recently, I had a 32GB USB flash drive formatted with FAT32. Not that I find FAT32 particularly nice, but it was practical, as it enabled me to easily swap my stuff between my home Windows game PC, my Linux PC, my work Linux laptop and my work Windows PC. The problem was never Linux - the problem was Windows and its lack of ext3 support (I develop under Linux and need the chmod permissions, which all turn to crap when I copy files over to FAT32, which doesn't retain them).
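One way to sidestep the permissions problem (a sketch, not something from the original setup): wrap the project in a tar archive before copying it to the stick. The POSIX modes are preserved inside the archive even though FAT32 can't store them:

```python
import os
import tarfile
import tempfile

# Demonstration: a tar archive keeps the executable bit even if the
# archive file itself later sits on a FAT32 volume.
with tempfile.TemporaryDirectory() as d:
    script = os.path.join(d, "build.sh")
    with open(script, "w") as f:
        f.write("#!/bin/sh\necho hello\n")
    os.chmod(script, 0o755)  # mark it executable

    archive = os.path.join(d, "project.tar")
    with tarfile.open(archive, "w") as tar:
        tar.add(script, arcname="build.sh")

    # Read the mode back from the archive header:
    with tarfile.open(archive) as tar:
        info = tar.getmember("build.sh")
        print(oct(info.mode))  # 0o755 - the executable bit survives
```

Extracting on the other Linux box (`tar xf project.tar`) restores the modes; plain `cp` to FAT32 would have flattened them.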
Focus on the WAS. It WAS practical, until I was faced with the rather interesting prospect of copying a 7.5GB dual-layer DVD master image onto the stick. As we know, FAT32 has a file size limit of 4GB, which causes all kinds of interesting problems.
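For the record, the arithmetic (the exact limit is 2^32 - 1 bytes, i.e. one byte under 4 GiB; the 2 GiB chunk size below is just an illustrative choice for splitting):

```python
import math

FAT32_MAX = 2**32 - 1  # largest file FAT32 can hold: 4 GiB minus one byte

def fits_on_fat32(size_bytes):
    """True if a file of this size can be stored on a FAT32 volume."""
    return size_bytes <= FAT32_MAX

dvd9_image = int(7.5 * 1024**3)       # ~7.5 GiB dual-layer DVD master image
print(fits_on_fat32(dvd9_image))      # False - hence the "interesting problems"

# Workaround: split -b style chunking, e.g. into 2 GiB pieces:
chunk = 2 * 1024**3
print(math.ceil(dvd9_image / chunk))  # 4 pieces, each safely under the limit
```

`split -b 2G image.iso` on the Linux side and `cat`/`copy /b` to reassemble is the usual dance; exFAT or NTFS avoids it entirely.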
Seriously - how many non-tech private PC users do you know who use Windows and actually know what Windows is, beyond the word that appears on the boot-up splash screen? For them, a PC is a PC, and the thought of having anything other than Windows or IE on their computers is as alien to them as having Bing Crosby sing the Rockafeller Skank. It's not a matter of them wanting or not wanting to use Windows or IE - it's a combination of spoon-feeding, resignation, habit and resistance to change. As I see it, while you can debate the ins and outs of using Windows, there is no excuse for anyone, tech or non-tech, to be using IE6, and yet almost every non-tech I see is still using it.
I teach English at a local college and I get asked again and again how I got my PC to look like it does - of course, they assume I'm using Windows when, in fact, I'm using Ubuntu Jaunty.