User Journal

Journal: Going to a place that has already been disgraced

Pamela Geller is despicable. I mean really despicable. If this country is or was ever great, then 9/11 should be no more than a triviality compared to its greatness. Compared to what this country represents, the fact that 19 lunatics with boxcutters flew planes into some buildings and killed 3000 people should be nothing but a blip on our history.

Instead, we've got people like Geller trying to make it the American Reichstag. I've never been more ashamed of other Americans than I am of Geller and Gingrich and Reid and anybody who's tried to turn the building of a community center into something ugly. Even if the people behind this community center were everything they're being accused of, it still does not excuse the kind of behavior I've seen these past few weeks.

I've never felt so disgusted with other Americans. I wish I could pass myself off as Canadian, honest to god. I wish I could get a goddamn visa to live in Finland or Belgium or even goddamned Serbia. Anything but a country where people like Geller and Hugh Hewitt and Michael Medved get treated like patriots for (and despite!) denigrating such basic, founding principles as freedom of religion and property rights. They say things like "oh, it's not about freedom of religion and property rights, it's about good taste". Good taste! Now the standard for freedom of speech is supposed to be good taste. And they say "oh, the Muslim group must compromise". If they "must", then it's not a goddamn compromise. I don't care if you hate the idea of a community center with a mosque built near ground zero or near your house. If you go on television and try to compare it to Nazis putting signs up at Auschwitz, that makes you the scum of the Earth. You share a hell with the religious fundamentalists who perpetrated the crime in the first place.

So ten years after the fact, this bunch is going to turn into a bunch of drama queens over 9/11 and turn the site of the Twin Towers into hallowed ground (or, as Ben Quayle says, "hollowed ground"). Is the USA such a flimsy society? Are Americans such weak sisters that they're going to turn a tragedy into a pyre on which to burn each other? (Yes, the people who want to build the community center are Americans. Yes, bombs have been thrown at mosques across the US in the last few days. Yes, there are "Americans" burning Korans in Wal-Mart parking lots. Fucking mutts.) I'm so tired of you, America. Never missing a chance to tell the world how great you are, how superior, how above the behavior of "terrorists". But the veneer of your Christian "reformation" turns out to be pretty goddamned thin, after all.

Things like this make me wish there actually was an afterlife where people were judged for their behavior on Earth. I'm willing to do the time for my crimes, as long as I can do it with the knowledge that people who've tried to spread this kind of ugliness were going to do the time for theirs. I'm so tired of you, America.

User Journal

Journal: Theory of Relativity Exposed as a Liberal Plot.

Rewriting history textbooks isn't nearly enough for the Religious Right. It appears that the "conservative alternative to Wikipedia", Conservapedia, has some serious issues with Einstein, too.

The first note in the references section of the Conservapedia entry on "Counter-examples to Relativity" will be of special interest to any physicists out there.

I guess that Colbert's throwaway joke about "reality having a liberal bias" was truer than he knew.

User Journal

Journal: I don't know which is scarier

That I am old enough to remember where my current .sig came from, or that nobody else is.....! For those who are suffering from a memory lapse, here is the sig: "The world is in darkness. To erase data is to suppress truth; to halt computing is to shackle the mind."

Ok, ok, you're too lazy to google it, so here's the link: Son of Hexadecimal Kid

User Journal

Journal: Automotive Security

According to the Center for Automotive Embedded Systems Security, there are serious security flaws in the existing technology. Not necessarily a big deal, for now, as they observe that the risks are low at the current time. Emphasis on "current". They also state that no crackers have been observed to use the required level of sophistication. Again, the emphasis needs to be on "observed". Yes, it may well be a while before automotive networks reach the point where this is exploited in the wild (at least at any scale), but I would remind you that it took Microsoft from Windows 3.0 through to Windows XP Service Pack 2 to take security even remotely seriously. That's a long, long time. And Microsoft had nothing like the installed base of the car industry. Further, the qualifications most companies require of a system administrator are a good deal steeper than the requirements for a car mechanic, so systems administrators were likely far more familiar with the issues involved. Also, systems administrators are far more accountable for security issues, since there are plenty of third-party tools that novice users can use to spot malicious software.

The first question is why this even matters. It doesn't affect anyone today. No, but it's guaranteed to affect at least some current Slashdot readers in their lifetime and, depending on how rapidly car networks develop, may affect a significant fraction surprisingly fast. Technology doesn't move at Stone Age speeds any more. Technology advances rapidly and you can't use obsolete notions of progress to determine what will happen next year or over the next decade.

The second question is what anyone could seriously do, even if it was an issue. Not too many Slashdotters own automotive companies. In fact, I doubt if ANY Slashdotters own automotive companies. Well, the validation tools are Open Source, and MISRA has a fair few links to members and software packages. In fact, even if developers just developed an understanding of MISRA's C and C++ specifications, that might be quite valuable, as it would allow people to understand what is being done (if anything) to improve reliability and how (if at all) this impacts security. You don't get reliability for free; there will be some compromises made elsewhere.

User Journal

Journal: Has anyone had problems with DB companies? What therapies work with bosses?

I've been having problems with EnterpriseDB. This company maintains the Windows port of Postgres, but I have been finding their customer service... less than satisfactory. This is the second time in, oh, 21 years that I've actually been infuriated by a company. However, to be entirely fair to the business, and indeed the sales person, it is entirely possible this was a completely freak incident with no relationship to normal experience. There were all kinds of factors involved, so it's a messy situation all round, but the hard-sell aggressiveness and verbal abuse went way beyond anything I have ever experienced from a professional organization in two DECADES. What I want to know from other Slashdotters is whether this is about on par with tales of meteorites landing on someone's sofa (which is my personal suspicion) or whether it's a more insidious issue. Please, please, please, do not take one incident as a general rule. I've not seen any article on Slashdot or LWN reporting wider issues with them, which you know perfectly well would have happened had there been a serious, widespread problem, especially with all of the reporting on database issues in recent times and the search for alternatives to MySQL once leading developers defected and major forks arose.

This is, however, a major question. Like it or not, we need databases we can rely on and trust, which means that when they are backed by companies, we need the companies that back them to be honorable. (PostgreSQL itself isn't owned, so I trust the engine itself just fine. The development team is very impressive - and, yes, I do monitor the mailing lists.) Value-added only has any added value if it's valuable.

What is worse, from my perspective, is that my current boss is now treating it like this is how companies work when reselling Open Source products. His practical experience was being on the receiving end of all this. If we're to take advantage of the freedom (and bloody high quality) provided in the Open Source world, I need to deprogram him of the notion that they give hassle and sell grief. Does anyone have any experience doing this?

User Journal

Journal: Save TV for Geeks!

A petition calling for the return of perhaps the most important television show since The Great Egg Race is currently running but isn't exactly getting anywhere fast. It is vitally important that intellectually-stimulating shows be encouraged -- the consequence of failure (24 hours of Jersey Shore on all channels) is too horrible to contemplate. Unfortunately, as things stand, that's exactly what we are heading towards. Save your television and your mind before it's too late!

User Journal

Journal: Why no recent journal entries?

Because I have a blog at http://fyngyrz.com/.

It kind of makes the whole journal thing redundant. If you really want to see what I have to say about random things, by all means, you're invited to the blog. If not, well, it seems you're in substantial company, if nothing else. :)

News

Journal: Study Shows GMO Corn Linked to Organ Failure

According to a research article published in the current International Journal of Biological Sciences, genetically modified corn from Monsanto increases the rate of liver and kidney failure in rats, as well as causing other harmful effects to the "heart, adrenal glands, spleen and haematopoietic system".

Apparently, Monsanto has wasted no time claiming the study was based on "faulty data", saying that its own 90-day study didn't show similar problems. Of course, that ignores the fact that the organ failure only starts to show up after "5-14 weeks", according to the abstract.

PC Games (Games)

Journal: NAT is the Fucking Devil

I need a place to have a full-on rant about this. My Slashdot journal is as good as any.

Is it so much to ask that, in 2009, the video game industry as a whole would have figured out some way around the problem of home routers, and of getting devices behind them to communicate with devices behind other home routers? Yes, I know, it's not a trivial issue. WAN/LAN IPs, DNS, end-to-end connectivity, ports, TCP, UDP, protocols and connections, planes, trains and automobiles. Yes, it's not an easy thing to accomplish.

But you've had ten fucking years!!! Or as near as makes no difference.

How many times have I had to reset, reconfigure and reinstall routers? How many times have I had to click through those infuriating HTML configuration pages, one form at a time, in an effort to add, port by port, protocol by protocol, game by game, each and every little irritating requirement just to get the fucking game I bought to play online like Mechwarrior 2 did flawlessly back in 1997!?!?!?!?

I've cracked. I admit it. The final straw was this latest gem from Team Fortress 2, a game I don't even play (I basically manage the router for 5 people). I had to set up port forwarding and QoS (whatever the fuck that is) just to let the gods damned game play properly.

  • UDP 27000 to 27015 inclusive (Game client traffic)
  • UDP 27015 to 27030 inclusive (Typically Matchmaking and HLTV)
  • TCP 27020 to 27050 inclusive (Steam downloads)
  • TCP 27015 (SRCDS Rcon port)

63 ports. Sixty-three ports. And that's just for the forwarding, never mind the QoS malarkey. Yeah, fuck you too, Valve. And want to know the best part? It's a server-based game!! Why in fuck's name do I need to do any of this?! Oh, give me lag any day of the week.
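
(And yes, I counted. For the pedants, a quick sanity check in Python: the two UDP ranges overlap at 27015, and the TCP Rcon port sits outside the TCP download range.)

    # Unique ports in Valve's list above.
    udp = set(range(27000, 27016)) | set(range(27015, 27031))  # merges to 27000-27030
    tcp = set(range(27020, 27051)) | {27015}                   # downloads, plus Rcon
    print(len(udp), len(tcp), len(udp) + len(tcp))             # 31 32 63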

But to be fair, it's not just Valve. Far, far from it. It's not even just PC developers, each mandating their own custom-crafted set of ports and protocols to enable online play behind a router. No, the consoles have gotten in on the game too. Take these gems, required for the PlayStation Network.

  • TCP Ports: 80, 443, 5223
  • UDP Ports: 3478, 3479, 3658

TCP port 80. Otherwise known as the HTTP port. Great. And what's this? TCP 443. You mean the HTTPS port. Great choice, guys. Yeah, thanks for that. I'll forward those right away.

Come on, Microsoft. You've been computing specialists for over 30 years. What's needed to run Xbox Live behind a router?

  • TCP Ports: 80, 53, 3074
  • UDP Ports: 88, 53, 3074

Great, classy. I love that overlap with PSN on the port 80 thing. Can't have them hogging HTTP entirely, especially since you've taken the DNS port now. Awesome. Complete clusterfuck. Why doesn't one of you mandate port 22 while you're at it, so my entire network will be totally inaccessible from outside for anyone not using a games console?

Oh well, I guess at least with consoles you only have to forward one set of ports for all games right... right?

In order to play GTA IV via the PS3 network you will need to open the following ports on your router:

  • UDP ports: 6672, 28900
  • TCP ports: 8000-8001, 27900, 28900

AAAAGGGGHHHHHHH!!!!! LEAVE ME ALONE!!!! I'm not a network administrator! I don't have any certs from Cisco!! No! I can't use IPTABLES!! How would I get Linux onto the router in the first place?! What do you want?! Blood?!?! I just want to play games!!!

And don't talk to me about UPnP! Just don't. As far as I can tell, the Useless, Painful 'n Pointless protocol's only meaningful function is to establish connections between devices which confirm UPnP is available, but then don't work anyway. I've never once managed to get a single game to work using it. It has never worked and it will never work. Most companies don't even mention it. They skip straight to port forwarding, gleefully rolling off their own in-house list of obnoxious ports.
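
(For the record, here is everything UPnP was supposed to spare us from. A minimal sketch using the miniupnpc Python bindings, and assuming, heroically, that your router actually answers the discovery request:)

    import miniupnpc

    upnp = miniupnpc.UPnP()
    upnp.discoverdelay = 200                  # ms to wait for router replies
    print("found", upnp.discover(), "UPnP device(s)")
    upnp.selectigd()                          # pick the Internet Gateway Device

    # Ask the router to forward external UDP 27015 to this machine: the one
    # call a game could make for itself at launch, instead of sending you to
    # the router's web interface.
    upnp.addportmapping(27015, "UDP", upnp.lanaddr, 27015, "TF2 traffic", "")
    print("external IP:", upnp.externalipaddress())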

You know what this is like? It's like every video game publisher and company is trying to stake its claim to ranges of ports and protocols. By insisting on their own original, capricious and dogmatic set of connection requirements, it's as though Sony, Microsoft, EA, Valve and all the rest are trying to enforce by fiat what would normally require an RFC to be made official. Namely, the assignment of a port. Companies are literally carving out their own space in what is supposed to be a no-ownership zone. And trust those armchair experts at Wikipedia to stick these turf claims in a Registered Ports List. "Oh, but the unregistered ones are in blue, OMFG." FUCK YOU! There are only 65,535 ports, which is too few to risk being lost to this bullshit.

So that's why I think this NAT business hasn't been resolved. Moving the video game industry to a solid standard whereby games automatically established connections (and hang the technical difficulties) would mean that companies would have to give up their little slice of that relatively small pie of 65,535 port numbers. These are corporations we're talking about, and giving up something that big, that central to the functioning of the entire internet, even if it's just a squatter's claim, is not a step any of them are willing to take.

So, in my opinion, we're going to be stuck with this NAT port forwarding bullshit for quite some time yet. I fully expect more and more games to lay claim to ever larger pastures of unsettled port space, and continue to do so until the whole spectrum is so fully overloaded that people's routers or patience simply snap under the strain. Mine certainly has.

Mercifully, my ISP seems to allow PPPoE through a router, which thankfully the PS3 and Xbox 360 both support. True, it exposes them to the elements in a way that sitting behind a router would not, but I really don't care any more. NAT is the fucking devil, and I've had enough of having my crank yanked as a pawn in this port-squatting farce, so it's a WAN IP for me.

At least until all the IPv4 addresses run out and I have to set up all this shit again with IPv6 addresses.

User Journal

Journal: 1-3% of all main-sequence stars have planets?

The venerable BBC is reporting that a survey of light emitted from white dwarfs showed that between 1% and 3% had material (such as silicon) falling into the star on a continuous basis, potential evidence of dead worlds and asteroids. On this basis, the authors of the study speculate that the same percentage of main-sequence stars in the active part of their lives will have rocky matter. This is not firm evidence of actual planetary formation, as asteroids would produce the same results, but it does give an upper bound, and some idea of what a lower bound might be, for planetary formation.

Aside from being a useful value for Drake's Equation, the rate of planetary formation would be valuable in understanding how solar systems develop and what sort of preconditions are required for an accretion disk of suitable material to form.
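
(For anyone rusty on it, the Drake equation in its usual notation is

    N = R_* \cdot f_p \cdot n_e \cdot f_\ell \cdot f_i \cdot f_c \cdot L

and it's f_p, the fraction of stars hosting planets, that a survey like this one starts to pin down.)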

Because the test only looked for elements too heavy to have been formed in the star, we can rule out the observations being those of cometary debris.

User Journal

Journal: Fireball, but not XL5

Four fireballs, glowing blue and orange, were visible last night over the skies of the Carolinas on the southeast coast of the United States, followed by the sound of an explosion described as being like thunder. Reports of hearing the noise were coming in from as far afield as Connecticut. There is currently no word from NASA or the USAF as to what it could be, but it seems improbable that anything non-nuclear the military could put up could be heard over that kind of distance. It therefore seems likely to be a very big meteorite.

The next question would be what type of meteorite. This is not an idle question. The one that slammed into Sudan recently was (a) extremely big, at an estimated 80 tonnes, and (b) from the extremely rare F class of asteroid. If this new meteorite is also from an F-class asteroid, then it is likely associated with the one that hit Sudan. This is important, as it means we might want to be looking very closely for other fragments yet to hit.

The colours are interesting and allow us to limit what the composition could have been and therefore where it came from. We can deduce this because anything slamming through the atmosphere is basically undergoing a giant version of your basic chemistry "flame test" for substance identification. We simply need to look up what metals produce blue, and in so doing we see that cadmium does produce a blue/violet colour, with copper producing more of a blue/green.

Other metals also produce a blue glow, and tables of these colours abound, but some are more likely in meteoric material than others. Cadmium exists in meteorites. Well, all elements do, if you find enough meteorites. But it exists in sufficient quantity that it could produce this sort of effect. (As noted in the chemmaster link, low concentrations can't be detected by this method; that problem is going to be vastly worsened by the fact that this isn't a Bunsen burner being used, and the distance over which you're observing is extreme.)

Ok, what else do we know? The fireballs were also orange. Ureilites, such as the Sudan impactor, contain a great deal of calcium, which burns brick-red, not orange. This suggests we can rule out the same source, which in turn means we probably don't have to worry about being strafed the way Jupiter was by comet Shoemaker-Levy 9 (21 impacts).
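
(If you want to play along at home, the deduction really is just a table lookup. A toy sketch in Python, using textbook flame-test colours, with cadmium's entry taken from the reasoning above:)

    # Textbook flame-test colours; cadmium per the reasoning above.
    FLAME_COLOURS = {
        "sodium":    "yellow",
        "potassium": "lilac",
        "calcium":   "brick red",
        "strontium": "red",
        "barium":    "pale green",
        "copper":    "blue-green",
        "cadmium":   "blue-violet",
    }

    def candidates(observed_hue):
        """Metals whose characteristic flame colour contains the observed hue."""
        return [metal for metal, colour in FLAME_COLOURS.items()
                if observed_hue in colour]

    print(candidates("blue"))   # ['copper', 'cadmium']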

What can we say about it, though? Well, provided the surviving fragments didn't fall into the ocean, every meteorite hunter on the planet will be scouring newspaper stories that might indicate where impacts occurred. Meteoric material is valuable, and anything on a scale big enough to be heard across the entire east coast of the US is going to be worth looking for. It split into four in the upper atmosphere, so you're probably looking at a few thousand fragments reaching ground level, and a find like that could easily be worth more than a year's average pay.

User Journal

Journal: What constitutes a good hash anyway?

In light of the NIST complaint that there are so many applicants for their cryptographic hash challenge that a good evaluation cannot be given, I am curious as to whether they adequately defined the challenge in the first place. If the criteria are too loose, then of course they will get entries that are unsuitable. However, the number of hashes entered does not seem to be significantly more than the number of encryption modes entered in the encryption mode challenge. If this one is impossible for them to evaluate well, then maybe that one was too, in which case maybe we should take their recommendations on encryption modes with a pinch of salt. If, however, they are confident in the security and performance of their encryption mode selections, what is their real objection in the hashing challenge case?

But another question one must ask is why there are so many applicants for this, when NESSIE (the European version of this challenge) managed just one? Has the mathematics become suddenly easier? Was this challenge better-promoted? (In which case, why did Slashdot only mention it on the day it closed?) Were the Europeans' criteria that much tougher to meet? If so, why did NIST loosen the requirements so much that they were overwhelmed?

These questions, and others, look doomed to not be seriously answered. However, we can take a stab at the criteria and evaluation problem. A strong cryptographic hash must have certain mathematical properties. For example, the distance between any two distinct inputs must be unconnected to the distance between the corresponding outputs. Otherwise, knowing the output for a known input and the output for an unknown input will tell you something about the unknown input, which you don't want. If you have a large enough number of inputs and plot the distance of inputs in relation to the distance in outputs, you should get a completely random scatter-plot. Also, if you take a large enough number of inputs at fixed intervals, the distance between the corresponding outputs should be a uniform distribution. Since you can't reasonably test 2^512 inputs, you can only apply statistical tests on a reasonable subset and see if the probability that you have the expected patterns is within your desired limits. These two tests can be done automatically. Any hash that exhibits a skew that could expose information can then be rejected equally automatically.
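
(To make the first test concrete, here is a minimal sketch in Python, using SHA-256 as a stand-in for a candidate hash: flip one input bit at a time and check that the output Hamming distances cluster the way fair coin flips would, i.e. binomially around half the digest length.)

    import hashlib
    import os
    import random
    import statistics

    def hamming(a, b):
        """Bitwise Hamming distance between two equal-length byte strings."""
        return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

    def avalanche_distances(trials=5000, msg_len=32):
        """Flip one random bit of a random message per trial and record
        the Hamming distance between the digests of the two messages."""
        dists = []
        for _ in range(trials):
            msg = bytearray(os.urandom(msg_len))
            h1 = hashlib.sha256(msg).digest()
            bit = random.randrange(msg_len * 8)
            msg[bit // 8] ^= 1 << (bit % 8)   # flip exactly one input bit
            h2 = hashlib.sha256(msg).digest()
            dists.append(hamming(h1, h2))
        return dists

    d = avalanche_distances()
    # For a well-behaved 256-bit hash, distances ~ Binomial(256, 0.5):
    # mean ~ 128, standard deviation ~ sqrt(256 * 0.25) = 8.
    print("mean:", statistics.mean(d), "stdev:", statistics.stdev(d))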

This is a trivial example. There will be other tests that can also be applied automatically to weed out the more obviously flawed hashing algorithms. But this raises an important question. If you can filter out the more problematic entries automatically, why does NIST have a problem with the number of entries per se? They might legitimately have a problem with the number of GOOD entries, but even then, all they need to do is have multiple levels of acceptance and an additional round or two. E.g.: at the end of human analysis round 2, NIST might qualify all hashes that are successful at that level as "sensitive-grade" with respect to FIPS compliance, so that people can actually start using them, then have a round 3 which produces a pool of 3-4 hashes that are "classified-grade", and a final round to produce the "definitive SHA-3". By adding more rounds, it takes longer, but by producing lower-grade certifications, the extra time needed to perform a thorough cryptanalysis isn't going to impede those who actually use such functions.

(Yes, it means vendors will need to support more functions. Cry me a river. At the current scale of ICs, you can put one hell of a lot of hash functions onto one chip, and have one hell of a lot of instances of each. Software implementations are just as flexible, with many libraries supporting a huge range. Yes, validating will be more expensive, but it won't take any longer if the implementations are orthogonal, as they won't interact. If you can prove that, then one function or a hundred will take about the same time to validate to accepted standards. If the implementations are correctly designed and documented, then proving the design against the theory and then the implementation against the design should be relatively cheap. It's crappy programming styles that make validation expensive, and if you make crappy programming too expensive for commercial vendors, I can't see there being any problems for anyone other than cheap-minded PHBs - and they deserve to have problems.)

User Journal

Journal: Beowulf MMORPGs

Found this interesting site, which is focussing on developing grid computing systems for gaming. The software they seem to be using is a mix of closed and open source.

This could be an important break for Linux, as most of the open source software being written is Linux-compatible, and gaming has been Linux's biggest problem area. The ability to play very high-end games - MMORPGs, distributed simulators, wide-area FPSes, and so on - could transform Linux in the gaming market from being seen as a throwback to the 1980s (as unfair as that is) to being considered world-class.

(Windows machines don't play nearly so nicely with grid computing, so it follows that it will take longer for Microsoft and Microsoft-allied vendors to catch up to the potential. That is time Linux enthusiasts can use to get a head-start and to set the pace.)

The question that interests me is - will they? Will Linux coders use this opportunity of big university research teams and big vendor interest to leapfrog the existing markets completely and go straight for the market after? Or will this be seen as not worth the time, the same way that a lot of potentially exciting projects have petered out (e.g. Open Library, Berlin/Fresco, KGI, OpenMOSIX)?

PlayStation (Games)

Journal: The Trouble with PC Ports

I wrote a journal entry two years back. I had recently bought Oblivion and had spent 10 hours trying to get it simply to run, and the post basically outlined how PC games require far too much effort from the user to simply run, let alone become playable. This post can be regarded as a follow-up.

I ended up liking Oblivion, so much so that I bought the Game of the Year edition for the PS3. The graphics were a lot better, and there were no control issues or installation worries. Then, after probably 50+ hours of play, I ran into the effectively show-stopping PS3 Vampire Cure Bug. Bethesda apparently have no intention of ever patching or fixing this bug. I can safely say that if I had known this bug was present, I would never have bought the game.

As I see it, PC game makers like Bethesda simply are not going to make it in the current generation of games. Show-stopping bugs with no official effort to patch them might be acceptable in PC gaming, but console gaming has historically had a much higher standard when it comes to major bugs and glitches. Even in the days of the PS2, if a game crashed, it was quite a shock, and a major black mark on your opinion of the game. Show-stopping bugs with no workaround are, to my memory, completely unheard of.

Say what you will, but up until effectively two years ago, the first version of your console game was going to be the last. Companies had no recourse whatsoever, apart from a total recall, if they needed to change so much as one bit in the game binary. Under those conditions, a very high level of quality was sought, and in fact achieved, in the vast majority of cases. Console gamers have spent the last 20+ years playing games that largely did not crash, did not glitch (obtrusively), and did not have show-stopping bugs. PC gamers have spent the last 20+ years trying, and failing, to get games not to do any of these things.

My point is that console gamers have come to expect a certain level of quality and professionalism, and console game makers have responded accordingly. PC gamers have come to expect patches, hotfixes and workarounds, and PC game makers have become complacent when it comes to errors, and contemptuous towards their users. This does not bode well for "establishment" PC game makers trying to break into the console market. I believe they are, one by one, doomed to fail in this regard.

Unreal Tournament 3 crashes all over the place on PS3. Oblivion: GOTY has characters which, when spoken to, display "I HAVE NO GREETING" default errors. Call of Duty 4's level and art design is aesthetically appalling. The best titles PC gaming has to offer typically end up as second- or third-rate titles when they come to console gaming. A lot of this has to do with control schemes. RTS titles and games like The Sims are fundamentally unsuited to a console controller. But it also has to do with the overall quality of PC titles, which, when compared to console titles, simply don't make the grade.

It works both ways. Titles among the best that console gaming has to offer typically do not fare well when ported to PCs. Final Fantasy VII, Metal Gear Solid 2, Halo. However, this is likely due to control and framerate issues, and with PC gamepads becoming more common (the Xbox 360 pad is plug-and-play in Windows) and graphics cards improving, these issues are being alleviated somewhat.

However, PC game makers have a much bigger hurdle to overcome if they want to break into the console market. They need to overcome a culture of complacency. A culture that allows games to be released that will not work without a patch. A culture that allows a game to be shipped with known bugs still present. A culture that thinks graphics improvement means simply increasing texture rates and bloom, and has no time for aesthetic design. A culture that essentially holds technical metrics in awe and game players in contempt. It is a culture driven in large part by the backing of PC hardware manufacturers and not the feedback of gamers.

I was looking forward to Fallout 3. But I will no longer be buying it when it arrives. I have been burned quite badly by Bethesda already, and I have no reason to believe that they will change their ways. It's a similar situation with many PC game companies. They are steeped in a culture that simply will not work in the console world. I expect many to simply stop releasing console ports in the years ahead, as it becomes clear that console gamers will not tolerate half-finished or unsupported products.

There's something to be said for PC gaming. But professionalism among PC game makers is not it.

User Journal

Journal: The Lost Tapes of Delia Derbyshire

Two hundred and sixty-seven tapes of previously unheard electronic music by Delia Derbyshire have been found and are being catalogued.

For those unfamiliar with Delia Derbyshire, she was one of the top pioneers of electronic music in the 1950s and 1960s. One of her best-known pieces was the original theme tune to Doctor Who. According to Wikipedia, "much of the Doctor Who theme was constructed by recording the individual notes from electronic sources one by one onto magnetic tape, cutting the tape with a razor blade to get individual notes on little pieces of tape a few centimetres long and sticking all the pieces of tape back together one by one to make up the tune".

Included in the finds was a piece of dance music recorded in the mid-60s which, when examined by contemporary artists, was judged to stand up against good-quality mainstream dance music today. Another piece was incidental music for a production of Hamlet.

The majority of her music mixed wholly electronic sounds, from a sophisticated set of tone generators and modulators, with electronically-altered natural sounds, such as could be made from gourds, lampshades and voices.
