AcidAUS sends us the story of an online poker cheating ring that netted an estimated $10M for its perpetrators over almost 4 years. The article spotlights the role of an Australian player who first performed the statistical analyses demonstrating that cheating had to be going on. "In two separate cases, Michael Josem, from Chatswood, analyzed detailed hand history data from Absolute Poker and UltimateBet and uncovered that certain player accounts won money at a rate too fast to be legitimate. His findings led to an internal investigation by the parent company that owns both sites. It found rogue employees had defrauded players over three years via a security hole that allowed the cheats to see other players' secret (or hole) cards." The (Mohawk) Kahnawake Gaming Commission, which licenses the two poker companies, has released its preliminary report. MSNBC reporting from a couple of weeks back gives deep background on the scandal.
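Josem's actual methodology isn't detailed in the blurb, but the core idea — a win rate too many standard deviations above what variance allows over that many hands — can be sketched as a simple z-test. All the numbers below (baseline win rate, per-100-hand standard deviation) are hypothetical placeholders, not figures from the investigation:

```python
import math

def winrate_zscore(observed_bb_per_100, hands,
                   mean_bb_per_100=0.0, stdev_bb_per_100=80.0):
    """Z-score of an observed win rate against a population baseline.

    observed_bb_per_100: big blinds won per 100 hands.
    stdev_bb_per_100: per-100-hand standard deviation (hypothetical value;
    real variance depends on game type and stakes).
    """
    n_blocks = hands / 100.0
    # Standard error shrinks with the square root of the sample size.
    se = stdev_bb_per_100 / math.sqrt(n_blocks)
    return (observed_bb_per_100 - mean_bb_per_100) / se

# A 50 bb/100 win rate sustained over 10,000 hands sits far outside
# what chance alone plausibly produces under these assumptions.
z = winrate_zscore(50.0, 10_000)
```

Over 10,000 hands with these parameters the z-score is 6.25 — the kind of "too fast to be legitimate" result that would trigger a closer look at the account's hands.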
schliz writes "A new technique developed by HP Labs and Rice University could lower the cost of identifying 'dead zones' in large wireless networks. The technique '[combines] wireless signal models with publicly-available information about basic topography, street locations, and land use.' This enables Wi-Fi architects to test and refine their layouts cheaply before a network is deployed, by focusing measurement efforts on areas that are likely to be dead zones. The technique requires only about one-fifth as many measurements as a grid sampling strategy."
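The published blurb doesn't give the researchers' model, but the general approach — predict coverage from a propagation model, then measure only where the prediction is uncertain — can be sketched with a textbook log-distance path-loss model. Every constant here (transmit power, path-loss exponent, threshold, uncertainty margin) is an assumed placeholder, not a value from the HP/Rice work:

```python
import math

# Hypothetical link-budget parameters, not from the HP/Rice technique.
TX_POWER_DBM = 20.0
REF_LOSS_DB = 40.0       # path loss at the 1 m reference distance
PATH_LOSS_EXP = 3.0      # rough urban environment
THRESHOLD_DBM = -75.0    # minimum usable signal
MARGIN_DB = 8.0          # model-uncertainty band around the threshold

def predicted_rssi(distance_m):
    """Log-distance path-loss prediction of received power in dBm."""
    d = max(distance_m, 1.0)
    return TX_POWER_DBM - (REF_LOSS_DB + 10 * PATH_LOSS_EXP * math.log10(d))

def candidate_dead_zones(points, ap_xy):
    """Keep only the grid points whose predicted signal falls inside the
    uncertainty band around the coverage threshold -- clearly-covered and
    clearly-dead points need no field measurement, which is where the
    measurement savings come from."""
    out = []
    for (x, y) in points:
        d = math.hypot(x - ap_xy[0], y - ap_xy[1])
        if abs(predicted_rssi(d) - THRESHOLD_DBM) <= MARGIN_DB:
            out.append((x, y))
    return out
```

With these parameters, a point 10 m from the access point is confidently covered, a point 200 m away is confidently dead, and only the borderline region in between would be scheduled for on-site measurement.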
LHoAugustus writes: "Linux Hardware has posted a look at the new Intel 'Penryn' processor and how it will work with Linux: 'Intel recently released the new Penryn Core 2 processor with many new features. So what are these features, and how will they translate into benefits for Linux users? That's what Linux Hardware is here to unravel. In this review I'll cover all the high points of the new Penryn core and talk to a couple of Linux projects about the impact on end-user performance.'"
Botslist writes: Developers and webmasters often want to prevent certain robots and spiders from crawling their sites. To do this effectively, the robots must first be identified, so that when one of them connects to a site, the webserver can return a page indicating that access is forbidden. Other webmasters redirect the robots to well-known spammer sites in an attempt to punish the spammers. But what if these robots were instead redirected to a site where they could be registered into a database, freely distributed to interested web developers and webmasters? Then everyone could have access to a complete and up-to-date list of robots, and no one would need to manually maintain an htaccess or other access-control file by hand.
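One common way to identify misbehaving robots is a honeypot path: list a URL as `Disallow:` in robots.txt, so that only crawlers ignoring the rules ever fetch it, and record whoever does. A minimal sketch of that idea, with an in-memory dict standing in for the shared database the submitter proposes (the trap path and return codes are illustrative assumptions):

```python
import time

# In-memory stand-in for the shared, distributable robot database.
ROBOT_DB = {}

# Hypothetical path; robots.txt would carry "Disallow: /trap/" so that
# only rule-ignoring crawlers ever request it.
TRAP_PATH = "/trap/"

def handle_request(path, user_agent, remote_addr):
    """Return (status, body) for an incoming request.

    A client fetching the trapped path is registered as a bad robot;
    anything already registered is refused everywhere on the site."""
    if path.startswith(TRAP_PATH):
        ROBOT_DB[remote_addr] = {
            "user_agent": user_agent,
            "first_seen": time.time(),
        }
        return 403, "Forbidden"
    if remote_addr in ROBOT_DB:
        return 403, "Forbidden"
    return 200, "OK"
```

Because `ROBOT_DB` is just data, exporting it for other webmasters to import is straightforward — which is exactly the distribution model the submission suggests, as opposed to everyone curating private blocklists.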
jonfromspace writes: We are going through the process of selling our company to an entity from out of the country. We have made it through the majority of the negotiations and are now awaiting the due diligence process. We will be meeting with an independent reviewer over the coming weeks and are just not sure what to expect, or how best to prepare. Our application is built on a LAMP platform, and one of the areas we are slightly concerned about is preconceived notions regarding scalability (especially when it comes to MySQL). Have any Slashdotters out there gone through this kind of review process? What advice or input can you offer?