
Comment: Well done guys (Score 1) 120

by ras (#47537471) Attached to: Long-range Electric Car World Speed Record Broken By Australian Students

Inspirational. You managed to elicit a dick waving competition from our fellow geeks in the US, all chanting "Tesla".

But Tesla isn't in the same league. It can't be. It's a mass produced product.

Sadly, they don't know what we know. We may be able to design the first one, but unlike Tesla we can't build the next 1000 economically.

Please guys, devote some of that enthusiasm and energy to figuring out how to manufacture the thing. Don't do the work for some Chinese company.

Comment: Re:it is the wrong way... (Score 2) 291

by ras (#47480787) Attached to: Australia Repeals Carbon Tax

Why would you assume he would compensate people for something that was being removed?

Maybe because before the election, Abbott promised to keep those tax cuts after repealing the carbon tax?

But you are right, I didn't assume it would stay. At the time Abbott was making a whole pile of promises and he could not keep them all: balance the budget, reduce taxes, keep all the benefits those taxes paid for. But by then hearing him make promises he could not keep was no surprise. It was clear the man would say anything, do anything, prostitute anything (including the sexuality of his daughter) in order to get into power.

Amazingly, this extraordinary behaviour got worse after he was elected. (Amazing to me anyway; I didn't think it was possible.) First we had a promise to be an open, transparent government, then a week or two later we learnt a new phrase: "on water matters". Who still remembers the "no surprises, no excuses" government speech he gave after being elected? Probably not too many, given the shock his first budget inflicted.

Comment: Re:Dark energy is negative (Score 1) 214

by ras (#47479507) Attached to: Cosmologists Show Negative Mass Could Exist In Our Universe

Dark Matter and Dark Energy are two completely unrelated issues.

To a complete layman like me, it sounds from the ancestor post you are replying under that they could be very much related:

Negative mass reacts oppositely to both gravity and inertia. Oddly, that means that negative mass still falls down in a gravitational field: The gravitational force is opposite, but negative mass responds negatively to force (a=F/m, where both F and m are negative). So negative mass particles repel each other gravitationally, but are attracted to positive mass objects.

That sounds like a good candidate for explaining both: space expands because Dark Matter repels itself, but it causes galaxies to clump and produces gravitational lensing because it attracts ordinary matter. I always did wonder why, if Dark Matter interacts with everything so weakly, it didn't immediately clump into black holes. This would explain it.
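The sign logic in the quoted paragraph is easy to check with a few lines. This is a sketch of plain Newtonian gravity only (no relativity, and the sign convention is mine, not from any paper):

```python
# Sign check for "negative mass still falls down, but repels itself".
# Convention: 1D axis from body 1 to body 2; a positive force on body 2
# points away from body 1, a negative force points toward it.
G = 6.674e-11  # gravitational constant, SI units

def accel_on_2(m1, m2, r=1.0):
    # Newtonian force on body 2: attractive (negative) when m1*m2 > 0
    F = -G * m1 * m2 / r**2
    return F / m2  # a = F/m; the sign of m matters

# Positive test mass near a positive mass: falls inward (negative accel).
assert accel_on_2(+1.0, +1.0) < 0
# Negative test mass near a positive mass: the force flips sign, but so
# does the mass dividing it, so it still falls inward.
assert accel_on_2(+1.0, -1.0) < 0
# Two negative masses: the force points inward, but both respond
# oppositely, so they accelerate apart.
assert accel_on_2(-1.0, -1.0) > 0
```

The three assertions reproduce exactly the behaviour claimed above: negative mass falls toward ordinary matter but pushes itself apart.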

Comment: Re:Fsck x86 (Score 1) 230

by ras (#47193107) Attached to: Intel Confronts a Big Mobile Challenge: Native Compatibility

your claim that x86 has 8-bit mode is false; the lowest common denominator for x86 is the 16-bit 8086, which you're probably confusing with the 8-bit 8080 which is not x86 compatible

He was probably thinking of the 8088, which was an 8086 with an external 8-bit bus. Internally it was identical to the 8086, and so by any reasonable definition it was a 16-bit chip. It was probably the commonest variant of the 8086 sold, because it was used in the original IBM PC.

their attempts to match ARM in performance/W are so far unsuccessful when looking at non-biased benchmark results

True, but to have any hope of winning the performance/watt race right now, Intel would have to produce a chip that runs as slowly as an ARM. If things were static they might be tempted to do that, but they aren't. Moore's law means an OOO superscalar chip will be practical in a phone within a few generations, and with it the power advantage ARM gains from less complex, slower chips will disappear. Once that happens, the overhead imposed by the amd64 instruction set will be so small it becomes irrelevant. Intel seems content to just wait for that to happen. Or maybe it's more a consequence of not having a choice, because the complexity of x86 did appear to hurt the underpowered Atom badly.

Whatever the reason, Microsoft's abandonment of Windows RT hints that simply waiting will work. Microsoft abandoned RT because ARM simply doesn't have the horsepower, while an i5 does and still gets over a day's worth of battery time. So Intel has already hit the power budget of a tablet; a phone can't be too many generations off.

But with the Mill architecture claiming a 10-to-1 MIPS/Watt advantage while having the same raw horsepower as an OOO superscalar core, I can't help but wonder if both ARM and amd64 will lose this race in the end.

Comment: Re:Moving goal posts (Score 1) 220

by ras (#47097387) Attached to: PHK: HTTP 2.0 Should Be Scrapped

I don't think HTTP has any problems with security.

I disagree. We live in a world where phishing attacks are common and the PKI system is fragile. Fragile as in: when Iran compromised DigiNotar, people most likely died as a result.

The root cause of both problems is that the current implementation of the web insists we use the PKI infrastructure every time we visit the bank, the store, or whatever. It's a fundamental flaw. You should never rely on Trent (the trusted third party; the CAs in this case) when you don't have to. Any security implementation that does insist you do when you don't have to is broken. Ergo HTTP is broken.

It's not like it isn't fixable. You could insist that on the first visit the site sends you a cert which is used to secure all future connections, and that the cert is used only when the user clicks on a bookmark created when the cert was sent. That would fix the "Iran" problem, and it would also let web sites train users to use the bookmark instead of clicking on random URLs.
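The first half of that scheme is essentially trust-on-first-use certificate pinning. A rough sketch of the idea in Python follows; the pin store path, helper names, and the whole flow are illustrative, not any real browser mechanism:

```python
# Trust-on-first-use sketch: remember a site's certificate hash on the
# first visit, and on later visits compare against the remembered value
# instead of asking a CA.
import hashlib, json, os, socket, ssl

PIN_STORE = os.path.expanduser("~/.cert_pins.json")  # hypothetical pin store

def cert_fingerprint(host, port=443):
    """Fetch the server's certificate and hash it, without consulting any CA."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE      # we pin; we don't ask Trent
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der).hexdigest()

def check_pin(host):
    pins = {}
    if os.path.exists(PIN_STORE):
        with open(PIN_STORE) as f:
            pins = json.load(f)
    fp = cert_fingerprint(host)
    if host not in pins:                 # first visit: remember the cert
        pins[host] = fp
        with open(PIN_STORE, "w") as f:
            json.dump(pins, f)
        return "pinned"
    return "ok" if pins[host] == fp else "MISMATCH: possible MITM"
```

A compromised CA buys an attacker nothing here after the first visit: the forged cert hashes differently, so `check_pin` flags the mismatch. The obvious remaining weakness, which the bookmark part of the proposal addresses, is the first connection itself.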

So given that HTTP security has caused deaths and that it is fixable, I'd say HTTP has huge problems with security. Given that, HTTP/2.0 not attempting to fix it is a major fail IMHO.

Comment: Re:Death sentence (Score 1) 255

by ras (#46956751) Attached to: Melbourne Uber Drivers Slapped With $1700 Fines; Service Shuts Down

but it is likely the demands the Directorate will place on Uber drivers, such as mandatory criminal record checks, vehicle inspections and insurance, will make the service in Melbourne unviable.

Those aren't unreasonable demands of someone wanting to carry passengers for hire. They are checks that pretty much the entire Western world has come up with after numerous problems with unsafe, uninsured and unsavoury taxi drivers. If this is enough to make Uber unviable, then I wouldn't want to be one of their investors.

You sound oh so reasonable. Pity you didn't mention that currently the only recognised way of having those checks is to buy a taxi licence. That licence costs around $30,000 per year.

It is the $30K per year that would make UberX unviable; it has no relationship to the cost of doing those checks. I have no doubt Uber will go to the Directorate and say: "Look, sure, we can ask the drivers to send us the relevant certificates before we allocate them jobs. A roadworthy (which is what we in Australia call a vehicle inspection) is around $100, and they can send us the paid insurance bill." The answer will be a resounding no, at which point it will become obvious this has nothing to do with "safety checks".

One possible explanation of the $30K is that it is protection money, charged by the government to protect the incumbents, who, by the way, meet the definition of a monopoly. Quoting:

University of Sydney economist Peter Abelson said Premier and Cabcharge were so interlinked that "it's not really a duopoly, it's almost a monopoly and between them they control about 80 per cent of the cabs on Sydney streets".

A government fining emerging competitors to an incumbent monopoly, presumably because of regulatory capture, doesn't sound so reasonable, does it? In fact it pisses me off so much that I deliberately travel using these upstarts even when it is less convenient, which it often is.

+ - Google breaks its own reCAPTCHA->

Submitted by ras
ras (84108) writes "Google researchers working on recognising street numbers for Street View pointed their creation at images generated by reCAPTCHA:

To further explore the applicability of the proposed system to broader text recognition tasks, we apply it to synthetic distorted text from reCAPTCHA. reCAPTCHA is one of the most secure reverse Turing tests that uses distorted text to distinguish humans from bots. We report a 99.8% accuracy on the hardest category of reCAPTCHA.


Link to Original Source

Comment: Re:Thank you for the mess (Score 5, Informative) 239

by ras (#46710811) Attached to: Heartbleed OpenSSL Vulnerability: A Technical Remediation

For people who didn't follow the link chain, it has since been updated:

Important update (10th April 2014): Original content of this blog entry stated that one of our SeaCat servers detected a Heartbleed bug attack prior to its actual disclosure. EFF correctly pointed out that there are other tools that can produce the same pattern in the SeaCat server log (see ). I don't have any hard data evidence to support or reject this statement. Since there is a risk that our finding is a false positive, I have modified this entry to a neutral tone, removing any conclusions. There are real honeypots in the Internet that should provide final evidence of when Heartbleed was first broadly exploited.

Comment: Re:Dear slashdot, (Score 1) 92

by ras (#46709731) Attached to: MtGox's "Transaction Malleability" Claim Dismissed By Researchers

The very short version is that what these "researchers" were looking at isn't actually how the alleged bug would have worked.

That is far too short to be useful.

Mtgox's malleability problem was caused, ironically, by the protocol fixing one source of it. When that happened the network started rejecting mtgox's transactions; in fact they weren't even relayed.

The paper says there were no malleability attacks of the scale mtgox claims, because the authors didn't see the required number of malleable transactions. That would have been reasonable if the attacker also depended on seeing the malleable transactions relayed by the network. But they didn't. Mtgox provided a web service that allowed you to see the transactions mtgox issued, thus allowing the attacker to see every malleable transaction.

Thus the attack could have been much larger than what the authors of the paper saw, invalidating some of the paper's conclusions. Particularly, unfortunately, the conclusions regarding mtgox.
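For anyone unclear on what malleability actually is: a transaction's id is a hash over bytes that a third party could re-encode without invalidating the signature. A toy sketch follows; the "serialization" here is made up, and only the double-SHA-256 id step mirrors what Bitcoin really does:

```python
# Toy illustration of transaction malleability. Real Bitcoin serialization
# is far more involved, but the key point holds: the txid is the double
# SHA-256 of the serialized transaction, signature bytes included.
import hashlib

def txid(serialized_tx: bytes) -> str:
    inner = hashlib.sha256(serialized_tx).digest()
    return hashlib.sha256(inner).hexdigest()

original  = b"version|inputs|sig=3044aabb|outputs"
# The same payment, with the signature re-encoded by a third party
# (e.g. altered padding) before relaying it:
malleated = b"version|inputs|sig=003044aabb|outputs"

# Identical spend, but the id the network (and mtgox) sees is different.
assert txid(original) != txid(malleated)
```

This is why software that tracks payments by txid alone, as mtgox apparently did, can be tricked into thinking a payment never happened.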

Comment: Re:sounds like it really was sheer incompetence... (Score 1) 92

by ras (#46709489) Attached to: MtGox's "Transaction Malleability" Claim Dismissed By Researchers

Security bugs in unpatched software are a thing that are well-understood by sysadmins and security researchers.

Really? Bitcoin is valued at several billion dollars. The reward for breaking Keccak was academic cred; the reward for breaking bitcoin is notoriety for life, and being set for life as well. Besides, you do know that nothing in Bitcoin is encrypted, right? There is one signature and a lot of hashing. There isn't even a nonce.

Additionally, this isn’t an unpatched security flaw where upgrading to Bitcoin 1.1 would have fixed the issue. It’s a weakness inherent to the Bitcoin protocol which may or may not be able to be repaired without invalidating all existing BTC transactions.

Said like a person eager to prove he doesn't know much about the subject he is commenting on. It wasn't an upgrade to bitcoin 1.1 that fixed the issue; it was the upgrade to bitcoin 0.9.0. It happened last month, and it didn't invalidate anything.

Comment: Nice ... but no clients until 2016 (Score 1) 32

by ras (#46655453) Attached to: New MU-MIMO Standard Could Allow For Gigabit WiFi Throughput

MU-MIMO is part of wave 2 of the 802.11ac standard. Right now every shipping product is wave 1.

If we are lucky the routers will get wave 2 this year; if not this year, definitely next. Apart from allowing more devices to share the same cell, MU-MIMO is nice in that it reduces the power consumption of devices like phones, as they only see the packets for their own stream. Wave 2 also brings a doubling of the bandwidth (if the spectrum is available) and other efficiencies, which translates to 2 to 3 times the speed of wave 1. This means that unlike wave 1, wave 2 should be able to get 1Gb/s in the real world.

All very nice. The only issue is we won't see wave 2 client chips in laptops, phones and the like until 2016 at the earliest. So unless you are doing back-to-back routers or range extending, don't expect this shiny new Qualcomm chip to make any measurable improvement to any of your existing 802.11ac devices, or to any you buy in the next 2 years.

Comment: Re:Did they actually look at the bitcoin rules? (Score 1) 301

by ras (#46571795) Attached to: Researchers Find Problems With Rules of Bitcoin

Don't be too sure.... a large botnet could potentially do some nasty things to the availability of the network ---- particularly, a Botnet with control of sufficient number of Bitcoins to generate an overwhelming volume of transaction spam, so legitimate transactions can't get through --- by using transactions of the minimum size, Or more traditional DDoS techniques such as packet storming the IP addresses of key nodes in the Bitcoin network.

A botnet in control of a huge quantity of bitcoins, throwing them at the mining network in minimal transactions, sounds like a miner's delight to me. There is a minimum mining fee, so while in the short term it might cause the bitcoin miners to gag on their feast, in the long term all it will do is transfer that huge quantity of bitcoins to the miners. Why on earth would anybody do that?

As for traditional DDoS: the history of bitcoin is one DDoS after another. Just recently some bright spark must have decided that because mtgox said there was a transaction malleability flaw it must be true, and started modifying every transaction they could get their hands on. In other words: if ever there was a network battle-hardened against DDoSes, it's bitcoin.

Comment: Re:Did they actually look at the bitcoin rules? (Score 1) 301

by ras (#46571727) Attached to: Researchers Find Problems With Rules of Bitcoin

The current block reward is 25 * $577 = $14,425. This is huge compared to the current transactions fees.

Yes, it is huge compared to today's transaction fees. But block rewards will continue for some time yet, and the bet is that by the time they become insignificant, transaction fees won't be so small. A clue: the credit card network currently handles roughly 10,000 transactions per second. If bitcoin managed that at 0.6c per kilobyte (the fee bitcoin relays demand), transaction fees would come to around $13,000 per block.

To gain an insight into the odds of that happening: Paypal processes around 9 million transactions per day, or roughly 100 per second, and Paypal's revenue was $6.6 billion last year. That translates to Paypal making over $2 per transaction. Bitcoin doesn't offer the same service of course, but it currently charges about $0.002 for a single transaction (a transaction is roughly 360 bytes).
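The arithmetic behind those figures, for anyone who wants to check it. Every input is an assumption stated in this comment (fee schedule, transaction size, throughput), not live network data:

```python
# Back-of-envelope check of the fee figures above.
tx_per_sec = 10_000      # credit-card-network scale throughput
block_secs = 600         # one block roughly every ten minutes
tx_bytes   = 360         # rough size of a simple transaction
fee_per_kb = 0.006       # $0.006 = 0.6 cents per kilobyte relay fee

tx_per_block   = tx_per_sec * block_secs        # 6,000,000 transactions
fee_per_tx     = fee_per_kb * tx_bytes / 1000   # about $0.002 per transaction
fees_per_block = tx_per_block * fee_per_tx      # about $13,000 per block

# Paypal comparison: ~9M transactions/day against $6.6B yearly revenue
paypal_per_tx = 6.6e9 / (9e6 * 365)             # just over $2 per transaction

print(round(fee_per_tx, 5), round(fees_per_block), round(paypal_per_tx, 2))
```

So even at credit-card scale, bitcoin's per-transaction cut would remain three orders of magnitude below Paypal's.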

remember that Bitcoin isn't the only game in town and miners can switch to mining an altcoin if they're not satisfied with the way "bitcoin is supposed to work".

You forget there are users of these coins, be they bitcoins or altcoins. In the end it is the users who pay the mining costs, be they transaction fees or mining rewards. In a world of competing altcoins, this translates to only the users having a vote on what the best set of rules is. What the miners think of the rules is largely immaterial. If you think this isn't true, try to set up an altcoin with spectacular miner rewards and see how many users you get. Maybe you will succeed where all other altcoin founders have failed.

The bitcoin foundation seems to be very aware of this underlying reality, and is behaving accordingly.

Comment: Re:Did they actually look at the bitcoin rules? (Score 3, Informative) 301

by ras (#46571631) Attached to: Researchers Find Problems With Rules of Bitcoin

and you really think all that effort in mining is going to be maintained once the coin pool is exhausted and they are only competing for transaction fees?

Just about all mining is done using ASICs now, and ASICs are in an unenviable position. Unlike CPUs, GPUs or even FPGAs, they are utterly useless outside of bitcoin. So they will remain deployed until they cost more in power to run than they earn in mining rewards. This means the current mining power isn't going away any time soon.

Botnets can earn a return from a variety of sources, not just mining, so the question becomes: is it worth competing against the ASICs? In terms of power cost, a top-end Intel CPU is roughly 100,000 times worse than an ASIC. So even if some miners drop out, botnets are unlikely to win more than a minor percentage; and if the rewards of mining have dropped so far that ASICs are dropping out, it's a minor percentage of a small number. Add to that that mining soaks up 100% of CPU time, which makes an infection by the bot stand out and decreases the half-life of your botnet... and yeah, I expect mining will continue even when there are only transaction fees.

Then there is the whole other question of "does it matter?" If a botnet does take over the mining pool, there is the little issue that bitcoin is intrinsically worth nothing. It's not like they have taken over a pot of gold. Bitcoin is only worth something if people trust it. So if they don't undermine it, they have something that will pay out forever. If they do undermine it, they have gained control of 2^51 bits that no one in their right mind would buy, and their source of transaction fees has dried up.

It's weird, actually. Claiming bitcoin can never succeed because it is worth nothing has to be one of the more popular memes. The reality is that being worth nothing is one of bitcoin's core defences. So far all currencies that have been based on something tangible (like e-Gold) have lacked that defence, and have failed.
