Comment: No, it's not crazy (Score 2) 205

by ras (#48551883) Attached to: The Failed Economics of Our Software Commons

Obviously, it would be crazy to staff such critical projects largely with a handful of unpaid volunteers working in their spare time.

The people who do this have a number of reasons. Some do it because open source work garners job offers. Some do it because they, or the businesses they work for, need free software to exist, and it's a self-perpetuating loop: the more free software there is, the more people contribute to it, so the more they have to choose from. For some it's like attending church - it feels right. For some it's a nice social group to be in. None of these reasons means they, or the system they contribute to, are crazy.

As for the freeloaders - without legions of these "freeloaders" free software would not exist. Few would bother to put the effort into Linux, or X, or Debian if there weren't legions of users out there to test it, give feedback, find bugs and suggest improvements. They are a necessary part of the system. A system that, for all its faults, works at least as well as any commercial way of developing software, if you go by deployments.

Comment: Re:A definition of net neutrality (Score 1) 200

by ras (#48322309) Attached to: Net Neutrality Alone Won't Solve ISP Throttling Abuse, Here's Why

would US ISPs be required to run trunks across the Gulf of Mexico because you decided you wanted that to have priority? Because using your argument, they really could do that.

Yep. You pretty much nailed it. In a well-oiled market, if there are enough customers out there who love Netflix so much that they are prepared to pay the huge premium an ISP would have to charge to cover the cost of running such trunks, then those trunks would exist.

But "required" is too strong a word. In a market functioning well no one requires anybody to do anything, so in this case in particular no one requires an ISP to fill a particular market niche. If the niche opens up then some will, not because anybody requires it, but because it is in their best interest to do so.

Yes, in the US "Net Neutrality" is code for "the government requiring the ISPs to act in a certain way". Yes, that is not a good way to run things - I'd be leery of it too. It's much better to let a market decide. In fact it is so obviously better that most countries had the foresight to structure their telecoms so such a market would develop naturally. But not the US, which is why I said the US has cocked it up.

But then again, in a well-oiled market your whole scenario becomes fanciful, because if Netflix did that, another content provider would pop up offering the same content from the US, so customers didn't need to pay extra to a premium ISP just to see it. Well, they would if bandwidth cost the same in the US as it did in Honduras. If it cost more, our new Netflix would have to recoup the difference somehow. I guess it could get very complicated, but that's the beauty of a well-oiled market - it sorts all this shit out automagically without the need for government interference.

Comment: A definition of net neutrality (Score 0) 200

by ras (#48321493) Attached to: Net Neutrality Alone Won't Solve ISP Throttling Abuse, Here's Why

There is a simple definition of Net Neutrality that works: the customer gets to decide the priority of his traffic.
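Concretely, "priority" is something the customer's own software can already express; the open question is whether the ISP honors it. A minimal customer-side sketch (Python on Linux; the EF class is just an example marking, not anything from TFA):

    import socket

    # Mark this connection's packets as EF (expedited forwarding, DSCP 46).
    # Under the definition above, honoring this marking is the customer's
    # call, not the ISP's.
    EF_TOS = 46 << 2  # DSCP sits in the top six bits of the TOS byte

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_TOS)
    s.connect(("example.com", 80))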

There are many ways you can engineer this outcome, but all the countries that pull it off do it very simply: every household is serviced by multiple ISPs (at least tens of them), and you choose the one you like given your budget.

As usual the US, the supposed beacon of capitalism, has cocked it up. Most homes are serviced by one ISP. And with the power that gives them, not only do they not give you a choice in the priority of your traffic (which admittedly would be a big ask), they erect paywalled gardens, and then they actively interfere with outside traffic to force you to use them. They do this in secret, and seem to have no trouble telling outright lies about it when queried.

Being able to bleed monopoly profits out of their customers has made them hugely profitable, of course. So to cap it all off, they engage in crony capitalism, bribing your politicians with campaign donations to preserve this farce.

I have no idea why voters in the US put up with this sort of shit. It boggles the mind.

Comment: Lies, damned lies and statistics (Score 1) 422

by ras (#48183355) Attached to: Soda Pop Damages Your Cells' Telomeres

It means that you're 96% certain that your hypothesis is true.

Yeah. But if that were really true, everybody would trust the results of a study like this. But no one does.

It's the bar that's used for medical studies.

And in particular, the medical fraternity almost never believes the result from just one study. They always advise waiting for it to be confirmed.

You would be correct if this were a randomly selected study; the issue is it wasn't randomly selected. It was published. Studies that don't clear the 96% bar typically don't get published. So all we know is that 1 study out of god knows how many showed this effect. If it is 1 out of 1, the 96% applies. But if it was 1 out of 10, it's almost certainly wrong.

Now that it's published, there are kudos to be made from shooting it down. Translation: now that it's published, it becomes the null hypothesis. A study showing it isn't true is a positive result, and now has a chance of getting published.

In other words, the first published statistical study showing a controversial result is barely worth the paper it's written on. Its only real use is to prompt further research.
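You can see the effect with a toy simulation. The power and base-rate numbers below are my own illustrative assumptions, not anything from the study:

    import random

    random.seed(1)
    ALPHA = 0.04   # the "96% certain" publication bar
    POWER = 0.8    # assume studies of real effects detect them 80% of the time
    P_REAL = 0.1   # assume only 1 in 10 hypotheses tested is actually true

    published_true = published_false = 0
    for _ in range(100_000):
        real = random.random() < P_REAL
        significant = random.random() < (POWER if real else ALPHA)
        if significant:  # journals mostly only see the significant results
            if real:
                published_true += 1
            else:
                published_false += 1

    print(published_false / (published_true + published_false))
    # ~0.31: under these assumptions nearly a third of published results are
    # wrong, even though every one of them cleared the "96% certain" bar.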

Comment: Re:China's unis have better PR than the US ones (Score 1) 395

by ras (#48166247) Attached to: Battery Breakthrough: Researchers Claim 70% Charge In 2 Minutes, 20-Year Life
Ah, here is a much better description of what they did. In short: their real breakthrough was to grow longer TiO2 nanotubes. They then re-did the work in the previous link to see what difference the longer nanotubes made. It turned out to speed up the charge/discharge rate. At 30C they got 6K cycles at 86% capacity, which is where the 20 years comes from. The shorter tubes had 10K cycles, so it's a charge-rate-versus-cycle-life trade-off. As they said, it puts them in supercapacitor territory, but I'm not sure how that's helpful. We already have supercapacitors. We need cheaper batteries.
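The headline numbers drop straight out of those figures (the one-full-cycle-per-day usage below is my assumption):

    C_RATE = 30
    print(60 / C_RATE)   # 2.0: a 30C rate is a full charge in 2 minutes

    CYCLES = 6000
    print(CYCLES / 365)  # ~16.4 years at one full cycle a day; call it
                         # 20 years with slightly lighter use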

Comment: China's unis have better PR than the US ones (Score 1) 395

by ras (#48166185) Attached to: Battery Breakthrough: Researchers Claim 70% Charge In 2 Minutes, 20-Year Life

I'm struggling to see what's different here from what was done 3 years ago. In other words, self-annealing fast-charge batteries using a titanium dioxide nanotube anode aren't new. The linked article says capacity for the Na variant of the cell is 144 Wh/kg, which compares to around 250 for LiPo batteries. The linked article also says they had it working for Li at higher densities.

But it hasn't taken over the world yet, and so there must be some problem with it. Maybe the nanotubes cost a small fortune.

As for those of you whining about charging a car in 5 minutes, maybe it is a bit optimistic. But I'd happily settle for charging my phone in 10 minutes via a 100 W USB-C connection.
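The arithmetic backs that up (the battery capacity and efficiency below are my assumptions, not figures from the article):

    BATTERY_WH = 15.0   # assumed typical phone battery capacity
    CHARGER_W = 100.0   # USB-C PD power
    EFFICIENCY = 0.85   # assumed overall charging efficiency

    minutes = BATTERY_WH / (CHARGER_W * EFFICIENCY) * 60
    print(f"{minutes:.1f} minutes")  # ~10.6 minutes for a full charge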

Comment: Well done guys (Score 1) 120

by ras (#47537471) Attached to: Long-range Electric Car World Speed Record Broken By Australian Students

Inspirational. You managed to elicit a dick-waving competition from our fellow geeks in the US, all chanting "Tesla".

But Tesla isn't in the same league. It can't be. It's a mass-produced product.

Sadly, they don't know what we know: we may be able to design the first one, but unlike Tesla we can't build the next 1000 economically.

Please guys, devote some of that enthusiasm and energy to figuring out how to manufacture the thing. Don't do the work for some Chinese company.

Comment: Re:it is the wrong way... (Score 2) 291

by ras (#47480787) Attached to: Australia Repeals Carbon Tax

Why would you assume he would compensate people for something that was being removed?

Maybe because before the election, Abbott promised to keep those tax cuts after repealing the carbon tax?

But you are right, I didn't assume it would stay. At the time Abbott was making a whole pile of promises, and he could not keep them all - balance the budget, reduce taxes, keep all the benefits those taxes paid for. But by that time, hearing him make promises he could not keep was no surprise. It was clear by then the man would say anything, do anything, prostitute anything (including the sexuality of his daughter) in order to get into power.

Amazingly, this extraordinary behaviour got worse after he was elected. (Amazing to me, anyway. I didn't think it was possible.) First we had a promise to be an open, transparent government; then a week or two later we learnt a phrase: "on water matters". Who still remembers the no-surprises, no-excuses government speech he gave after being elected? Probably not too many, given the shock his first budget inflicted.

Comment: Re:Dark energy is negative (Score 1) 214

by ras (#47479507) Attached to: Cosmologists Show Negative Mass Could Exist In Our Universe

Dark Matter and Dark Energy are two completely unrelated issues.

To a complete layman like me, it sounds from the ancestor post you are replying to that they could be very much related:

Negative mass reacts oppositely to both gravity and inertia. Oddly, that means that negative mass still falls down in a gravitational field: the gravitational force is opposite, but negative mass responds negatively to force (a=F/m, where both F and m are negative). So negative mass particles repel each other gravitationally, but are attracted to positive mass objects.

That sounds like a good candidate for explaining both. Space expands because Dark Matter repels itself, but it causes galaxies to clump and gravitational lensing because it attracts ordinary matter. I did always wonder why, if Dark Matter interacts with everything so weakly, it didn't immediately clump into black holes. This would explain it.
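The sign bookkeeping in that quote is easy to check (arbitrary units, separation fixed at 1; the helper is my own illustration):

    G = 1.0  # units chosen so the constants drop out

    def accel_toward_other(m1, m2):
        # Force on body 1, taken as positive when directed toward body 2.
        force = G * m1 * m2
        # a = F/m with signs kept; positive means body 1 moves toward body 2.
        return force / m1

    print(accel_toward_other(-1, +1))  #  1.0: negative mass falls toward positive mass
    print(accel_toward_other(-1, -1))  # -1.0: two negative masses repel each other
    print(accel_toward_other(+1, +1))  #  1.0: ordinary attraction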

Comment: Re:Fsck x86 (Score 1) 230

by ras (#47193107) Attached to: Intel Confronts a Big Mobile Challenge: Native Compatibility

your claim that x86 has 8-bit mode is false; the lowest common denominator for x86 is the 16-bit 8086, which you're probably confusing with the 8-bit 8080 which is not x86 compatible

He was probably thinking of the 8088, which was an 8086 with an external 8-bit bus. Internally it was identical to an 8086, and so by any reasonable definition it was a 16-bit chip. It was probably the most common version of the 8086 ever shipped, because it was used in the original IBM PC.

their attempts to match ARM in performance/W are so far unsuccessful when looking at non-biased benchmark results

True, but to have any hope of winning the performance-per-watt race right now, they would have to produce a chip that runs as slow as an ARM. If things were static they might have been tempted to do that, but they aren't. Instead, Moore's law means an OOO superscalar chip will be practical in a phone in a few generations. And with it, the power advantage ARM gains from less complex, slower chips will disappear. Once that happens, the overhead imposed by the amd64 instruction set will be so small it becomes irrelevant. Intel seems content to just wait for that to happen. Or maybe it's more a consequence of not having a choice, because the complexity of x86 did appear to hurt the underpowered Atom badly.

Whatever the reason, Microsoft's abandonment of Windows RT hints that simply waiting will work. Microsoft abandoned RT because ARM simply doesn't have the horsepower, while an i5 does, and still gets over a day's worth of battery time. So they have already hit the power budget of a tablet. A phone can't be too many generations off.

But with the Mill architecture claiming a 10-to-1 MIPS/watt advantage while having the same raw horsepower as an OOO superscalar core, I can't help but wonder if both ARM and amd64 will lose this race in the end.

Comment: Re:Moving goal posts (Score 1) 220

by ras (#47097387) Attached to: PHK: HTTP 2.0 Should Be Scrapped

I don't think HTTP has any problems with security.

I disagree. We live in a world where phishing attacks are common and the PKI system is fragile. Fragile as in: when Iran compromised DigiNotar, people most likely died as a result.

The root cause of both problems is that the current implementation of the web insists we use the PKI infrastructure every time we visit the bank, the store or whatever. It's a fundamental flaw. You should never rely on Trent (the trusted third party - the CAs in this case) when you don't have to. Any security implementation that does insist on it when you don't have to is broken. Ergo HTTP is broken.

It's not like it isn't fixable. You could insist that on the first visit the site sends you a cert which is used to secure all future connections, and that the cert is only used when the user clicks on a bookmark created when the cert was sent. That would fix the "Iran" problem, and it would also allow web sites to train users to use the bookmark instead of clicking on random URLs.
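That is trust-on-first-use plus certificate pinning, in effect. A minimal sketch of the idea (the pin store and function names are my invention; a real version would live inside the browser, with some story for legitimate key rotation):

    import hashlib
    import json
    import os
    import socket
    import ssl

    PIN_FILE = "pins.json"  # stands in for the browser's bookmark store

    def cert_fingerprint(host, port=443):
        # Fetch the server's certificate without consulting the PKI at all.
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                der = tls.getpeercert(binary_form=True)
        return hashlib.sha256(der).hexdigest()

    def visit(host):
        pins = json.load(open(PIN_FILE)) if os.path.exists(PIN_FILE) else {}
        fp = cert_fingerprint(host)
        if host not in pins:
            pins[host] = fp  # first visit: remember the cert (the "bookmark")
            with open(PIN_FILE, "w") as f:
                json.dump(pins, f)
        elif pins[host] != fp:
            # A compromised CA can mint a cert the PKI calls valid, but it
            # cannot mint the cert we pinned on first use.
            raise ssl.SSLError("certificate for %s changed; possible MITM" % host)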

So given that HTTP security has caused deaths and is fixable, I'd say HTTP has huge problems with security. HTTP/2.0 not even attempting to fix it is a major fail IMHO.

Comment: Re:Death sentence (Score 1) 255

by ras (#46956751) Attached to: Melbourne Uber Drivers Slapped With $1700 Fines; Service Shuts Down

but it is likely the demands the Directorate will place on Uber drivers, such as mandatory criminal record checks, vehicle inspections and insurance, will make the service in Melbourne unviable.

Those aren't unreasonable demands of someone wanting to carry passengers for hire. They are checks that pretty much the entire Western world has come up with after numerous problems with unsafe, uninsured and unsavoury taxi drivers. If this is enough to make Uber unviable, then I wouldn't want to be one of their investors.

You sound oh so reasonable. Pity you didn't mention that currently the only recognised way of having those checks is to buy a taxi licence. That licence costs around $30,000 per year.

It is the $30K per year that would make UberX unviable. It has no relationship to the cost of doing those checks. I have no doubt Uber will go to the Directorate and say: "Look, sure, we can ask the drivers to send us the relevant certificates before we allocate them jobs. A roadworthy (which is what we in Australia call a vehicle inspection) is around $100, and they can send us the paid insurance bill." The answer will be a resounding no, at which point it will become obvious it has nothing to do with "safety checks".

One possible explanation of the $30K is that it is protection money, charged by the government to protect the incumbents. Who, by the way, meet the definition of a monopoly. Quoting http://www.smh.com.au/technology/technology-news/apps-put-nsw-taxi-monopoly-in-doubt-20121102-28nv6.html:

University of Sydney economist Peter Abelson said Premier and Cabcharge were so interlinked that "it's not really a duopoly, it's almost a monopoly and between them they control about 80 per cent of the cabs on Sydney streets".

A government fining the emerging competition to an incumbent monopoly, presumably because of regulatory capture, doesn't sound so reasonable, does it? In fact it pisses me off so much that I deliberately travel using these upstarts even if it is less convenient, which it often is.

+ - Google breaks its own reCAPTCHA->

Submitted by ras
ras (84108) writes "Google researchers working on recognising street numbers for Street View pointed their creation at images generated by reCAPTCHA:

To further explore the applicability of the proposed system to broader text recognition tasks, we apply it to synthetic distorted text from reCAPTCHA. reCAPTCHA is one of the most secure reverse Turing tests that uses distorted text to distinguish humans from bots. We report a 99.8% accuracy on the hardest category of reCAPTCHA.

"

Link to Original Source

Comment: Re:Thank you for the mess (Score 5, Informative) 239

by ras (#46710811) Attached to: Heartbleed OpenSSL Vulnerability: A Technical Remediation

For people who didn't follow the link chain, it has since been updated:

Important update (10th April 2014): The original content of this blog entry stated that one of our SeaCat servers detected a Heartbleed attack prior to its actual disclosure. EFF correctly pointed out that there are other tools that can produce the same pattern in the SeaCat server log (see http://blog.erratasec.com/2014... ). I don't have any hard evidence to support or reject this statement. Since there is a risk that our finding is a false positive, I have modified this entry to a neutral tone, removing any conclusions. There are real honeypots on the Internet that should provide final evidence of when Heartbleed was first broadly exploited.
