Comment: Re:What is wrong with SCTP and DCCP? (Score 1) 33

by swillden (#49503565) Attached to: Google To Propose QUIC As IETF Standard

SCTP, for one, doesn't have any encryption.

Good, there is no reason to bind encryption to transport layer except to improve reliability of the channel in the face of active denial (e.g. TCP RST attack).

I disagree. To me there's at least one really compelling reason: To push universal encryption. One of my favorite features of QUIC is that encryption is baked so deeply into it that it cannot really be removed. Google tried to eliminate unencrypted connections with SPDY, but the IETF insisted on allowing unencrypted operation for HTTP2. I don't think that will happen with QUIC.

But there are other reasons as well, quite well-described in the documentation. The most significant one is performance. QUIC achieves new connection setup with less than one round trip on average, and restart with none... just send data.
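
To make the handshake math concrete, here is a rough back-of-the-envelope sketch of my own (assuming TLS 1.2 and a 100 ms round trip; exact counts vary with the TLS version and extensions in play):

```python
# Rough comparison of round trips needed before the first byte of application
# data can be sent. Ballpark figures for illustration, not measurements;
# assumes TLS 1.2 without False Start or similar tricks.

RTT_MS = 100  # assumed round-trip time, e.g. a mediocre mobile link

handshake_rtts = {
    "TCP + TLS 1.2, new connection":     1 + 2,  # TCP 3-way handshake, then full TLS
    "TCP + TLS 1.2, resumed session":    1 + 1,  # TCP handshake, abbreviated TLS
    "QUIC, new connection (worst case)": 1,
    "QUIC, repeat connection (0-RTT)":   0,
}

for scenario, rtts in handshake_rtts.items():
    print(f"{scenario}: {rtts} RTT(s) ~= {rtts * RTT_MS} ms before data flows")
```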

Improvements to TCP help everything layered on top of it.

True, but TCP is very hard to change. Even with wholehearted support from all of the major OS vendors, we'd have lots of TCP stacks without the new features for a decade, at least. That would not only slow adoption, it would also mean a whole lot of additional design complexity forced by backward compatibility requirements. QUIC, on the other hand, will be rolled out in applications, and it doesn't have to be backward compatible with anything other than previous versions of itself. It will make its way into the OS stacks, but systems that don't have it built in will continue using it as an app library.

Not having stupid unnecessary dependencies means I can benefit from TLS improvements even if I elect to use something other than IP to provide an ordered stream or I can use TCP without encryption and not have to pay for something I don't need.

So improve and use those protocols. You may even want to look to QUIC's design for inspiration. Then you can figure out how to integrate your new ideas carefully into the old protocols without breaking compatibility, and then you can fight your way through the standards bodies, closely scrutinized by every player that has an existing TLS or TCP implementation. To make this possible, you'll need to keep your changes small and incremental, and well-justified at every increment. Oh, but they'll also have to be compelling enough to get implementers to bother. With hard work you can succeed at this, but your timescale will be measured in decades.

In the meantime, QUIC will be widely deployed, making your work irrelevant.

As for using TCP without encryption so you don't have to pay for something you don't need, I think you're both overestimating the cost of encryption and underestimating its value. A decision that a particular data stream doesn't have enough value to warrant encrypting it is guaranteed to be wrong if your application/protocol is successful. Stuff always gets repurposed, and sufficient re-evaluation of security requirements is rare (even assuming the initial evaluation wasn't just wrong).

TCP+TFO + TLS extensions provide the same zero RTT opportunity as QUIC without reinventing wheels.

Only for restarts. For new connections you still have all the TCP three-way handshake overhead, followed by all of the TLS session establishment. QUIC does it in one round trip, in the worst case, and zero in most cases.

There was much valid (IMO) criticism of SPDY: it really only helped well-optimized sites -- like Google's -- perform significantly better. Typical sites aren't any slower with SPDY, but they aren't much faster either, because they're so inefficient in other areas that request bottlenecks aren't their problem, so fixing those bottlenecks doesn't help. But QUIC will generally cut between two and four RTTs out of every web browser connection. And, of course, it also includes all of the improvements SPDY brought, plus new congestion management mechanisms which are significantly better than what's in TCP (so I'm told, anyway; I haven't actually looked into that part).
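
To put rough numbers on "two to four RTTs per connection", here is a quick back-of-the-envelope sketch; the connection count and RTT below are my own assumptions, not measurements:

```python
# Estimate what saving 2-4 RTTs per connection means for a typical page load.
# Both inputs below are assumptions chosen only for illustration.

rtt_ms = 80       # assumed round-trip time to the server
connections = 6   # assumed number of connections a browser opens for one page

low  = connections * 2 * rtt_ms
high = connections * 4 * rtt_ms
print(f"Setup latency removed per page load: roughly {low}-{high} ms")
```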

I'm not saying the approach you prefer couldn't work. It probably could. In ten to twenty years. Meanwhile, a non-trivial percentage of all Internet traffic today is already using QUIC, and usage is likely to grow rapidly as other browsers and web servers incorporate it.

I think the naysayers here have forgotten the ethos that made the Internet what it is: Rough consensus and running code first, standardization after. In my admittedly biased opinion (some of my friends work on SPDY and QUIC), Google's actions with SPDY and QUIC aren't a violation of the norms of Internet protocol development, they're a return to those norms.

Comment: Too busy to rip the radio out of my car (Score 2) 157

by EmperorOfCanada (#49502329) Attached to: Norway Will Switch Off FM Radio In 2017
It is only the fact that I have been too busy that has kept me from ripping the radio out of my car. I have a screen/computer to put into it that will play lectures, audio books, podcasts, etc. I also have the hardware ready to replace my dashcam with a series of cameras that can not only record but also upload via a data plan if needed.

At no point in my shopping did I even look for an FM or even an AM add-on option. And I certainly never looked for satellite radio (those things just piss me off in rentals).

To me, even satellite radio is so 20th century. DAB is also just a band-aid to try to keep the radio-station media companies relevant.

But the reality is that this isn't a technology issue. For the last portion of the 20th century a variety of media conglomerates bought up all the radio stations and turned them into MBA masturbatory dreams: all profit with no content. About the last time I listened to radio was just before a DJ I know told me that his new format was to go into work, record all his blurbs between songs in one long scripted 1.5-hour session (interviews included), and then go home. The songs and his blurbs were all run automatically by the computer.

The few things from NPR, the BBC, or the CBC that I do care about (Art of Persuasion, Quirks, This American Life, etc.) I download. But even the CBC is on a march further and further to the PC left, and I can't stomach having one great feature cut short so they can give massive amounts of time to someone with some extreme view on some stupid social issue and let them grind their axe endlessly.

So the best of what's on radio today is worse than silence. Meanwhile my own playlist is awesome, and the technology to play it is sitting in a drawer so that I don't have to keep using my stupid FM transmitter to get audio off my iPhone.

So, just as my car didn't come with an ashtray, I want my next car not to come with a radio, DAB or not.

Comment: Re:Simple (Score 2) 189

by swillden (#49502187) Attached to: Ask Slashdot: What Features Would You Like In a Search Engine?

False analogy. There's a huge difference between a personal assistant, who by definition *I* know personally, and a faceless business entity who I know not at all (read adversarial entity) scraping 'enough' information about me to presume it knows me sufficiently to second guess what I want and give me that instead of what I requested.

Not really.

I'd say there's a good argument that all of the information I give Google actually exceeds what a personal assistant would know about me. The real difference (thus far) lies in the assistant's ability to understand human context, which Google's systems lack. But that's merely a problem to be solved.

Note, BTW, that I'm not saying everyone should want what I want, or be comfortable giving any search engine enough information to be such an ideal assistant. That's a personal decision. I'm comfortable with it... but I'm not yet getting the search results I want.

Comment: Re:Simple (Score 1) 189

by swillden (#49502045) Attached to: Ask Slashdot: What Features Would You Like In a Search Engine?

Why would I want crappy results? I want it to give me what I want, which by definition isn't "crappy".

And you think a system built by man can divine what you and everyone else wants at the moment you type it in? That'll be the day. Until then, assume I know what I want and not your system.

I think systems built by man that know a sufficient amount about me, my interests, and my needs can. We're not there yet, certainly, but the question was what I want... and that's it.

Put it this way: Suppose you had a really bright personal assistant who knew pretty much everything about you and could see what you are doing at any given time, and suppose this assistant also had the ability to instantly find any data on the web. I want a search engine that can give me the answers that assistant could.

Comment: Yellow pages, huffington post... both gone! (Score 1) 189

by EmperorOfCanada (#49501637) Attached to: Ask Slashdot: What Features Would You Like In a Search Engine?
First, I want a toggle so that I never, ever, ever see a given domain in my search results again: ask.com, answers.com, experts-exchange.com, HuffPo, and especially Quora, you fucking turd pile of shit; I never want to see Quora again in my life.

When I want a little pizza joint or some place that hasn't hired an "SEO" guy, all I get is page after page of directories derived from some government database or some crap like that, while the actual page is ranked at the bottom. I don't want review sites. I don't want anything that was assembled by a machine.

So a simple rule of thumb: de-list any page that offers to "upgrade" someone's listing. Full stop. I also want a toggle that will remove listings containing any version of "upgrade to our pro service." They could literally cure cancer, but if they offer to cure it one minute faster for 99 cents, I don't want to see that page.

To me, right now, nearly all of the search results are like going to a dating site and only finding hookers. Some people would argue that these sites "need" to make money, but they don't. There are lots of pages that exist for a specific reason, and many of those pages are commercial in the sense that they offer a specific service, such as a pizza place whose page is about their pizza place. Short of the recipes, the page is 100% free. But I don't want some shit "Just Eat" website; maybe it can link to the real page, but I really don't want to see it ever again.

For instance, I loved allrecipes.com, but now it is just upgrade upgrade upgrade upgrade. Some will argue that they should be allowed to make money, and that's fine, but the site existed before some MBA took over and "monetized" it, and I no longer want it to turn up in my search results. Don't ban it from the entire search engine; just ban it when you tick the "No upgrade sites" option.

The other thing I would kill for is a negative-feature option, so that any site that uses Disqus would vanish from my search results. Those scumbags need to burn in hell, and I would love any search engine that sent them there.
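
Something along these lines would do it; a minimal sketch of the kind of per-user filter I mean (the result fields and domain list are made up for illustration, not any real search engine's API):

```python
# Drop results from personally blocked domains, and optionally drop pages
# whose snippets pitch a paid "upgrade". Field names are hypothetical.
import re
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"quora.com", "answers.com", "ask.com", "experts-exchange.com"}
UPGRADE_PITCH = re.compile(r"upgrade to (our|your|the) pro", re.IGNORECASE)

def keep(result, hide_upgrade_sites=True):
    host = urlparse(result["url"]).hostname or ""
    if any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS):
        return False
    if hide_upgrade_sites and UPGRADE_PITCH.search(result.get("snippet", "")):
        return False
    return True

results = [
    {"url": "https://www.quora.com/some-question", "snippet": "..."},
    {"url": "https://littlepizzajoint.example/", "snippet": "Our wood-fired menu"},
]
print([r["url"] for r in results if keep(r)])  # only the pizza joint survives
```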

To me there is a huge opportunity for some new search engine to do to Google what Google did to all the others 17 years ago: make them completely irrelevant by brutally ignoring the wishes of the larger websites and focusing entirely on the needs of the average user.

Comment: Re:Does it report seller's location and ID? (Score 1) 140

by swillden (#49501287) Attached to: Google Helps Homeless Street Vendors Get Paid By Cashless Consumers
Sure, but that requires only very coarse -- city-level, at most -- geolocation. If I were reviewing this product for launch, I'd tell them that they can use location as a risk signal, but must coarsen it to avoid making it possible to use it for people-tracking.
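
For illustration, coarsening can be as simple as snapping coordinates to a wide grid before they ever reach the risk engine (a sketch of the general idea, not how any real payments product actually does it):

```python
# Snap coordinates to roughly city-level granularity before using them as a
# fraud-risk signal. One decimal degree is roughly an 11 km grid, which is
# useful for "is this transaction in the expected city?" but useless for
# tracking a person's movements. The granularity chosen here is illustrative.

def coarsen(lat, lng, decimals=1):
    return round(lat, decimals), round(lng, decimals)

print(coarsen(40.748817, -73.985428))  # (40.7, -74.0): Manhattan, not a street corner
```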

Comment: Re:What the fuck is the point of the ISP middleman (Score 1) 41

by swillden (#49501277) Attached to: Google Ready To Unleash Thousands of Balloons In Project Loon

If local ISPs are involved, then what the fuck is the point of this?

Not really ISPs, at least as we traditionally think of them. Mobile network operators.

Why the fuck is there still this useless ISP middleman?

The MNO in question isn't the middleman, it's the service provider. It provides service to the balloons, which relay it to regions that are too remote to service now.

For crying out loud, this whole problem exists in the first place because the local ISPs weren't able or willing to invest in the infrastructure needed to provide Internet access to these regions.

No, most of these regions aren't served because it's uneconomical. It's not that no one is willing to invest, it's that it's not an "investment" if you know up front that the ROI will be negative. Putting up a bunch of cell towers to serve remote African farmers, for example, doesn't pan out economically because there's no way the farmers can afford to pay high enough fees to cover the costs of all the infrastructure. Project Loon aims to fix this by radically lowering the cost of serving those regions, to a point where it is economical, so the fees the people in the region can afford to pay are sufficient to make serving them profitable.

As for why Google is partnering with MNOs rather than deploying its own connectivity, I don't know, but I'd guess a couple of reasons. First, I expect it will be feasible to scale faster by partnering with entities that already have a lot of the infrastructure in place, particularly when you consider all of the legal and regulatory hurdles (which in many areas means knowing who to bribe, and how -- Google, like most American companies, would not be very good at that). Second, by working through local companies Google avoids getting into power struggles with the local governments. Google is helping their local businesses grow, not replacing them.

(Disclaimer: I work for Google, but I don't know anything more about this than what I see/read in the public press.)

Comment: Re:vs. a Falcon 9 (Score 1) 56

by Bruce Perens (#49501071) Attached to: Rocket Lab Unveils "Electric" Rocket Engine

They can carry about 110 kg to LEO, compared to the Falcon 9's 13,150 kg. That's 0.84% of the payload capacity. A launch is estimated to cost $4,900,000, compared to the Falcon 9's $61,200,000. That's 8.01%. That means cost per mass to orbit is nearly an order of magnitude worse.

Yes, this is a really small rocket. If you are a government or some other entity that needs to put something small in orbit right away, the USD$5 Million price might not deter you, even though you could potentially launch a lot of small satellites on a Falcon 9 for less.
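
For what it's worth, the ratios quoted above check out; a quick sketch of the arithmetic, using the figures as quoted in the post:

```python
# Cost-per-kilogram comparison using the numbers quoted above.
small_payload_kg, small_cost_usd = 110, 4_900_000
falcon9_payload_kg, falcon9_cost_usd = 13_150, 61_200_000

print(f"Payload ratio: {small_payload_kg / falcon9_payload_kg:.2%}")   # ~0.84%
print(f"Price ratio:   {small_cost_usd / falcon9_cost_usd:.2%}")       # ~8.01%
print(f"$/kg to LEO:   {small_cost_usd / small_payload_kg:,.0f} vs "
      f"{falcon9_cost_usd / falcon9_payload_kg:,.0f}")                 # roughly a 9.6x difference
```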

And it's a missile affordable by most small countries, if your payload can handle the re-entry on its own. Uh-oh. :-)

Comment: Seeing that they can use secret courts... (Score 1) 119

by EmperorOfCanada (#49500177) Attached to: Twitter Moves Non-US Accounts To Ireland, and Away From the NSA
Seeing that they can use secret courts, I would suspect that they will order Twitter employees to just hand the data or access over anyway. Then, when the employees balk, it can be handled in a secret court where nobody knows the results. Even better, I could see a situation where they identify an employee or two and order them to hand the data over without even allowing them to tell Twitter about the court order (if they can't tell some people, then why couldn't these orders be kept from their boss as well?).

Lastly, they could just get an overqualified NSA employee to take a job there and inject the needed back doors. Don't think of this as a lone hacker attack, but as a single guy with a massive support team: someone who could do off-the-scale things like swapping out someone's desktop/laptop (spaghetti stains and all) with a compromised machine, letting the "cleaners" in so they can wire their own fibre-optic cables right into the server room, swapping out pretty much anything Cisco for compromised machines with matching serial numbers, etc.

Also, moving the servers offshore actually frees the NSA to attack with even fewer legal restrictions, so full-on sabotage may even be a perfectly valid procedure.

So, short of eliminating all American employees and doing exhaustive background checks, the only way to stop this stuff from being done to them is to convince legislators to curtail what the NSA can actually do.

Comment: Re:What's the problem? (Score 1) 180

by swillden (#49497259) Attached to: Social Science Journal 'Bans' Use of p-values

There really aren't any good ways to measure those other effects. If you knew how your experiment was biased, you'd try and fix it.

Randomized sampling goes a long way, but only if you have a large enough population. This is one of the problems of social sciences. A randomized 10% subsample from 100 subjects ain't gonna cut it. A randomized subsample from 10,000,000 people isn't going to get funded.

Why wouldn't a randomized subsample from 10M people get funded? The required sample size doesn't grow as the population does.
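
A quick illustration of why, using the standard sample-size formula for estimating a proportion (the 5% margin of error and 95% confidence level below are arbitrary choices for the sketch):

```python
# Required sample size for a proportion at 95% confidence and a 5% margin of
# error, with the finite population correction. Note how little it changes
# once the population is more than a few thousand.
import math

def sample_size(margin, population, z=1.96, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population requirement
    return n0 / (1 + (n0 - 1) / population)        # finite population correction

for population in (100, 10_000, 10_000_000):
    print(f"{population:>10,} people -> sample of ~{math.ceil(sample_size(0.05, population))}")
# ~80 out of 100, but ~370 out of 10,000 and ~385 out of 10,000,000:
# the required sample size barely grows with the population.
```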

Comment: Re:What's the problem? (Score 4, Insightful) 180

by swillden (#49495247) Attached to: Social Science Journal 'Bans' Use of p-values

Actually, p-values are about CORRELATION. Maybe *you* aren't well-positioned to be denigrating others as not statistical experts.

I may be responding to a troll here, but, no, the GP is correct. P-values are about probability. They're often used in the context of evaluating a correlation, but they needn't be. Specifically, p-values specify the probability that the observed statistical result (which may be a correlation) could be a result of random selection of a particularly bad sample. Good sampling techniques can't eliminate the possibility that your random sample just happens to be non-representative, and the p value measures the probability that this has happened. A p value of 0.05 means that there's a 5% chance that your results are bogus in this particular way.
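
A tiny simulation makes that definition concrete (my own sketch, not from the original discussion; it tests pure noise, where the null hypothesis is true by construction):

```python
# With no real effect at all, random sampling alone produces "p < 0.05"
# roughly 5% of the time. Two-sample test on pure noise, normal approximation.
import random
from math import erf, sqrt
from statistics import mean, stdev

def two_sided_p(a, b):
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    z = abs(mean(a) - mean(b)) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

random.seed(0)
trials = 2000
hits = sum(
    two_sided_p([random.gauss(0, 1) for _ in range(50)],
                [random.gauss(0, 1) for _ in range(50)]) < 0.05
    for _ in range(trials)
)
print(f"'Significant' findings from pure noise: {hits / trials:.1%}")  # typically about 5%
```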

The problem with p values is that they only describe one way that the experiment could have gone wrong, but people interpret them to mean overall confidence -- or, even worse, significance -- of the result, when they really only describe confidence that the sample wasn't biased due to bad luck in random sampling. The result could have been biased because the sampling methodology wasn't good. It could have been meaningless because it finds an effect which is real but negligibly small. It could be meaningless because the experiment was just badly constructed and didn't measure what it thought it was measuring. There could be lots and lots of other problems.

There's nothing inherently wrong with p values, but people tend to believe they mean far more than they do.

"Someone's been mean to you! Tell me who it is, so I can punch him tastefully." -- Ralph Bakshi's Mighty Mouse

Working...