Comment Re:https "evRywhr" is 4 sites, not so much, Users. (Score 1) 43

> hosts file or client-side tracking blocker extension works for HTTPS
> just as well as for cleartext HTTP.
---
You can't use a hosts file to selectively block content. I've already stated that, to cache or to block selectively, you need to know the object type and size. You don't get that with HTTPS.
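
To make that concrete, here is a minimal sketch (hypothetical hostnames and thresholds, not my actual setup) of the per-object decision a caching/blocking proxy makes -- a decision that requires seeing response headers, i.e. cleartext HTTP or HTTPS terminated by a trusted MITM proxy, whereas a hosts file can only answer yes/no per hostname:

    # Hypothetical hostnames/thresholds -- just to illustrate the shape of the decision.
    BLOCKED_HOSTS = {"ads.example.com"}   # e.g. a tracker domain

    def proxy_decision(host: str, content_type: str, content_length: int) -> str:
        """Return 'block', 'cache', or 'pass' for a single HTTP response."""
        if host in BLOCKED_HOSTS:
            return "block"          # hosts-file-style blocking: all or nothing per host
        if content_type.startswith("image/") and content_length < 512 * 1024:
            return "cache"          # small static objects are good cache candidates
        return "pass"               # everything else: fetch fresh

    # Over plain HTTPS an intermediate proxy sees only "CONNECT host:443";
    # content_type and content_length are simply not available to it.
    print(proxy_decision("ads.example.com", "image/gif", 43))        # block
    print(proxy_decision("static.example.org", "image/png", 20480))  # cache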

> There are anecdotal reports that HTTP/2 over TLS can have less latency
> than cleartext HTTP/1.1. So if you add HTTP/2 to your MITM, you may be
> able to mitigate some of the TLS overhead.
---
Interesting, but it would be highly dependent on the type of traffic. HTTP/2 was supposed to help response time by combining multiple requests, including allowing requests from diverse sources to be combined, so it would be unsurprising if it worked under some traffic loads. This is especially true compared to uncached cleartext.

However, I doubt HTTP/2 proponents would be interested in doing benchmarks where 33% of the cleartext HTTP requests had 0 latency due to being locally cached.

Maybe it goes w/o saying, but combining the requests is the opposite of what would be necessary to block or locally cache 33% of the content.
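
To put a toy number on that benchmarking point -- only the 33% hit ratio comes from above; the origin latencies are made-up round figures:

    def avg_latency_ms(origin_ms: float, cache_hit_ratio: float) -> float:
        """Cache hits are served locally at ~0 ms; misses pay the full origin latency."""
        return (1.0 - cache_hit_ratio) * origin_ms

    http11_with_cache = avg_latency_ms(80.0, 0.33)  # cleartext HTTP/1.1 + local cache
    http2_no_cache    = avg_latency_ms(60.0, 0.0)   # HTTP/2 over TLS; assume multiplexing saves 25%

    print(f"HTTP/1.1 + local cache: {http11_with_cache:.1f} ms")  # ~53.6 ms
    print(f"HTTP/2 over TLS       : {http2_no_cache:.1f} ms")     # 60.0 ms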

Comment Re:https "evRywhr" is 4 sites, not so much, Users. (Score 1) 43

> That's true only if your ISP is using an intercepting proxy.
---
Right -- they are a large corporation. Do you think they couldn't be ordered to do so, and to say nothing about it, under the Patriot Act? Do you doubt that root CAs in the US, or in other countries that do such monitoring, could be forced to give out subordinate CAs to install at ISP monitoring sites?

> Blocking "by site" is still possible with HTTPS...blocking at a finer level than "by
> site" or "intermediate caching" still requires MITM.

I've always blocked by site and media type, and for anything unclear I looked at the raw HTTP. That's no longer possible unless a user sets up MITM proxying, which lowers security for all HTTPS sites (finance, et al.). While I can add exceptions to whitelist sites whose content shouldn't be cached, those connections are still decrypted.

One has to know content type and size to cache anything effectively. Right now, looking back over the past ~3500 requests, I see stats of:
(mem = served from squid's memory cache, dsk = from its disk cache; in each line the first pair is the request hit ratio, the second the byte hit ratio)
mem: 8% (313/3514), 16% (11M/70M)
dsk: 23% (842/3514), 10% (7.2M/70M)
tot: 32% (1155/3514), 26% (19M/70M)
& for double that:
mem: 5% (367/7025), 9% (12M/126M)
dsk: 21% (1523/7025), 14% (18M/126M)
tot: 26% (1890/7025), 23% (29M/126M)
---
Without MITM caching, those numbers drop to near 0 for HTTPS sites. Those cached objects are served to multiple browsers, OSes, machines and users. Losing the ability to cache 25-30% of objects hurts interactive use and raises latency. Simply going with HTTPS instead of HTTP creates increased server load and increased network latency. Sites that serve many static images are affected more heavily. But my local network cache provides 128G of space (55% used) and can store large ISO images that can be re-served months later. With my monthly traffic, a 25% savings can easily run in the 500G range, which is, by itself, well over many ISP-imposed limits before extra charges kick in.
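
The rough arithmetic, with the byte hit ratios quoted above and a hypothetical round 2 TB/month of traffic through the proxy:

    # Byte hit ratios from the squid stats above; the monthly volume is a
    # hypothetical round number used only for illustration.
    for cached_mb, total_mb in [(19, 70), (29, 126)]:
        print(f"byte hit ratio: {cached_mb / total_mb:.0%}")   # ~27% and ~23%

    monthly_traffic_gb = 2000   # hypothetical: ~2 TB/month through the proxy
    byte_hit_ratio = 0.25
    print(f"saved per month: ~{monthly_traffic_gb * byte_hit_ratio:.0f} GB")  # ~500 GB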

> Intercepting proxies cache HTTPS only if the user has chosen to trust the proxy.
---
Which is why converting most traffic to HTTPS instead of HTTP hurts caching proxies the most and allows easier tracking by sites like Google. From the time I connect to some sites until I leave, Google et al. have encrypted connections going. They can easily track which sites I visit and where I am on each site, without doing any special MITM interception using fed-provided CAs from US-based certificate authorities.

My interest has been in promoting a faster browsing experience (something I've had success with, given feedback from those using the MITM proxies), as well as increasing privacy by blocking sites based on which sites they are called or referenced from. You can't do that if the site you are connecting to is HTTPS-based.
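
A minimal sketch of that kind of referer-based blocking (hypothetical hostnames and policy); it only works when the proxy can see which site a request is being made from, which HTTPS hides from any intermediary:

    # Hypothetical policy: a third-party host is allowed only when referenced
    # from sites where it is actually wanted, and blocked everywhere else.
    ALLOWED_REFERERS = {
        "tracker.example.net": {"shop.example.com"},
    }

    def allow_request(host: str, referer_host: str) -> bool:
        allowed = ALLOWED_REFERERS.get(host)
        if allowed is None:
            return True                 # host not under any referer policy
        return referer_host in allowed

    print(allow_request("tracker.example.net", "shop.example.com"))  # True
    print(allow_request("tracker.example.net", "news.example.org"))  # False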

I see no benefit to HTTPS for "normal usage" -- only harm for the user and benefit for the sites -- especially large data-collection sites like Google.

Comment Re:https "everywhere" is 4 websites, not so much U (Score 1) 43

With cleartext HTTP, every router on the path is already capable of playing MITM. What do I care if they "see" which kernel version or open-source project I download? Who cares if they see the articles I am reading or writing on Slashdot?

There is no improvement, since Google sees almost all the traffic anyway -- it is tied into almost every site -- and HTTPS doesn't help a bit. And they, in turn, can hand the info over to any government agency that asks for it, and be forced not to tell you about it.

HTTPS is a wet security blanket.

Public key pinning? No -- traffic can still be intercepted at the ISP level -- I'm sure larger ISPs can get a root cert. When you connect to an encrypted site, you really connect to your ISP's pass-through traffic decoder, which then opens another encrypted connection on to wherever you were going.
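
For what it's worth, the mechanics of such a pass-through decoder are simple enough to sketch. This is a toy single-destination relay (hypothetical cert file names, no SNI handling, no caching, no error handling), not a claim about what any ISP actually runs: terminate TLS toward the client with a cert the client has been made to trust, open a second TLS connection to the real site, and shuttle the decrypted bytes between the two.

    import socket, ssl, threading

    LISTEN = ("0.0.0.0", 8443)
    UPSTREAM = ("www.example.com", 443)   # hypothetical destination

    def pump(src, dst):
        """Copy bytes one way until either side closes, then close the other side."""
        try:
            while data := src.recv(4096):
                dst.sendall(data)
        except OSError:
            pass
        finally:
            try:
                dst.close()
            except OSError:
                pass

    def handle(client_sock):
        # Client-facing side: our own cert/key, signed by a CA the client trusts.
        srv_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        srv_ctx.load_cert_chain("proxy-cert.pem", "proxy-key.pem")  # hypothetical files
        client_tls = srv_ctx.wrap_socket(client_sock, server_side=True)

        # Upstream side: a normal TLS connection to the real server.
        upstream_tls = ssl.create_default_context().wrap_socket(
            socket.create_connection(UPSTREAM), server_hostname=UPSTREAM[0])

        # Relay plaintext both ways -- this is the point where a proxy could
        # inspect, cache, or block individual objects.
        threading.Thread(target=pump, args=(client_tls, upstream_tls), daemon=True).start()
        pump(upstream_tls, client_tls)

    with socket.create_server(LISTEN) as server:
        while True:
            conn, _ = server.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()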

HTTPS safety is an "illusion" to get you to use it so you can't easily be selective about what you block or cache by site.

Caching rate on HTTP sites: 10-30% or higher; on HTTPS: 0%, and on top of that there's the overhead of encrypting.

Comment Re:https "everywhere" is 4 websites, not so much u (Score 1) 43

That's what I meant by HTTPS "everywhere" harming security for those sites that have a legitimate need for it. Implementing a MITM proxy makes all HTTPS streams less secure. I don't like that trade-off (not that I haven't already implemented such a proxy for myself).

At the same time, Google is pushing for "certificate transparency" (https://www.certificate-transparency.org/what-is-ct), which might not allow home-user-issued certs to be used for such purposes -- not sure. The more internal proxies implement MITM HTTPS for their internal needs/wants, the harder those who don't want those streams to be easily visible or cacheable will work to disable that "hole"... (IMNSHO)...

Comment https "everywhere" is 4 websites, not so much usrs (Score 1) 43

HTTPS on "social" sites (i.e., not the bank/finance/medical etc. sites that traditionally need encryption) mostly benefits the site -- not so much the user. It usually harms users more than not, as it prevents content caching and local filtering. On an HTTPS site, I can cache next to nothing in my squid proxy (which is used by more than one account and user). And it allows much tighter tracking of individuals as they go from site to site.

On news and discussion sites, I can easily get over 25% of requests satisfied locally -- and housemates notice the difference -- especially on heavily thumbnailed sites like YouTube.

Think twice about HTTPS everywhere, as it ends up making it standard practice for gateway owners (companies or individuals) to install ways to get around it.

That harms the "sensitive" sites that really should use HTTPS (finance, medical, etc.).

Comment Re:FB is a de facto monopoly, just like Microsoft (Score 1) 65

I was wondering how FB's actions are not anti-competitive, but it's because they don't own the market to anywhere near the degree MS did at the time. And with the ultra-conservatives having disposed of the old FTC and replaced it with people whose only qualification was supporting the then-current administration, it's less likely the agency would even know what to do if it wanted to act.

Comment Re:entrapment (Score 1) 176

And that's why we have a court system. ;^)

At http://legal-dictionary.thefre..., they say this (among many other things):

"The [entrapment] defense is not available if the officer merely created an opportunity for the commission of the crime by a person already planning or willing to commit it."

If someone has never seen a child porn site, is curious about why anyone would find such things sexual, and stumbles upon one due to there being over 300% more such sites (because police are running them), they might satisfy their curiosity -- whereas if they had to search for such a site, they would never have visited one, as that would be too much work.

It really depends on how such sites are found and how much a 300% increase in site availability would affect those who only have idle curiosity about such matters.

I can't help but think about how the feds, at one point, told the states that they couldn't allow legal, intrastate cannabis, as it would affect prices outside of the state and would thus be affecting interstate commerce. It was seen as a simple matter of supply & demand, with higher supply resulting in lower prices.

That was considered a "given". Here, if the cop-operated sites were a significant percentage of all such sites, I can't help but feel they would attract people who otherwise would not have been planning or willing to put in the work to find one, and would never have visited such a site.

But I can see you firmly disagree, and like I said -- that's why we have judges (and juries).

Comment Re: entrapment (Score 1) 176

The article said they took down one site but operated it for 13 days to entice more people. Fine. But then it said they got permission to operate 23 such sites. To me that sounded like they took the one site and made 22 copies of it. There was nothing to indicate they got the other 22 by busting 22 different servers -- at least not from what the article said.

Why would they need permission to operate 23 such sites, when it sounds like they needed no permission at all to bust one site and operate it for two weeks after the bust? No -- they needed permission because they wanted to set up 22 more copies of the site. Taking down a site and leaving it running for two weeks to uncover related activity is well within the scope of an investigation needing no pre-approval. The fact that they did need pre-approval indicates advance planning: they wanted to set up 22 copies to go after unrelated traffic, putting out more sites to cast a wider, more enticing net.

Doesn't sound ethical to me.

Comment Re: entrapment (Score 1) 176

That's what I was thinking -- if the FBI was supplying a significant number of porn sites compared to what is normally there, that is likely to have an effect.

Compare this to the argument that intra-state sales of something affect inter-state pricing because they affect availability. If the FBI is increasing availability, doesn't that mean it's lowering the difficulty of randomly coming across such a site and being tempted by it?

If the FBI increases supply by 383% (23 more sites compared to the original 6), then it seems it would be considerably easier to find such a site through whatever channels news of such sites spreads. That HAS to bring in more people than the original 6 sites did.

Note -- if the FBI had added only 1-2 sites to the original 6, the increased availability would be noticeable but *arguably* *necessary* to carry out the operation. But in providing a 383% increase in the availability of such sites -- 80% of all such sites -- they have to realize they are likely advertising them to people who would never have known about them otherwise. The FBI is increasing the number of offenders by reaching a larger audience. That seems unethical at the least, and should, IMWO (in my worthless opinion :-)), not be legal.

Comment Re:If you can afford a [fleet of] Teslas... (Score 1) 305

But say you bought a fleet of them just to rent out as a taxi service.

In fact, why not have a fleet of them be a taxi service? No people required.

That's related to this article, which was talking about who gets to own the fruits of automation -- if I own a fleet of androids/robots and hire them out as lawyers or doctors (and they are fully trained and knowledgeable in whatever field I program them for), do I get the full benefit of their employment?

What if I bought them from Tesla -- does Tesla get to claim it is only licensing me the training SW that lets them do the job, and can it therefore require that I pay it a share?

Of course, the above completely ignores the question of who developed the software or the robots/androids, and the likelihood that those developers are now unemployed because someone invented SW to write new SW for the chosen professions and ended up putting themselves out of business... etc. Of course, their "employers" own all the fruits of their labor... that's how capitalism works. You hire "workers" to do work for you, and you get the output of their work even if it is something that will permanently replace them. In other words, if someone was worth $120K/year as a SW developer and at age 25 developed SW that permanently unemploys them, is it really "fair" that they get nothing?

Note -- I'm not talking about what is "legal", or what the employer can get away with, since that can change with the law. For example, it used to be the case in California that employers could claim to own anything you developed while employed with them; now, if you develop something outside work on your own time and equipment, they don't automatically own that outside work (not true in all states, and surely not in all countries).

Nevertheless -- that law changed, so what was legal to do 30-40 years ago may not be legal now or tomorrow...
