Comment: Re:Alternative explanation (Score 1) 393

by Alsee (#47541327) Attached to: Enraged Verizon FiOS Customer Seemingly Demonstrates Netflix Throttling

The sending provider pays the receiving provider for the bandwidth, and this is the only rational way it can be.

Right..... because when Verizon customers pay for internet connection service, and Verizon customers request pages and media from Wikipedia.... Wikipedia should pay Verizon. That totally makes sense. On crack.

packets originating on their network

Everything is originating on Verizon's network..... Verizon customers are the ones wanting to open a connection to Netflix and request the data.

When I make a phone call to someone, and I spend 99% of the call listening to what that person has to say, NO ONE is going to buy that my local phone company can SEND A BILL TO THE PERSON I CALLED.


Comment: Re:Alternative explanation (Score 1) 393

by Alsee (#47541259) Attached to: Enraged Verizon FiOS Customer Seemingly Demonstrates Netflix Throttling

Level3 is trying to charge Verizon an exorbitant rate for enough bandwidth to handle that peer. Verizon said "No"

No. Level3 offered to upgrade the connection FOR FREE. Level3 offered to pay 100% of the cost of the extra hardware to upgrade the link and GIFT it to Verizon.

The second part of your comment was correct.... the part about Verizon saying "No". Verizon doesn't want the problem fixed for free - Verizon wants to use their monopoly position to bottleneck their customers' data streams, to try to extort a slice of the content-revenue-stream pie.

Verizon has plenty of bandwidth, Netflix has plenty of bandwidth

Yep. Verizon themselves put out a graphic showing that there's abundant bandwidth, and that the entire problem is the one chokepoint where they're linked to Level3. Which Level3 offered to foot 100% of the bill of fixing.


Comment: Re:Could be a different route involved for the VPN (Score 1) 393

by Alsee (#47541207) Attached to: Enraged Verizon FiOS Customer Seemingly Demonstrates Netflix Throttling

a small step away from saying that Verizon should provide free internet services for every service their customers request.

Screw "a small step away".
Verizon should provide free internet services for every service their customers request.

The customer is paying for internet service, and the ISP goddamn well needs to provide round-trip delivery of the customer's internet data, up to the quantity and speed THAT THE CUSTOMER PAID FOR.

The truly insane thing here is that Level3 has gone to the absurd length of offering to pay 100% of the cost - GIFTING Verizon the additional network cards and cables to expand the link and fix the problem. Verizon refused. Verizon isn't happy being a network provider - they see the revenue Netflix and others get as content providers, and Verizon doesn't want the connection problem fixed for free.... Verizon wants to extort Netflix for a permanent revenue stream from the content pie. Verizon is abusing their monopoly power to bottleneck customers' data.... trying to force Netflix to raise prices and pay that extra money as a KICKBACK to Verizon. Verizon is abusing their monopoly position to gouge their own customers - and trying to make that price-gouging show up on customers' Netflix bills rather than on Verizon's own bills.


United States

When Spies and Crime-Fighters Squabble Over How They Spy On You 120

Posted by timothy
from the we-may-or-may-not-have-done-that dept.
The Washington Post reports in a short article on the sometimes strange, sometimes strained relationship between spy agencies like the NSA and CIA and law enforcement (as well as judges and prosecutors) when it comes to evidence gathered using technology or techniques that the spy agencies would rather not disclose at all, never mind explain in detail. They may both be arms of the U.S. government, but the spy agencies and the law enforcers covet different outcomes. From the article: [S]ometimes it's not just the tool that is classified, but the existence itself of the capability — the idea that a certain type of communication can be wiretapped — that is secret. One former senior federal prosecutor said he knew of at least two instances where surveillance tools that the FBI criminal investigators wanted to use "got formally classified in a big hurry" to forestall the risk that the technique would be revealed in a criminal trial. "People on the national security side got incredibly wound up about it," said the former official, who like others interviewed on the issue spoke on condition of anonymity because of the topic's sensitivity. "The bottom line is: Toys get taken away and put on a very, very high shelf. Only people in the intelligence community can use them." ... The DEA in particular was concerned that if it came up with a capability, the National Security Agency or CIA would rush to classify it, said a former Justice Department official.
Data Storage

Intel Launches Self-Encrypting SSD 91

Posted by Soulskill
from the masochistic-storage-devices dept.
MojoKid writes: Intel just launched their new SSD 2500 Pro series solid state drive, the follow-up to last year's SSD 1500 Pro series, which targets corporate and small-business clients. The drive shares much of its DNA with some of Intel's consumer-class drives, but the Pro series cranks things up a few notches with support for advanced security and management features, low power states, and an extended management toolset. In terms of performance, the Intel SSD 2500 Pro isn't class-leading compared to many enthusiast-class drives, but it's no slouch either. Intel differentiates the 2500 Pro series by adding support for vPro remote management and hardware-based self-encryption. The 2500 Pro series supports TCG (Trusted Computing Group) Opal 2.0 features and is Microsoft eDrive capable as well. Intel also offers an administration tool for easy management of the drive. With the Intel administration tool, users can reset the PSID (physical presence security ID), though the contents of the drive will be wiped. Sequential reads are rated at up to 540MB/s, sequential writes at up to 480MB/s, with 45K–80K random read/write IOPS.

Comment: Re:Sounds like Swordfish (the movie). (Score 1) 435

by khasim (#47476389) Attached to: FBI Concerned About Criminals Using Driverless Cars

Of course I am postulating that a hacker can break it.

No. You are postulating that a hacker that can break it WOULD TURN TO CRIME INSTEAD OF MAKING $150,000+ A YEAR WORKING FOR A COMPANY THAT MANUFACTURES THOSE CARS.

Why would the car be the only computer in human creation immune to hacking you completely absurd asshat?

No one except you has claimed that.

I'm saying that the skills needed to crack that system are very rare AND very valuable IN LEGITIMATE BUSINESS SETTINGS.

So WHY would someone who could make a lot of money LEGALLY use those very rare skills in a crime? Why would that person WANT to become a criminal?

Comment: Sounds like Swordfish (the movie). (Score 1) 435

by khasim (#47470923) Attached to: FBI Concerned About Criminals Using Driverless Cars

You do not need the skill to program. You just need the leverage to make someone who has the skills do it for you.

Yeah, just like in the movie. Swordfish.

Why "swordfish"? Because the password is always "swordfish".

Once it is done once, it becomes much easier to do a second time.

You are still postulating a hacker that can crack the protections that Google's programmers have put around the code already.

Additionally, now you are also required to:

a. learn which of the hackers in the world is capable of defeating those protections/re-programming the vehicle

b. force/entice that hacker to do so

c. prevent that hacker from selling the exploit to Google before you've completed your crime(s)

And once it is done it will become MORE difficult because Google will issue a patch or recall to prevent it.

Comment: Re:Drug mule? How? (Score 1) 435

by khasim (#47470639) Attached to: FBI Concerned About Criminals Using Driverless Cars

Dammit, I never rented that driverless car. Yes, I know that it was my credit card and I hadn't reported it stolen, but it wasn't me!

Simple denials do not work with the police. Particularly if you can be placed at the same location as the autonomous car was.

paid for rentals aren't generally reported as stolen.

You are using your credit card to rent a vehicle that will be carrying illegal drugs. That is not a good idea if you do not want to be caught.

Unless you have data to show it is significantly anomalous, it is irrelevant.

No. It would be anomalous. Unless vehicle usage changes dramatically once autonomous cars are introduced. Unoccupied vehicles between cities would probably not be the norm.

random stops?

Not "random". The vehicle is stopped because it is suspicious. The reason it is suspicious is because it is between cities without an occupant.

Cops don't just go pulling over and searching vehicles on a random basis.

Again, not "random". See above.

And again, drug mules are only effective if they appear to belong to a category that the police are not interested in. If YOU can think that an unoccupied car would be a good drug mule then the police can think the same thing.

Comment: I doubt it. (Score 2) 435

by khasim (#47470461) Attached to: FBI Concerned About Criminals Using Driverless Cars

... despite of me being an engineer, and a computer scientist, ...

Okay, so you claim to be an engineer AND a computer scientist. That means a LOT of math classes for you.

A driverless car cannot stop within abrupt short time.

Yes it can. That's basic math. Stopping distance is determined by 3 things:

1. reaction time (computers are quicker than humans)

2. speed

3. surface conditions

So the autonomous car should stop in a shorter distance than a human would.
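The claim that a computer should out-brake a human follows directly from the standard stopping-distance formula: total distance = reaction distance (speed × reaction time) + braking distance (speed² / (2 × friction × g)). A minimal sketch of that arithmetic follows; the reaction times and friction coefficient are illustrative assumptions, not measured values for any real car.

```python
# Standard stopping-distance model:
#   d = v * t_react + v**2 / (2 * mu * g)
# Only the reaction term differs between a human and a computer driver;
# the braking term depends on speed and surface conditions alone.

G = 9.81  # gravitational acceleration, m/s^2


def stopping_distance(speed_ms, reaction_s, friction):
    """Total stopping distance in metres: reaction distance plus braking distance."""
    reaction_dist = speed_ms * reaction_s
    braking_dist = speed_ms ** 2 / (2 * friction * G)
    return reaction_dist + braking_dist


speed = 27.0    # roughly 60 mph, in m/s
mu_dry = 0.7    # assumed dry-asphalt friction coefficient

human = stopping_distance(speed, 1.5, mu_dry)     # ~1.5 s typical human reaction
computer = stopping_distance(speed, 0.1, mu_dry)  # ~0.1 s assumed computer reaction

print(f"human: {human:.1f} m, computer: {computer:.1f} m")
```

With those assumed numbers the computer stops dozens of metres shorter at highway speed, purely because its reaction term is smaller; the braking term is identical for both.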

Just one, one only, example: If presented by either hitting a 4-year-old child or an octogenarian; ...

Someone with a degree in computer science should know that computers only run programs. Therefore, SOMEONE would have to have made the decision to program the autonomous car to categorize certain objects as "4-year-old child" and other objects as "octogenarian".

Furthermore, someone with a degree in computer science would know how extremely difficult such a task would be.

Whereas recognizing "obstacle" is much easier to program. So the same action would be taken no matter what the obstacle was. And that action should be to stop.


If the passenger wants to take over control of the vehicle at that time then that is an option. But the autonomous car should just stop. And it would do that faster than a human could.

A bus with 12 passengers comes up frontally (driven by an imperfect human driver, I guess).

Again, someone with a degree in computer science can tell you how difficult it would be to write a program that could, correctly, determine how many passengers there were in a vehicle.

So, when presented with an obstacle, the autonomous vehicle should stop. And do so faster than a human could.


Now, from a BUSINESS viewpoint the company would be liable for damages should they ship a car that incorrectly identified an obstacle as anything other than an obstacle ("a 4-year-old child", "an octogenarian", "bus with 12 passengers") which resulted in injury or death to the occupants of the autonomous vehicle. Therefore, no company would write such a program.

Whichever the decision, the perfect driverless car becomes a pragmatic killing machine.

You have confused "artificial intelligence" with "autonomous car".

An autonomous car is not the same as an artificial intelligence. Nor would an autonomous car be programmed with the sub-routines that you are postulating.

Comment: Okay .... (Score 1) 435

by khasim (#47469245) Attached to: FBI Concerned About Criminals Using Driverless Cars

As to the two things... what the fuck are you even talking about?

Programming. The car is autonomous because of a computer on-board that runs programs.

And those programs are extremely sophisticated. Which is why it is taking so long to get the programming correct.

In order to "steal" a car you have to be able to re-program it. And if you CAN re-program it then why are you willing to give up a job that will pay $150,000+ to program them for Google?

Are you fucking kidding? Is this a joke?

Okay. How do YOU think an autonomous car works?

Comment: Drug mule? How? (Score 1) 435

by khasim (#47469117) Attached to: FBI Concerned About Criminals Using Driverless Cars

The above drug mule example is excellent!

How? It is legally tied to someone. And it has not been reported stolen. Yet it is travelling X miles, unattended.

Unless you're supposing an intra-city delivery service, that would probably look very suspicious. How many legal trips match that?

Now, whether the cops could, legally, search it while it is unoccupied on the highway is an issue that will have to be sorted out. But the cops could always contact the registered owner of the car and ASK to search it.

It is not enough to obey the laws. You also have to appear to belong in a category that the cops are not interested in.

Comment: Correct for the first part. (Score 1) 435

by khasim (#47468907) Attached to: FBI Concerned About Criminals Using Driverless Cars

Physical security is the first rule.

if you don't have it then your system is not secure.

That part is correct.

But that pre-supposes that the CRIMINAL has two things:

1. the skills to reprogram the car AND STILL MAKE IT WORK

2. the desire to become a criminal

If I'm a criminal, I can remote-control the car and use it as a surface-going drone.

No. You also need the skills to reprogram the car AND STILL MAKE IT WORK.

Those skills are the limiting factor here.

I can go on a car stealing spree and fill a garage with dozens of cars.

Only if you had the skills to reprogram the car AND ....

And then all at once send them out onto the road as wingmen to assist in whatever I want to do.

Only if you had the skills to reprogram ....

They could set up roadblocks all over town... they could ram police cars.

Only if you had the skills ....

And even then you'd have to have a reason for wanting to become a criminal instead of using those same skills to earn $150,000+ a year programming the cars for Google or their competitors.

Almost anything derogatory you could say about today's software design would be accurate. -- K.E. Iverson