Comment: The reason the government wants this... (Score 3, Informative) 253

by sigmabody (#47832053) Attached to: UCLA, CIsco & More Launch Consortium To Replace TCP/IP

For those who don't see why this is bad, consider this:

In order to route/cache by data, the data must be visible to the routing nodes; in essence, you would no longer be able to use end-to-end encryption. You could still have point-to-point encryption (e.g. on wireless links), but everything would be visible to the routing nodes, by necessity. This means no more hiding communications from the government (which taps all the backbone routers), no Tor routing, and no protection from MITM attacks, by design. You get the promise of more efficiency at the cost of your privacy and freedom... and guess what, you'll end up with neither.
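
A minimal sketch of why that follows (everything here is hypothetical and illustrative, not any real NDN implementation): a router that caches by content name can only cache and re-serve data it can actually read, so an end-to-end-encrypted payload is just an opaque blob it has nothing to match against.

    # Hypothetical sketch: a content-addressed cache at a routing node.
    # If the node can read the content name and payload, it can cache and
    # re-serve them; an end-to-end-encrypted blob gives it nothing to work with.

    class CachingRouter:
        def __init__(self):
            self.cache = {}  # content name -> data seen in the clear

        def handle_interest(self, name, fetch_upstream):
            """Serve a named-data request, caching whatever passes through."""
            if name in self.cache:
                return self.cache[name]      # served locally: the node saw the data
            data = fetch_upstream(name)      # the node fetches and inspects the payload
            self.cache[name] = data
            return data

    # With end-to-end encryption the router only ever sees ciphertext, so there is
    # no meaningful name to cache under and nothing useful to reuse.
    router = CachingRouter()
    router.handle_interest("/news/2014/09/05/article", lambda name: b"story text")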

Comment: Data point (Score 1) 348

I don't run a local firewall on my work system, for reference. As a developer, I routinely need "random" ports open for various testing purposes, and having to deal with a firewall is one more nuisance I don't want to account for. A local (on-system) firewall won't prevent most attacks anyway, so I don't feel I'm giving up much real security.

I do run a local firewall at home, but only because it has not annoyed me enough to be disabled yet.

I don't know how useful that information is; consider it a data point.

Comment: Half measure... (Score 1) 178

It's a good PR attempt, to address what they must perceive as a significant problem, but...

Good luck convincing companies to trust your cloud infrastructure with their data, when they know for a fact that the US government (and probably other governments) could compel you to grant them secret access at any time, regardless of whatever client-access protections are in place. If MS could solve that massive security flaw, I'd be impressed; anything less is just polishing the proverbial turd.

Comment: Google needs to get ahead of this... (Score 1) 248

Google's only really viable option, as far as I can tell, is to create a tailored censored portal for each country (really, for each legal jurisdiction, but that's basically the same thing), and allow anyone in that jurisdiction to request, through an automated process, that anything be censored. Then they can create an "uncensored" jurisdiction, which you would need to opt into, with a disclaimer and such.

Once you have that, you can fight these sorts of "censor for the entire world" orders much more effectively, by asserting that you already support per-jurisdiction removal, and that removing content globally would violate the right of other jurisdictions to self-censor as they see fit. It's not perfect (nothing in international law is), but at least it would give Google a way to somewhat comply with the flood of censorship demands that is coming, without having to fight each new demand independently.
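
Mechanically, per-jurisdiction removal might look something like the following rough sketch (all names and data below are made up for illustration, not any real Google system): each jurisdiction maintains its own removal list, results are filtered only against the requester's jurisdiction, and the "uncensored" view is just an opt-in jurisdiction with an empty list.

    # Illustrative only: per-jurisdiction removal lists and result filtering.
    removals = {
        "EU": {"example.com/old-debt-article"},
        "TR": {"example.org/satire-piece"},
        "uncensored": set(),  # opt-in view with no removals
    }

    def filter_results(results, jurisdiction):
        """Drop results on the requester's jurisdiction-specific removal list."""
        blocked = removals.get(jurisdiction, set())
        return [url for url in results if url not in blocked]

    results = ["example.com/old-debt-article", "example.net/current-news"]
    print(filter_results(results, "EU"))           # filtered per EU removals
    print(filter_results(results, "uncensored"))   # opt-in uncensored view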

Government

US Pushing Local Police To Keep Quiet On Cell-Phone Surveillance Technology 253

Posted by timothy
from the all-you-debaters-are-welcome dept.
schwit1 (797399) writes with this story from the Associated Press, as carried by Yahoo News: The Obama administration has been quietly advising local police not to disclose details about surveillance technology they are using to sweep up basic cellphone data from entire neighborhoods, The Associated Press has learned. Citing security reasons, the U.S. has intervened in routine state public records cases and criminal trials regarding use of the technology. This has resulted in police departments withholding materials or heavily censoring documents in the rare instances when they disclose anything about the purchase and use of such powerful surveillance equipment. Federal involvement in local open records proceedings is unusual. It comes at a time when President Barack Obama has said he welcomes a debate on government surveillance and called for more transparency about spying in the wake of disclosures about classified federal surveillance programs.

Comment: Could be a good thing (Score 1) 249

by sigmabody (#47219985) Attached to: New Permission System Could Make Android Much Less Secure

This could turn out to be a good thing, imho.

Consider that there are basically two types of users where privacy is concerned: people who are oblivious to and/or don't care about their privacy, and people who try to preserve some of it. For the former group, this change will not affect their app usage, and it will make it easier for them to get app updates automatically, which will make their experience better. For the latter group, the Android developers are actively hostile to your privacy preferences, have no desire to help you, and in fact probably _want_ to drive you away from the platform. In both cases, it's a win for Android, the "all your data belongs to us and everyone else, and there isn't anything you can do about it" platform.

I personally think there's a market for platforms which allow some privacy (Apple does a much better, but still imperfect, job of this), but I acknowledge that there's also a market (and probably a larger one) for platforms which cater to people who share all their personal data with everyone, and are totally oblivious to what any/all of their apps are doing behind their backs. Google is making it crystal clear which type of platform Android, and their other services (see also: Nearby), will be.

Comment: Re:Good news, actually (Score 1) 600

by sigmabody (#46819747) Attached to: The US Public's Erratic Acceptance of Science

Questionable in the sense that the theory is a very speculative extrapolation from the data we have been able to observe back to the origins of the universe, before the "time" we can actually observe. Just because something fits a mathematical model doesn't mean we have solid evidence for it; it simply means the model matches what we've been able to [indirectly] observe. You could say the same thing about n-dimensional string theory as a unified model, for example.

Comment: Good news, actually (Score 1) 600

by sigmabody (#46819693) Attached to: The US Public's Erratic Acceptance of Science

It's gratifying to see that the public's general acceptance of scientific theories is roughly proportional to the actual evidence supporting those theories. For things for which there is good evidence, there is broad understanding; for things which are highly questionable and politicized, there is much skepticism.

Good for the US population. :)

Comment: Interesting conceptual argument (Score 1) 235

by sigmabody (#46793305) Attached to: Bug Bounties Don't Help If Bugs Never Run Out

It is an interesting conceptual argument, although it ignores a couple of real-world points.

First, not all bugs are equal in terms of exploitation opportunity, a point he glosses over; in monetary terms, a vulnerability is only as valuable as whatever it can be exploited to gain access to. A bug in something which cannot be exploited for any particular gain is next to worthless, in market terms.

Second, not all companies will pay for vulnerability information, because it's not just a value proposition, but also a risk and resource assessment. If nobody expects your software to be "secure", there's no point in spending much money on software security; for example, nobody pays much attention to the software in cars (yet), so manufacturers have little financial incentive to make it secure. Moreover, if you don't have deep pockets, you're not going to pay for exploits, especially if you're struggling simply to produce features that potential customers want. In either of those scenarios, the value proposition for paying for exploits is inconsequential.

Most software (by volume) has an effectively unlimited number of bugs, which nobody will pay for. That's the real world of software.
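
To make the value-proposition point above concrete, here is a back-of-the-envelope sketch (every number is invented for illustration): a vendor only comes out ahead paying for a bug when the expected loss it averts exceeds the bounty plus the cost of fixing it.

    # Illustrative math only; all figures below are hypothetical.
    def worth_paying(p_exploited, loss_if_exploited, bounty, fix_cost):
        """Is buying this bug worth it, in pure expected-value terms?"""
        expected_loss_averted = p_exploited * loss_if_exploited
        return expected_loss_averted > bounty + fix_cost

    # A high-value target (say, a payment platform): paying makes sense.
    print(worth_paying(p_exploited=0.3, loss_if_exploited=5_000_000,
                       bounty=20_000, fix_cost=50_000))   # True

    # Software nobody attacks for gain and nobody expects to be secure:
    print(worth_paying(p_exploited=0.01, loss_if_exploited=10_000,
                       bounty=5_000, fix_cost=20_000))    # False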

Comment: Re:This could start a precedent... or some lawsuit (Score 1) 236

by sigmabody (#46734433) Attached to: GM Names Names, Suspends Two Engineers Over Ignition-Switch Safety

Well, speaking as a [software] engineer...

In my profession, there are certainly certifications one can get, and ethical considerations (as a general statement), although there is no particular licensing. Regardless of these, though, I am employed to write software, but I would not certify that the software I write is flaw-free (nor would anyone else that I know). It's entirely possible that, due to flaws in my work product, someone will lose money, or have other negative outcomes befall them.

If that happened, and my employer blamed me publicly (explicitly or implicitly), I would be seeking large monetary damages, even if the flaw was my fault. My argument would be that I'm employed to write software, not to write flaw-free software, and if the company causes me damages (in current or future income) by stating or implying that I did not perform my work duties appropriately, then that is slander, and they are liable. In this case, the "lie" would be the implication that my work product was supposed to be flaw-free, which I never asserted or consented to, regardless of what they desired. Implying that someone is unable to perform their occupation is textbook slander, and the company would find itself writing a large check. And yes, even naming the engineer in this context, without strong evidence of gross or malicious negligence, would be cause for civil penalty (imho).

I guess it just comes down to this: there are laws which protect people from having their lives and/or livelihoods ruined by false accusations (direct or implied), and implying that an engineer must create a flaw-free work product to be proficient is a false accusation (unless there's a specific contractual obligation to do so, which would itself seem suspicious). If I were a company considering this, I'd think twice, and then not expose myself to the obvious liability.

Comment: This could start a precedent... or some lawsuits (Score 3, Interesting) 236

by sigmabody (#46730959) Attached to: GM Names Names, Suspends Two Engineers Over Ignition-Switch Safety

I could see two potential outcomes, if blaming engineers for product flaws becomes commonplace...

First, engineers will (or should) demand an indemnity clause as part of their employment contract, where the company agrees not to blame them publicly for any product flaws, and/or take any action which would identify them. Depending on the repercussions for the test cases, this might become a necessity for employees.

Second, I could see some significant lawsuits for slander, since the company is causing real (and substantial, and more importantly provable) financial loss to the engineers it blames for product deficiencies. Unless they have a pretty solid gross- or malicious-negligence defense, they could (and absolutely should) find themselves paying out a few million more to each engineer they throw under the metaphorical bus.

Companies are responsible for their products, not the people they employ to make/provide them. Companies reap the rewards when they work, and bear the responsibility when they don't. Absent malicious negligence, naming/blaming individual employees is irresponsible at best, and should absolutely expose the company to civil liability.

Comment: Another alternative strategy (Score 1) 189

by sigmabody (#46234297) Attached to: Microsoft Rumored To Integrate Android Apps

This may seem infeasible and/or culturally prohibitive, but there's another way Microsoft could go, one which could see them gaining market share in mobile, and perhaps even surpassing Google/Apple eventually.

Instead of trying to figure out how to optimally leverage the technology and assets they currently have (Nokia, Office, Windows Phone, patents, etc.) to optimize their own profit, as they have been doing for the last decade or so, they could try something new: building something which actual customers want. I know it's somewhat unheard of in the age of big companies, patent portfolios, and quarterly reports, but anyone who thinks there's no substantial room for innovation in virtually all aspects of the mobile space is simply not trying to think.

Microsoft had (and still has remnants of) the technical might to pursue multiple avenues of innovation at the same time. If they could simply shift their focus away from browbeating their reluctant customers with their latest profit-optimized business plan, and toward giving customers what they actually want (here's a free hint: a phone where you are in control of where your data is and which apps can access it), they could still do quite well. If they continue with their current business mindset, though, it won't matter what they pick to focus on: eventually, they will be toast.

Comment: Ah, Dianne... (Score 2) 510

by sigmabody (#46014215) Attached to: Senator Dianne Feinstein: NSA Metadata Program Here To Stay

That's what I "love" about my "representative": never afraid to state the blindingly obvious, while completely and derisively ignoring the will of the people she nominally speaks for. Of course the government is not going to willingly give up their police-state surveillance powers; governments never give up power they have taken, legally or otherwise. Blah blah, security, protection, something about terrorism, etc.

Comment: No effect... (Score 1) 288

by sigmabody (#45666241) Attached to: A Year After Ban On Loud TV Commercials: Has It Worked?

Just to point at one example: Elementary. The commercial volume is consistently MUCH higher than the show volume, which itself fluctuates enough during the show to make it annoying to watch. If the FCC really wanted results, they could have an automated application "listen" to broadcasts and fine broadcasters automatically, rather than judging effectiveness by the number of people with enough spare time to go through the complaint process. Given how easy that would be, I'd say they have no real desire to help anyone.
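
A rough sketch of what such an automated check could look like follows. Note the assumptions: real CALM Act compliance is measured with ITU-R BS.1770 loudness rather than plain RMS, and the 2 dB margin below is an arbitrary illustrative threshold.

    import math

    def rms_db(samples):
        """Root-mean-square level of a list of audio samples, in dB relative to full scale."""
        rms = math.sqrt(sum(s * s for s in samples) / len(samples))
        return 20 * math.log10(max(rms, 1e-12))

    def flag_violation(program_samples, commercial_samples, margin_db=2.0):
        """Flag a commercial that is more than margin_db louder than the surrounding program."""
        return rms_db(commercial_samples) - rms_db(program_samples) > margin_db

    # Toy data: a quiet program segment and a much louder commercial.
    program = [0.05 * math.sin(i / 10.0) for i in range(48000)]
    commercial = [0.5 * math.sin(i / 10.0) for i in range(48000)]
    print(flag_violation(program, commercial))  # True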

"Pull the trigger and you're garbage." -- Lady Blue

Working...