
Comment Solving the wrong problem (Score 1) 294

In a previous life, we passed around virtual machines rather than doing paperwork. Paperwork exists to make sure you have a plan for the explosion-and-revert problem. Managing machines instead of paper let us build an immediate revert-on-explosion step right into the process (;-))

The VMs we passed around were Solaris zones, so they were very lightweight. If I wanted to apply an emergency patch to production, I first applied it to an image, put an instance on pre-prod, a physical machine, and varied it into test. After the smoke-test, I varied it into the pool on the load-balancer, and watched it closely. If it fixed the problem and didn't explode, I put lots of instances on the production physical servers and put them into the load-balancer, quiescing the un-patched instances but not erasing them. If the patch blew up after all, I could revert to the previous buggy release as fast as the load-balancer could disconnect people. Not quite as fast as doing an atomic change on a single server, but fast.
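To make the flow concrete, here is a minimal sketch of that apply-and-revert dance. Everything in it is hypothetical: the load-balancer client and the names (lb.add, lb.quiesce, smoke_test, watch) are illustrative stand-ins, not any real product's API.

```python
# Sketch of the emergency-patch rollout described above.
# All objects here (lb, image, patch) are hypothetical stand-ins.

def deploy_patch(lb, image, patch, prod_hosts, watch):
    patched = image.clone().apply(patch)      # never touch the running image

    canary = patched.instantiate("pre-prod")  # one instance on a physical box
    canary.smoke_test()                       # raises on failure; nothing shipped
    lb.add(canary)                            # vary it into the pool
    watch(canary)                             # raises if it explodes under load

    old = lb.members()
    new = [patched.instantiate(host) for host in prod_hosts]
    for inst in new:
        lb.add(inst)
    for inst in old:
        lb.quiesce(inst)                      # drain, but do NOT erase
    return old, new                           # keep handles for instant revert

def revert(lb, old, new):
    # The un-patched instances were only quiesced, never erased,
    # so revert is as fast as the load-balancer can shuffle members.
    for inst in old:
        lb.add(inst)
    for inst in new:
        lb.quiesce(inst)
```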

This is a minor variant on some old Unix norms: 1) you aren't prohibited from doing even silly things, because prohibitions would also keep you from doing something brilliant; 2) you can do anything, but you can't hide what you did; 3) you can change things atomically while running; and 4) if you do something dumb, you can revert it immediately.

The process is a variant (and predecessor) of ITIL, with pre-set apply and revert steps for emergency changes, which are the high-value part of the whole ITIL change process. Non-emergency changes were a little more heavyweight: we tested the patch in an instance in QA, ran a simulated UAT overnight (automated, but exceedingly slow), reviewed the results, and then the de facto board decided whether we could release the image to production, QA, and dev. Your paper-oriented CAB does approve all patches to QA and dev, right? I'll bet they missed that part (:-))

--dave
I did once have a customer where I had to do paper-based CAB approvals, but that was because we weren't funded for a proper dev environment and had no QA at all. As you might guess, we still had at least one fiasco. I shortened the contract as much as I could without doing a no-bid in the middle.

Comment Re:It's crap (Score 1) 1633

Please elaborate; on the face of it, your response is unconvincing. In a domestic conflict, a substantial number within the standing military's ranks will be sympathetic to the Constitution -- the lack of honor by many in the military notwithstanding. How many of them would it take to debilitate the treasonous government's military so badly that it would be no more effective on US soil than it was on Middle Eastern soil?

Comment Polishing old code or writing good code (Score 4, Interesting) 139

The report doesn't really go into an important measure.

What is the defect density of the new code that is being added to these projects?

Large projects, and old projects in particular, will show good scores from polishing -- cleaning out the old defects that are already present. The new code being injected into a project is really where we should be looking... Coverity has the capability to measure this, but it doesn't seem to be reported.

Next year it would be very interesting to see "new code defect density" as a separate metric; currently the report gives "all code defect density", which may not tell us whether Open Source is *producing* better code. What the report shows is that the collection of *existing* code is getting better each year.
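To illustrate the difference (with made-up numbers, not figures from the report): defect density is just defects per thousand lines of code, and the two metrics can diverge sharply.

```python
# Hypothetical numbers, purely for illustration.
old_loc, old_defects = 1_000_000, 600   # legacy code, heavily polished
new_loc, new_defects = 50_000, 60       # this year's new code

all_code = (old_defects + new_defects) / ((old_loc + new_loc) / 1000)
new_code = new_defects / (new_loc / 1000)

print(f"all-code defect density: {all_code:.2f} per KLOC")  # ~0.63
print(f"new-code defect density: {new_code:.2f} per KLOC")  # 1.20
```

The polished aggregate looks fine even while the incoming code is twice as buggy.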

Comment 52 million pictures, >= 2,421 false positives (Score 2) 108

According to Wikipedia, the number of pictures you can compare before two are seen as the same with probability p is n(p) ≈ sqrt(2d * ln(1/(1-p))). If d is 52,000,000 and we use a 99% probability, then we get a false positive within every 21,884.6 pictures, even with a perfectly accurate matcher. And there are no perfect matchers.
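A quick check of that arithmetic, just evaluating the square-root approximation from the Wikipedia birthday-problem article:

```python
import math

d = 52_000_000   # pictures in the database
p = 0.99         # desired probability of at least one collision

# Birthday-problem approximation: n(p) ~ sqrt(2d * ln(1/(1-p)))
n = math.sqrt(2 * d * math.log(1 / (1 - p)))
print(f"{n:,.1f}")   # ~21,884.6 pictures per expected false positive
```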

This is a variant of the birthday paradox, where it takes only 70 people to get a 99.9% chance of two of them sharing a birthday, and a mere 23 people to get a 50% chance [wikipedia].

The German Federal Security Service rejected facial matching years ago, for exactly this reason, when I was working for Siemens. The Americans did not, and supposedly stopped someone's grandma for being a (younger, male) terrorist.

If they use this, expect a week or so of everyone's grandma being arrested (;-))

--dave
Mathematicians, please feel free to check me on the numbers: I suspect I'm rather low...

Comment Re:It doesn't. (Score 3, Interesting) 582

This myth gets trotted out again. It is arguably easier to find exploits without source. The source distracts from the discovery of an exploit; the binary simply is. The black-hat is looking for a way to subvert a system. Typically she is not interested in the documented functionality (whether documented by source or by manuals); that just distracts from the real issue, which is finding out what the software actually does, especially in edge cases.

This is what fuzzers do. Typically not aware of the utility of the program, they simply inject tons of junk until something breaks.
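A minimal sketch of the idea, assuming a local ./target binary that reads stdin (the path and the crash test are placeholders, not any particular fuzzer's design):

```python
import random
import subprocess

# Black-box fuzzer: feed random junk to a program until it breaks.
# "./target" is a placeholder; point it at any stdin-reading binary.
random.seed(1)

for i in range(10_000):
    junk = bytes(random.randrange(256)
                 for _ in range(random.randrange(1, 4096)))
    try:
        proc = subprocess.run(["./target"], input=junk,
                              capture_output=True, timeout=5)
    except subprocess.TimeoutExpired:
        continue              # hangs are interesting too; skipped in this sketch
    if proc.returncode < 0:   # killed by a signal: SIGSEGV, SIGABRT, ...
        with open(f"crash-{i}.bin", "wb") as f:
            f.write(junk)
        print(f"input {i} crashed the target with signal {-proc.returncode}")
```

No knowledge of the source is needed; only the observed behavior matters.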

Source availability tends to benefit people auditing and repairing more than black-hats.

Yes, it took years for Heartbleed to surface. If Heartbleed (or a defect like it) is discovered by a code audit, that speaks to the superiority of open source over closed source. If such a defect is found by fuzzing or binary analysis against a closed-source product, it is much harder to repair, as users are at the mercy of the holder of the source. Build a matrix of Open/Closed Source vs. bug found in source / bug found by fuzzing or binary analysis.

"Bug found in source" vs. Closed Source is not applicable, leaving three cells: found in source vs. Open Source, where anyone can repair the bug in the source; and found by fuzzing or binary analysis, where the bug can be repaired in the source by anyone (Open Source) or only by the vendor (Closed Source).

The question then is (as I said at the start): is it easier to find bugs by source inspection? Assume big threats will HAVE the source anyway. If a bug were easy to find by inspection, it would already be easy to find and fix (for example, OpenBSD continuously audits its code, and security has been a priority at Microsoft for the past decade). Fuzzing and binary analysis remain the preferred (quickest) methods, which gives the edge to Open Source. The reason is simple -- the black-hat cares about what is actually happening, not what the source says is happening.

Comment Re:Blame GNOME 3 (Score 2) 693

I have been using Gnome 3.10 (Fedora 20) on an Acer Iconia W700, which has no keyboard when I use it as a tablet. It does have multi-touch and gyro/magnetic/ambient-light/etc. sensors.

I tried XFCE (my usual desktop for the past decade), but it doesn't do well with the 192 dpi display. I then decided to try Gnome 3, because of all the complaints that it forces a tablet view on users.

- No keyboard means typing to find an application doesn't work. Adding the "Applications Menu" and "Places" Gnome Shell extensions solves this.

- The default on-screen keyboard doesn't support function keys, the Esc key, or control keys. Solution: add Florence.

- Without a keyboard, yumex is not usable: you can't enter a password to activate anything.

- Can't activate the bottom panel reliably. The "Frippery Bottom Panel" Gnome Shell extension helps: tapping the "!" at the bottom right then does the job. The "Hi, Jack" extension almost works, but isn't reliable enough.

- Rotation doesn't work. I had to put a script on the desktop to activate rotation (a sketch of it appears after this list).

- No multi-touch support in Gnome 3 (really strange, since I have a python program that demonstrates the hardware's multi-touch works).

- And now for the cake: focus is very strange. I can launch a new application, but the old application still has some focus! That's a nasty bug when it interacts with user input.
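For reference, a minimal sketch of the rotation script mentioned above, for an X session. The output name (eDP1) and the touchscreen device name are assumptions; check `xrandr` and `xinput list` for your own hardware.

```python
import subprocess

# Rotate the display and keep the touchscreen aligned with it.
# OUTPUT and TOUCH are assumptions; adjust for your machine.
OUTPUT = "eDP1"
TOUCH = "ATML1000:00 03EB:8C0E Touchscreen"

# xinput "Coordinate Transformation Matrix" per orientation.
MATRICES = {
    "normal":   ["1", "0", "0", "0", "1", "0", "0", "0", "1"],
    "left":     ["0", "-1", "1", "1", "0", "0", "0", "0", "1"],
    "inverted": ["-1", "0", "1", "0", "-1", "1", "0", "0", "1"],
    "right":    ["0", "1", "0", "-1", "0", "1", "0", "0", "1"],
}

def rotate(orientation: str) -> None:
    subprocess.run(["xrandr", "--output", OUTPUT, "--rotate", orientation],
                   check=True)
    subprocess.run(["xinput", "set-prop", TOUCH,
                    "Coordinate Transformation Matrix",
                    *MATRICES[orientation]], check=True)

if __name__ == "__main__":
    rotate("left")   # portrait; run again with "normal" to undo
```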

I would prefer to stay with Fedora. Is there any DE that supports touch better on Fedora? Or should I go with Ubuntu and Unity? Are improvements coming in Gnome 3.12 or 3.14?

Given that your Gnome 3 experience has been much more positive, what is your advice?

Submission + - Civil Liberties Association files class action for all Canadians, against spies (www.slaw.ca)

davecb writes: The British Columbia CLA has filed a class action on behalf of all Canadians against our security services' collection of metadata, because metadata allows a profile to be built of the individuals involved. It's a tough class for a court to certify, but to qualify, the BCCLA needed a class that it knew contained people who were spied upon.

Submission + - Glenn Greenwald and Laura Poitras Return to U.S. Soil (nytimes.com)

rmdingler writes: After remaining abroad since the Snowden revelations broke in June of last year, the two were in New York Friday to accept a Polk Award for national security reporting. Though they cleared customs without a hitch, they are traveling with an ACLU lawyer and a German journalist who are to "document any unpleasant surprises." According to Ms. Poitras, the risks of subpoena are very real.

What, if anything, do you expect the American government to do, considering that Snowden's case has been officially cited as violating the Espionage Act?

Submission + - Apple's Spotty Record Of Giving Back To The Tech Industry (itworld.com)

chicksdaddy writes: One of the meta-stories to come out of the Heartbleed (http://heartbleed.com/) debacle is the degree to which large and wealthy companies have come to rely on third-party code (http://blog.veracode.com/2014/04/heartbleed-and-the-curse-of-third-party-code/) — specifically, open source software maintained by volunteers on a shoestring budget. Adding insult to injury, large and incredibly wealthy companies gladly pick the fruit of open source software but refuse to peel off even a tiny fraction of their profits to financially support those same groups.

Exhibit 1: Apple. On Friday, IT World ran a story that looks at Apple's long history of not giving back to the technology and open source community. The article cites three glaring examples: Apple's non-support of the Apache Software Foundation (despite bundling Apache with OS X), its non-support of OASIS, and its refusal to participate in the Trusted Computing Group (despite leveraging TCG-inspired concepts, like the Secure Enclave in the iPhone 5s).

Given Apple's status as the world's most valuable company and its enormous cash hoard, the refusal to offer even meager support to open source and industry groups is puzzling. From the article:

"Apple bundles software from the Apache Software Foundation with its OS X operating system, but does not financially support the Apache Software Foundation (ASF) in any way. That is in contrast to Google and Microsoft, Apple's two chief competitors, which are both Platinum sponsors of ASF — signifying a contribution of $100,000 annually to the Foundation. Sponsorships range as low as $5,000 a year (Bronze), said Sally Khudairi, ASF's Director of Marketing and Public Relations. The ASF is vendor-neutral and all code contributions to the Foundation are done on an individual basis. Apple employees are frequent, individual contributors to Apache. However, their employer is not, Khudairi noted.

The company has been a sponsor of ApacheCon, a for-profit conference that runs separately from the Foundation — but not in the last 10 years. "We were told they didn't have the budget," she said of efforts to get Apple's support for ApacheCon in 2004, a year in which the company reported net income of $276 million on revenue of $8.28 billion."

Carol Geyer at OASIS is quoted saying her organization has done "lots of outreach" to Apple and other firms over the years, and regularly contacts Apple about becoming a member. "Whenever we're spinning up a new working group where we think they could contribute we will reach out and encourage them to join," she said. But those communications always go in one direction, Geyer said, with Apple declining the entreaties.

Today, the company has no presence on any of the Organization's 100-odd active committees, which are developing cross-industry technology standards such as The Key Management Interoperability Protocol (KMIP) and the Public-Key Cryptography Standard (PKCS).

Submission + - Do backups on Linux no longer matter? (sourceforge.net) 5

cogcritter writes: In June of 2009, version 0.4b42 of the dump/restore utilities for Linux's ext3 filesystem was released. This was the last version whose incremental dumps could actually be used. A bug introduced in 0.4b43, one year later, causes restore to fail when processing an incremental backup unless, basically, no directory deletions occurred since the level-0 part of the backup set was taken.

The bug is certainly present in Debian Wheezy, and comments in Debian's defect tracking system suggest that the bug has permeated out into other distros as well.

How can Linux's backup/restore tools for its popular ext2/ext3 filesystems have been broken for 3+ years without anybody seeming to care? Does nobody take backups? Or do they not use incremental backups? How many people are going to find themselves scrambling when they next NEED to restore a filesystem, only to discover they possess long-broken tools?

Just in case this article is where some hapless sysadmin ends up: the workaround is to go to dump.sf.net, go to the files section, pull down the 0.4b42 version, and build it yourself. As for me, I think I'm going to switch to filesystem mirroring using rsync going forward.
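A minimal sketch of that kind of rsync mirroring (the paths and the hardlink-snapshot scheme here are my assumptions, not a prescription):

```python
import datetime
import subprocess

# Snapshot-style mirroring with rsync: each run creates a dated
# snapshot, hardlinking unchanged files against the previous one,
# so every snapshot looks like a full backup but costs incremental
# space. SOURCE and DEST are placeholders for your own layout.
SOURCE = "/home/"
DEST = "/backup/home"

today = datetime.date.today().isoformat()
subprocess.run([
    "rsync", "-a", "--delete",
    "--link-dest", f"{DEST}/latest",   # hardlink unchanged files
    SOURCE, f"{DEST}/{today}",
], check=True)

# Point "latest" at the new snapshot for the next run.
subprocess.run(["ln", "-sfn", f"{DEST}/{today}", f"{DEST}/latest"],
               check=True)
```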

Submission + - The Comcast merger isn't about lines on a map, it's about controlling information (consumerist.com)

An anonymous reader writes: Comcast and proposed merger partner Time Warner Cable claim they don't compete because their service areas don't overlap, and that a combined company would happily divest itself of a few million customers to keep its pay-TV market share below 30%, allowing other companies that don't currently compete with Comcast to keep not competing with Comcast. This narrow, shortsighted view fails to take into account the full breadth of what's involved in this merger: broadcast TV, cable TV, network technology, in-home technology, access to the Internet, and much more. In addition to asking whether or not regulators should permit Comcast to add 10-12 million customers, there is a more important question at the core of this deal: should Comcast be allowed to control both what content you consume and how you get to consume it?
