Comment Re:Take it public (Score 5, Interesting) 266

Basically all he did is say "I posted to someone's timeline, this is a bug" and linked to the post he made. He didn't explain anything.

Bzzt. If Facebook's logging weren't broken, that should be all they need. The existence of the post itself, having been posted to a wall where he should not have been allowed to post, should have been enough to determine trivially that the bug was real. Further, the post's database record should contain the posting IP address and the ID of the server that handled the request. From there, they should have been able to look at the server's request logs to determine precisely how the attack happened (assuming the researcher was using a structurally valid URL in the request, as opposed to exploiting a null character handling bug in the web server itself).
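To make the log-correlation idea concrete, here's a minimal sketch of the kind of lookup Facebook's screeners could have done, assuming the post record really does store the client IP, the handling server's ID, and a timestamp. Every field name, the log format, and the request path here are hypothetical; this illustrates the technique, not Facebook's actual schema or tooling.

import json
from datetime import datetime, timedelta

def find_matching_requests(post_record, access_log_path, window_seconds=60):
    """Return access-log entries that plausibly produced the given wall post.

    post_record is assumed to be a dict with "client_ip" and "created_at"
    (ISO 8601) fields; the log is assumed to be one JSON object per line.
    """
    posted_at = datetime.fromisoformat(post_record["created_at"])
    earliest = posted_at - timedelta(seconds=window_seconds)
    latest = posted_at + timedelta(seconds=window_seconds)

    matches = []
    with open(access_log_path) as log:
        for line in log:
            entry = json.loads(line)
            if entry["client_ip"] != post_record["client_ip"]:
                continue
            timestamp = datetime.fromisoformat(entry["timestamp"])
            # "/posts/create" is a made-up endpoint standing in for whatever
            # URL actually handles wall posts.
            if earliest <= timestamp <= latest and "/posts/create" in entry["request_uri"]:
                matches.append(entry)
    return matches

The interesting part isn't the code; it's that the returned entries would show the exact URL and parameters the researcher submitted, which is precisely the "steps to reproduce" the screeners claimed they didn't have.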

But even if they looked at the logs and couldn't figure out what happened, IMO, it is still completely unacceptable to just close a bug like this. It's one of those bugs that, if real, is borderline catastrophic in scope. You do not close a bug like that as "cannot reproduce". You contact the originator and say, "Hey, can we get more information about this? We need to try to reproduce the problem."

It's sad that it takes somebody posting on the CEO's Facebook page to get the attention of Facebook's security staff. This means one of two things: they are grossly mismanaged or are woefully understaffed—probably the latter, IMO. Either way, it tells me that Facebook does not take security seriously enough. If bug screeners do not have time to properly follow up on bugs that are this severe, then they need to double or even triple the number of screeners.

Also, this calls into serious question the way Facebook screens bugs in the first place. Where I work, a bug like this would have been tagged as a security bug the moment it came in. That tag causes additional people to review the bug, significantly reducing the likelihood of a serious mistake. Closing the bug without asking for more information strongly suggests that a single, hopelessly overworked individual made a mistake, and that the company as a whole lacked the processes to ensure the additional review that would have caught that mistake quickly and prompted a follow-up with the original reporter. Not good. Not good at all.
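For what it's worth, the kind of screening rule I'm describing doesn't need to be elaborate. Here's a rough sketch of a triage hook that auto-tags likely security reports and blocks single-person closure; the keyword list, field names, and two-reviewer policy are all invented for illustration, not anything Facebook actually uses.

SECURITY_KEYWORDS = {
    "xss", "csrf", "injection", "privilege", "bypass",
    "unauthorized", "exploit", "timeline", "wall post",
}

def triage(bug):
    """Tag likely security bugs and require extra review before closing."""
    text = (bug["title"] + " " + bug["description"]).lower()
    if any(keyword in text for keyword in SECURITY_KEYWORDS):
        bug["tags"].add("security")
        bug["min_reviewers_to_close"] = 2        # a second pair of eyes
        bug["close_requires_reporter_contact"] = True
    return bug

Even a crude keyword match like this would have flagged "I posted to someone's timeline" for a second look, which is all the process protection I'm asking for.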

And as long as I'm criticizing Facebook's security practices, IMO, a service like this should have several publicly visible, official security testing accounts for precisely this purpose, with varying restrictions on individual posts and so on, so that security researchers can properly hammer on the site's security. For example, there should be an official test account that looks an awful lot like Mark Zuckerberg's account. If a researcher is able to post on the wall of that account, there can be no doubt whatsoever that a bug exists. Likewise, there should be more complex accounts with various security settings, complete with a list of their content and the expected access behavior (e.g. you should not be able to read the image entitled "nude_selfie_for_my_boyfriend.jpg").
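Here's a sketch of what such a public test-account spec might look like, in the spirit of the lookalike account just described. The account names, content items, and the check_access() hook are all made up for illustration; a real harness would drive the site's actual public API.

TEST_ACCOUNTS = {
    "zark.muckerberg": {                # decoy lookalike of a prominent account
        "post_to_wall_as_stranger": "deny",
        "read_public_profile": "allow",
    },
    "test.locked.down": {
        "read_photo:private_album/selfie.jpg": "deny",
        "read_photo:public_album/cat.jpg": "allow",
    },
}

def audit(check_access):
    """check_access(account, action) -> "allow" or "deny", supplied by the researcher."""
    failures = []
    for account, expectations in TEST_ACCOUNTS.items():
        for action, expected in expectations.items():
            actual = check_access(account, action)
            if actual != expected:
                failures.append((account, action, expected, actual))
    return failures

Any entry in failures is an unambiguous bug report: the expected behavior is published right next to the account, so there's nothing for a screener to misinterpret.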

In short, I suspect there's plenty of blame to go around for this error. What matters is not who gets blamed, but rather how Facebook fixes their processes to ensure that such mistakes do not get made in the future. And I would emphasize that this does not involve firing anyone. People make mistakes. That's why processes are supposed to be designed to mitigate those mistakes. A company like Facebook is big enough that they should know this. If they don't, then perhaps this object lesson will get their attention and cause them to change their ways. If not, it's time to run, not walk, to a competing service.

Either way, what the researcher did was IMO wholly appropriate. He initially performed the smallest attack that could potentially have proven that there was a flaw. When the first report was casually dismissed, he then escalated that attack, but only to the minimum degree necessary to prove beyond any reasonable doubt that there was a flaw (by attacking a single, prominent account belonging to a readily identifiable Facebook employee). Had Facebook provided a "Zark Muckerberg" account as suggested earlier, he could have used that. They didn't, so he used the only remaining tool that was available—the CEO's real account.

Would it have been better if he had included steps to reproduce in the original bug? Sure. Is Facebook behaving like a spoiled child after getting called out for misbehavior? You bet. Does the researcher deserve the bounty? Uh, duh. More to the point, this guy deserves a job. But at the very least, he deserves a big bounty for uncovering not just a security bug, but also a process problem that allowed such a serious bug to be inadvertently swept under the rug. And that, IMO, is even more significant than the bug itself.

Comment Re:Take it public (Score 4, Insightful) 266

No, not almost invariably. Invariably. You always follow up on security hole bug reports. Always. If you do not do this, you are incompetent. Assuming this security researcher gave them a reasonable amount of time (the summary here doesn't say), this is once again a demonstration of Facebook talking "secure" but implementing the opposite, hyping their bounty program while refusing to pay out.

For that matter, you should always follow up on non-security bug reports unless they're obvious garbage (e.g. porn site spam submitted to your bug reporting page by a bot). But security bugs? There's no excuse for not following up on those. Ever. EVER.

Comment Re:This will be Godwinned (Score 1) 496

But that was a different world. 9/11 changed everything, man.

Seriously, as cynical as it sounds, at least in the U.S., if the Nuremberg trials were conducted today, we'd probably let them off for "just following orders". Chilling, isn't it, how quickly we have become the enemy, a mirror reflecting that which our forefathers died fighting against?

"What truth is there, but the law?" they say. "Crucify him! Crucify him!"

This is what it sounds like when justice ends and the blind and arbitrary pursuit of revenge begins. This is always what it sounds like.

Comment Re:LOL. (Score 1) 496

Let's see. Did he give them aid by releasing all sorts of information that was used to recruit other Muslims to be terrorists? Yup.
Was that information used against America to back attacks on America? Yup.

So by your definition, the news media commits treason every time a reporter factually reports that a government bomb went astray and blew up a bunch of children? What a disturbingly paranoid world you must live in, worrying that every minor action you take might be a crime.

Giving aid to enemies in a time of war refers to stealing secrets that are tactically valuable to the enemy, not to revealing evidence of crimes that are merely psychologically valuable to the enemy, and not to revealing secrets that are only secret because nobody else happened to be watching at that particular moment. The Constitution's prohibition against treason was never intended to give protection to war crimes.

Comment Re:Practical (Score 2) 127

The jetpack should provide more than enough thrust for a paramedic, a small bottle of oxygen, a handful of EpiPens, and a portable AED. In many cases, that rapid first response could be quite valuable for stabilizing the patient, even if the ambulance still takes a few extra minutes to arrive.

That said, the entire concept still borders on insane. :-)

Comment Re:CEOs are overrated (Score 1) 692

Sculley gave us the first color Mac, resisted the Mac OS licensing that nearly killed Apple, and so on. Yes, he screwed up when he introduced each configuration of the Performa as its own model, but most of the real damage happened after he left and Spindler took over.
