Comment: Community-building instead? (Score 1) 199

by silvermorph (#47672307) Attached to: Ask Slashdot: Should You Invest In Documentation, Or UX?
UX is good, and you have to invest in it no matter what, but it'll never be a silver bullet unless you strip your apps way way way down - something that'll be painful for both you and your established users.

But if your docs really are a bottomless pit, it might behoove you to invest in your community instead of documentation. Grow some in-house experts and put them on the forums and a chat system. Send your users there instead of to increasingly out-of-date help docs and get them in the habit of searching for answers there. Build a reputation for responsiveness to get free customer loyalty on the side. Send your UX people and engineers to the forums as well so they get the pulse of your most frustrated customers. Slowly your community will become your experts as well.

Comment: I want voters to go to college (Score 3, Insightful) 253

by silvermorph (#47412095) Attached to: US Tech Firms Recruiting High Schoolers (And Younger)
I wasn't ecstatic about all the non-major courses I had to take when my primary worry was getting a programming job after I got my degree, and I might have taken an $100K out if it was available. But now 10-15 years later I'm glad I that my formal education included a psychology class, a statistics class, a history class, and others. Maybe I would have picked all that up on my own, or maybe I'd have a giant black hole in my world view.

There's a training side to education and a wisdom side to education, and both are important in the long run. Telling young people to get jobs right out of high school because being well-rounded isn't necessary for "smart" people just means it's a crapshoot whether they end up repeating history or learning from it.

Comment: Re:Solution is Transparency? (Score 1) 156

by silvermorph (#47134835) Attached to: Security Researchers Threatened With US Cybercrime Laws
Cool, that's great, and I don't think you should stop doing that, but yours isn't really the case the story is talking about. Although it could be: if the security firm you hire fails to catch every vulnerability and some white hat somewhere reports one to you, it'd be better if you had some assurance that they were trustworthy.

I don't think we disagree on any specific points so far. I'm not trying to replace security audits, just to encourage people who do the right thing without being paid to do so.

Comment: Re:Solution is Transparency? (Score 1) 156

by silvermorph (#47133751) Attached to: Security Researchers Threatened With US Cybercrime Laws
Determining whom to trust is the goal of the system I described, because the only real trust is reciprocal trust. Researchers trust an authority with a full record of their activities, and thereby earn the trust of the people they're ostensibly trying to help.

Today this would be done by the system's owner hiring a security firm to audit it, but we know that doesn't happen because it's expensive and people are lazy. Still, it needs to be done, so today's researchers just do it without getting permission, which results in vulnerabilities being exposed (good), but sometimes also in lawsuits (bad).

In my proposal, the law defines the terms of that initial agreement, which lets researchers find security flaws without having to get the system owner's permission.

Comment: Solution is Transparency? (Score 1) 156

by silvermorph (#47131665) Attached to: Security Researchers Threatened With US Cybercrime Laws
Identifying the good guys is a question of trust, so you can imagine why lawmakers are hesitant to throw trust around willy-nilly. Building a system that shows how that trust is reciprocated and enforced would be a good start.

Seems like there could be a law that tries to differentiate "Research Hacking" by setting requirements to qualify as a researcher. They must provide full transparency to prove they have no malicious intent: inform law enforcement authorities of their activities before and after the exercise, and continuously upload logs of their actions and any data transactions they execute. Maybe on a virtual "research sandbox" machine that deletes itself at the end of the session as an added layer of protection. Then if the vulnerability gets out before it's been reported, that researcher (or people with access to their machine) is a good place to start the investigation, so there's an incentive to report vulnerabilities quickly. Overly simplistic, probably not quite workable as stated, but you get the idea. A rough sketch of the logging piece follows.
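For the curious, here's a back-of-the-napkin sketch of what the "constantly upload logs" piece could look like. None of this is a real registry, API, or legal standard; the class, field names, and hash-chained log are assumptions made up purely to illustrate the idea.

import hashlib
import json
from datetime import datetime, timezone

class ResearchSession:
    """Hypothetical tamper-evident activity log a researcher would keep
    during an authorized 'Research Hacking' session and file with the
    (imaginary) oversight authority before and after the exercise."""

    def __init__(self, researcher_id, target):
        self.researcher_id = researcher_id
        self.target = target
        self.log = []

    def record(self, action, detail=""):
        # Each entry is timestamped and chained to the digest of the
        # previous one, so omissions or after-the-fact edits are detectable.
        prev = self.log[-1]["digest"] if self.log else ""
        entry = {
            "time": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "detail": detail,
        }
        entry["digest"] = hashlib.sha256(
            (prev + json.dumps(entry, sort_keys=True)).encode()
        ).hexdigest()
        self.log.append(entry)

    def report(self):
        # In the proposal this JSON would be uploaded to the authority;
        # here it's just returned as a string.
        return json.dumps(
            {"researcher": self.researcher_id,
             "target": self.target,
             "entries": self.log},
            indent=2,
        )

# Usage: declare the session up front, log as you go, file the report at the end.
session = ResearchSession("researcher-123", "example.com")
session.record("scan", "port scan of public-facing hosts")
session.record("finding", "unauthenticated admin endpoint on /manage")
print(session.report())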

Comment: Re:Ridiculous (Score 1) 334

/agree about publishing, but not about impracticality. It's not like the police are going to go to everyone's house after a breakup and take their photos away, and those photos will probably exist on the hard drives and in the minds of millions of people who never make a big deal about it. But it also means that if your ex is holding what amounts to blackmail photos over you, you now have a legal recourse.

Before, if you told the police that you accidentally dated a psycho and now they're showing naked photos of you to everyone in town, they'd say "your ex owns those photos, so maybe you should have kept all your naked pictures to yourself." Which is great if you have a time machine, but not if you're looking to stop someone from being an asshole today. With this law, if you make a request, the police have legal grounds to take the photos away.

And sure, it's probably going to be abused by some people (and that scene from Forgetting Sarah Marshall won't make sense anymore), but before, we had people abusing their possession of naked photos. So, which is worse?
