
Comment: This is your password deal with it. (Score 1) 162

by caitriona81 (#46457157) Attached to: Top E-commerce Sites Fail To Protect Users From Stupid Passwords

I think the right strategy for websites that require user registration is simply to issue each user a random password of sufficient length as to be near impossible to type correctly, much less remember, and not to offer any way for users to select their own. This all but ensures that the password won't be reused elsewhere, it enforces password quality, and it encourages the use of a good password manager.
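
A server-side sketch of that assignment scheme, using Python's standard `secrets` module (the length and alphabet here are illustrative assumptions, not anything the surveyed sites actually do):

```python
import secrets
import string

# Alphabet and length chosen so the password is effectively
# impossible to memorize or retype; adjust to taste.
ALPHABET = string.ascii_letters + string.digits + string.punctuation
LENGTH = 40

def assign_password() -> str:
    """Generate the random password a new account is issued at registration."""
    return "".join(secrets.choice(ALPHABET) for _ in range(LENGTH))

pw = assign_password()
print(len(pw), pw)
```

Since the user never picks anything, there's nothing for the site's "password strength" rules to get wrong.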

Bitcoin

Porn Will Be Bitcoin's Killer App 216

Posted by timothy
from the we-call-this-the-duh-factor dept.
An anonymous reader writes "In December, porn.com started accepting Bitcoin for its premium services, and the virtual currency quickly came to account for 10 percent of sales. At the start of January, a post on Reddit's Bitcoin subforum boosted the figure to 50 percent, before settling down to about 25 percent. The tremendous interest has led David Kay, the marketing director at porn.com's parent company Sagan, to talk very positively about the virtual currency: 'I definitely believe that porn will be Bitcoin's killer app,' he told The Guardian. 'Fast, private and confidential payments.'"

Comment: Re:Yes. (Score 1) 631

by caitriona81 (#44945975) Attached to: Ask Slashdot: Are We Witnessing the Decline of Ubuntu?

Nobody is forced to use Unity *yet*, but the alternatives are clearly treated as second-class citizens that don't get the same level of attention to detail or integration, which makes for a substandard experience that's increasingly a throwback to the days when Linux on the desktop was *only* for geeks. With Mir on the horizon, and with many developers targeting Ubuntu specifically rather than Linux in general, that situation threatens to get worse: we could conceivably have a large pool of software with Mir+Unity as hard dependencies very soon.

Comment: Sparkleshare, git, and git-annex (Score 1) 238

Sparkleshare (http://sparkleshare.org/) is a "transparent" front end for Git which turns it into a simple file sharing tool. This would probably be appropriate for most of the actual "file sharing" applications the OP mentions (gaining many of the advantages of Git while keeping the complexity hidden until it's needed), while obviously any source code projects should find their way into some kind of version control repository, probably Git as well, with TortoiseGit (http://code.google.com/p/tortoisegit/) being a fairly compelling solution for a Windows shop.

The learning curve isn't particularly steep here; an hour or less should bring someone up to a functional level with Git. It does have a little trouble working with binaries effectively, particularly large ones, but that's a problem common to most version control systems. git-annex (http://git-annex.branchable.com/) might provide a serviceable workaround for large binary "assets", depending on your workflow, but I haven't used it myself.
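
For a sense of how small that functional core is, the everyday workflow fits in a handful of commands (the repository and file names here are just placeholders):

```shell
# Create a repository and make a first commit.
mkdir project && cd project
git init -q
git config user.name "Example User"        # one-time identity setup
git config user.email "user@example.com"
echo "first draft" > notes.txt
git add notes.txt
git commit -q -m "Add first draft of notes"

# Edit, review the change, and commit again.
echo "second draft" > notes.txt
git diff --stat                            # see what changed
git commit -aqm "Revise notes"
git log --oneline                          # full history so far
```

Everything beyond this (branching, merging, remotes) can be learned as the need arises, which is what makes a Sparkleshare-style front end viable for non-technical users.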

Comment: Warrant canary. (Score 5, Informative) 397

A more robust version of rsync.net's "warrant canary" (http://www.rsync.net/resources/notices/canary.txt) might help. If it were to become more commonplace, people would start to assume that any provider not publishing one was already under gag order.

IANAL, but the legal theory is that while a gag order can make it illegal to speak out, it can't force someone to make falsified or fraudulent statements - any entity that has not already received a secret order is free to testify to that fact, and simply stop making that assertion at such time that they are compromised.

If this were made more robust, for example by regularly videotaping key employees undergoing a polygraph where they are asked questions about the integrity of their service, it might just work. (I realize a polygraph isn't secure. For this purpose, however, it doesn't matter, because it provides a means to deliberately fail a test while retaining deniability of your intent to do so.)
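
The machine-checkable half of the scheme is trivial: a canary is just a dated, periodically reissued statement, and a watcher only has to notice when the date goes stale. A minimal sketch (the date format and staleness threshold are assumptions for illustration; rsync.net's actual canary embeds a dated news headline rather than this exact format):

```python
import re
from datetime import date, timedelta

STALE_AFTER = timedelta(days=35)  # reissued monthly, so allow some slack

def canary_is_fresh(text: str, today: date) -> bool:
    """Return True if the canary carries an ISO date within the staleness window."""
    m = re.search(r"\b(\d{4})-(\d{2})-(\d{2})\b", text)
    if not m:
        return False  # no date at all: treat the canary as tripped
    issued = date(int(m.group(1)), int(m.group(2)), int(m.group(3)))
    return today - issued <= STALE_AFTER

sample = "No warrants have been served as of 2014-03-01."
print(canary_is_fresh(sample, date(2014, 3, 20)))  # within the window: True
print(canary_is_fresh(sample, date(2014, 6, 1)))   # stale: assume compromise
```

The hard part, of course, is the human half: making sure a silenced provider actually stops signing.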

I'm sure similar creative ideas could be used :)

Japan

How To Monitor Leaky Radioactive Water Tanks 111

Posted by timothy
from the for-fun-and-profit dept.
freaklabs writes "The radioactive water leaks are getting worse at Fukushima Dai-Ichi. In a recent New York Times article, it was mentioned that TEPCO didn't have a reliable way to monitor the water storage tanks for leaks. I decided to write a tutorial on how to wirelessly monitor water levels in storage tanks."

Comment: Re:Forcing strong passwords in the first place. (Score 2) 211

by caitriona81 (#43575883) Attached to: Mitigating Password Re-Use From the Other End

1. Lastpass works across all the platforms you've named, and has its own sync. Keepass works across all of them, and only needs some form of file sync (e.g. Dropbox). Firefox Sync will get you 4 of the 5 (all except iOS).
2. Virtually all of the circumstances that allow someone to attack the keychain program also tend to permit the undetected installation of a physical or software keylogger. The attacker may not compromise your less frequently used accounts as quickly, but they will have everything you use on a daily basis. (Further, accounts you don't use on a daily basis may be forgotten about; a side benefit of a password manager is a checklist of what needs changing in the event of compromise.)
3. Backup processes apply to password managers, as do password reset processes, and using a password manager does not preclude memorable passphrases for particular accounts, especially things like email. Right now I use passphrases + token codes for email, banking, and the password manager itself; passphrases stored in the password manager for accounts where I have to be able to retype the password (Facebook, etc.); completely random passwords at the complexity limit of whatever site I'm registering for if I never have to sign on from any of my devices (random web forums, etc.); and a shorter, more "traditional" 8-character password for my desktop, where a brute force attack is more likely to be carried out by hand than against the password hash, and where ease of typing (muscle memory) is desired.

Comment: Re:Forcing strong passwords in the first place. (Score 1) 211

by caitriona81 (#43575755) Attached to: Mitigating Password Re-Use From the Other End

This idea has strong potential, and one way to refine it is to offer the user a choice between a random set of password requirements that apply only to them and rotate every few days, and a random passphrase of the xkcd sort. So you'd have the static rules (at least 16 characters, can't be similar to the username, etc.), and then you'd add 4 random requirements like:
- The 4th character must be a number.
- The 7th character must be a symbol.
- The 2nd character must be an upper case letter.
- The 11th character must be a lower case letter followed by the letter 3 letters before it in the alphabet.
- Enter a password twice below that meets these requirements, or click here to accept the random passphrase the system has chosen for you [fireball yelling slashed baseballs]

Password reuse becomes impossible. Use of a password manager is encouraged, and an option is still open for someone who feels the need to memorize. Because the ruleset is random but can only be switched every few days, the user can't just keep refreshing until they find a set their existing password is compatible with, and most users will take the easy way out and accept the random passphrase that suddenly looks a lot less scary, but is reasonably secure.
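
The per-user rule set is easy to represent as data. A minimal sketch of the validation side (the specific rules are just the examples above; this is not any real site's policy):

```python
import string

# Each rule is (1-based character position, predicate, description).
RULES = [
    (4, str.isdigit, "4th character must be a number"),
    (7, lambda c: c in string.punctuation, "7th character must be a symbol"),
    (2, str.isupper, "2nd character must be an upper case letter"),
]

def violations(password: str) -> list[str]:
    """Return the description of every positional rule the password breaks."""
    out = []
    for pos, pred, desc in RULES:
        if len(password) < pos or not pred(password[pos - 1]):
            out.append(desc)
    return out

print(violations("aB1x56!longenough"))  # position 4 is 'x': fails the digit rule
print(violations("xYz5ab!rest"))        # satisfies all three: []
```

Generating a fresh rule set every few days is then just a seeded random draw from a pool of such (position, predicate) pairs.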

Of course, this all assumes you actually have a compelling need for logins and passwords in the first place. If you aren't a financial institution (bank, credit union, credit card, brokerage, etc.), a healthcare or email provider, or dealing with accounts for use inside your own company, then you probably don't, and should encourage the use of Persona or OpenID instead, rather than furthering the proliferation of accounts that users have to keep up with...

Comment: Webconverger (Score 1) 572

by caitriona81 (#43365655) Attached to: Ask Slashdot: Protecting Home Computers From Guests?

Webconverger (http://www.webconverger.com/) is a live CD and USB-stick bootable Linux distribution for kiosk applications, which also puts it in the same territory as ChromeOS for guest access, only it will work out of the box on a wider range of hardware.

By design, it gives the user a tightly locked down, full screen Firefox browser, and nothing else, but it's somewhat configurable and even supports printing (http://webconverger.org/printing/). Out of the box, it supports the Flash and Google Talk Voice/Video plugins, so most if not all websites will work out of the box, and the user can even do voice calling and Google+ hangouts.

With the exception of the couple of proprietary browser plugins mentioned above, the software appears to be entirely open source, and they offer a free version, a subscription service to customize and manage it for you, or the source code if you're comfortable getting your hands dirty. Overall, this looks like one of the easiest ways to provide a safe, controlled environment for your guests, locking them into a browser window where they can do what they want but nothing will be saved. Given the plethora of cloud apps out there to serve as substitutes for local apps, with a little creativity this should be all anyone who doesn't bring their own computer will need.

Privacy

Supreme Court Rules Warrants Needed for GPS Monitoring 354

Posted by samzenpus
from the get-your-paperwork-in-order dept.
gambit3 writes "The Supreme Court has issued its ruling in the case of Washington, D.C. nightclub owner Antoine Jones, saying police must get a search warrant before using GPS technology to track criminal suspects. A federal appeals court in Washington overturned his drug conspiracy conviction because police did not have a warrant when they installed a GPS device on his vehicle and then tracked his movements for a month."

Comment: Re:increasing signal to noise with business triage (Score 2) 360

This is great, and exactly what one organization I once worked for did. We had "business liaison" positions within every department, and "application owners", which were dual roles - these people were generally business users, with extra training so that they could work effectively to bridge between IT and the userbase. As part of these dual roles, they were included in the IT decisionmaking and change control process, so that they knew what was up before it happened, rather than finding out afterwards, and so that they could advocate for their department's needs ("You can't change payroll the day before we run it!" "Tuesday doesn't work because that's year end closing!")

We also filtered everything IT through the helpdesk, from change controls, to access requests, to outage notification & paging, to trouble tickets and support requests. Problems which weren't reproducible were stopped there. Things that seemed to be user education would either be handled by the helpdesk, or assigned to the appropriate business liaison to see if there really was a problem and gather more details. Remote control sessions were utilized by the helpdesk to gather screenshots. Intermittent problems were weeded out unless they recurred, in which case the previous calls were referenced to verify that this really was recurring. We leveraged our engineering and operations teams for troubleshooting when appropriate to gather logs. Only after we had concrete, conclusive details of a problem did something get passed to developers - and when we did, it was handled quickly, because we'd gathered all the information efficiently and correctly.

As a result, the developer teams were able to focus on fixing bugs, not triaging problems. The helpdesk always had ready access to all the relevant teams via phone, email, and instant messaging, and was well respected within them, because we were their filter, firewall, and front line, as well as their secretaries. (And we had the decisionmaking power as far as who got paged at 3am and who didn't!)

Comment: Re:Anyone who asks this question should not be in (Score 4, Insightful) 450

by caitriona81 (#34693470) Attached to: Thin Client, Or Fat Client? That Is the Question
I'm going to call BS partly on this. Most of the business world is using basic productivity software, probably Microsoft Office, with some users needing access to an accounting package or CRM. Thin clients aren't so much about up-front cost as they are about reducing long-term support costs, and using them in an enterprise or small-to-medium business environment gives you a lot of benefits to the long-term bottom line.

From a security perspective, you cut the "attack surface" of your network very sharply: from dozens if not hundreds or even thousands of desktops that each need antivirus, security updates, administration, and security monitoring, down to a handful of servers that you can lock down pretty tightly. From a support perspective, you are no longer managing all those desktops, just a handful of servers. You have all the data for your organization where you can make sure backups are happening, and where you can keep tabs on what data is being stored and where, so you no longer have to worry about that file with a million customer Social Security or credit card numbers sitting on someone's desktop, where you won't find out about it until after it walks out the door. With a good setup, you also ease the pain of patch days a fair bit, since you don't have to chase breakage across all those desktops, just across the app servers. And you remove the expectation of user control, because a thin client is clearly not a desktop (the "but I can do it at home, why can't I do it here" syndrome).

These are damn good reasons to go to thin clients on the desktop, even if the up-front costs are the same or even slightly more, and they apply to most desktop users. Only "high-performance" application demands, like CAD and software development, need fat desktops.
Now, on the laptop side of things, internet connections in the field aren't something you can count on. Even with mobile broadband and wifi penetration, it's not always there, and it's not always good enough, so thin clients aren't going to make much headway there for a long, long time.

Movies

The Lost Film That Accompanied Empire Strikes Back 195

Posted by timothy
from the first-reel dept.
An anonymous reader writes "'Alien' and 'Star Wars' art director Roger Christian was given £25,000 by George Lucas in 1979 to make a 25-minute medieval B-feature called 'Black Angel.' This spiritual tale of a knight on a strange quest was inspired by Christian's near-fatal fever when he fell ill in Mexico making 'Lucky Lady.' 'Black Angel' made a huge impression, not least because it shared the dark tone of 'Empire Strikes Back.' John Boorman showed it to the crew of 'Excalibur' as a template for how he wanted his film to look, and 'Black Angel' went on to influence films such as 'Dragonslayer' and 'Legend' throughout the 1980s and beyond. But it has not been seen by anyone since 'Empire' finished its theatrical run. Two weeks ago Roger Christian unearthed a print of a film that was thought lost forever, and in this interview he talks about 'Black Angel,' and provides the only picture from the film that has ever hit the Internet."

Comment: Re:tell ya why, too (Score 1) 766

by caitriona81 (#31217622) Attached to: Which Linux For Non-Techie Windows Users?
I would respectfully disagree here. Desktop Linux is a moving target and will be for the foreseeable future. There are too many applications that are considered part of the operating system in the Linux world that have meaningful upgrades within that time frame, upgrades that even for a fairly basic end user are highly desirable, or even mandatory (at least to some users), such as newer browser packages. Highly technical users actually have it easier keeping on an LTS release (even though they are the least likely to do so), because they have the technical know-how to upgrade packages to versions that aren't part of the OS release (either via third-party repositories, repackaging the applications themselves, or via manual installation.) With this in mind, six months really does seem about right on the desktop, especially when you consider that for Ubuntu's regular desktop releases, there's an 18 month (N+2) support cycle in place. This gives enough time to delay upgrading or to even skip one release without losing vendor support. In practical terms, considering that upgrades generally won't happen the day of a new release, the average user will upgrade every 6-14 months - once or twice a year, and the upgrade itself is comparably painless to the processes that exist for Windows - even a major upgrade can be done in place, with the system still usable before, during, and after upgrading.
