Comment Re:Yes. (Score 1) 631

Nobody is forced to use Unity *yet*, but the alternatives are clearly treated as second-class citizens that do not get the same level of attention to detail or integration, which makes for a substandard experience that is increasingly a throwback to the days when Linux on the desktop was *only* for geeks. With Mir on the horizon, and with many developers targeting Ubuntu specifically rather than Linux in general, that situation threatens to get worse, as we could conceivably have a large pool of software with Mir+Unity as hard dependencies very soon.

Comment Sparkleshare, git, and git-annex (Score 1) 238

Sparkleshare (http://sparkleshare.org/) is a "transparent" front end for Git which turns it into a simple file-sharing tool. This would probably be appropriate for most of the actual "file sharing" applications the OP mentions (gaining many of the advantages of Git while keeping the complexity hidden until it's needed), while obviously any source code projects should find their way into some kind of version control repository, probably Git as well, with TortoiseGit (http://code.google.com/p/tortoisegit/) being a fairly compelling solution for a Windows shop.

The learning curve isn't particularly steep here: an hour or less should bring someone up to a functional level with Git. It does have a little trouble working with binaries effectively, particularly large ones, but that's a problem common to most version control systems. git-annex (http://git-annex.branchable.com/) might provide a serviceable workaround for large binary "assets", depending on your workflow, but I haven't used it myself.

Comment Warrant canary. (Score 5, Informative) 397

A more robust version of rsync.net's "warrant canary" (http://www.rsync.net/resources/notices/canary.txt) might help. If it were to become more commonplace, people would start to assume that any provider not publishing one is already under a gag order.

IANAL, but the legal theory is that while a gag order can make it illegal to speak out, it can't force someone to make falsified or fraudulent statements - any entity that has not yet received a secret order is free to testify to that fact, and can simply stop making that assertion once it has been compromised.

If this were made more robust - for example, key employees being videotaped regularly undergoing a polygraph where they are asked questions about the integrity of their service - it might just work. (I realize a polygraph isn't secure. For this purpose, however, it doesn't matter, because it provides a means to deliberately fail a test while retaining deniability of your intent to do so.)
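
The freshness check itself is easy to automate. Below is a minimal sketch in Python; the canary text, the `Date:` line, and the 35-day window are all assumptions for illustration, not rsync.net's actual format:

```python
from datetime import datetime, timedelta

# Hypothetical canary text -- a dated statement that is reissued on a fixed
# schedule. The format below is an assumption for illustration only.
SAMPLE_CANARY = """\
We have received no secret orders or gag orders as of the date below.
Date: 2013-06-01
"""

def canary_is_fresh(text, now, max_age_days=35):
    """Return True if the canary's Date: line is within max_age_days of now.

    A stale or missing date is treated as a tripped canary: the provider
    has stopped making the assertion, so assume the worst.
    """
    for line in text.splitlines():
        if line.startswith("Date:"):
            issued = datetime.strptime(line.split(":", 1)[1].strip(), "%Y-%m-%d")
            return (now - issued) <= timedelta(days=max_age_days)
    return False  # no date line at all
```

A monitoring job could fetch the canary URL daily and alert when this returns False, turning "notice a page quietly disappeared" into "get paged when it does."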

I'm sure similar creative ideas could be used :)

Comment Re:Forcing strong passwords in the first place. (Score 2) 211

1. LastPass works across all the platforms you've named, and has its own sync. KeePass works across all of them, and only needs some form of file sync (e.g. Dropbox). Firefox Sync will get you 4 of the 5 (all except iOS).
2. Virtually all of the circumstances that allow someone to attack the keychain program also tend to permit the undetected installation of a physical or software keylogger. The attacker may not compromise your less frequently used accounts as quickly, but they will have everything you use on a daily basis. (Further, accounts you don't use daily may be forgotten about; a side benefit of a password manager is having a checklist of what needs changing in the event of a compromise.)
3. Backup processes apply to password managers, as do password reset processes, and using a password manager does not preclude using memorable passphrases for particular accounts, especially things like email. Right now I use passphrases plus token codes for email, banking, and the password manager itself; passphrases stored in the password manager for accounts where I have to be able to retype the password (Facebook, etc.); completely random passwords at the complexity limit of whatever site I'm registering for, when I don't have to sign on to it from all of my devices (random web forums, etc.); and a shorter, more "traditional" 8-character password for my desktop, where a brute-force attack is more likely to be carried out by hand than against the password hash, and where ease of typing (muscle memory) is desired.

Comment Re:Forcing strong passwords in the first place. (Score 1) 211

This idea has strong potential, and one way to refine it is to offer the user a choice between a random set of password requirements that apply only to them and change once every few days, and a random passphrase of the xkcd sort. So you'd have the static rules (at least 16 characters, can't be similar to the username, etc.), and then you'd add 4 random requirements like:
- The 4th character must be a number.
- The 7th character must be a symbol.
- The 2nd character must be an upper case letter.
- The 11th character must be a lower case letter followed by the letter 3 letters before it in the alphabet.
- Enter a password twice below that meets these requirements, or click here to accept the random passphrase the system has chosen for you: [fireball yelling slashed baseballs]

Password reuse becomes impossible, use of a password manager is encouraged, and an option is still open for someone who feels the need to memorize. Because the ruleset is random but can only be switched every few days, the user can't keep refreshing until they find a set their existing password is compatible with, and most users will take the easy way out and accept the random passphrase, which suddenly looks a lot less scary but is reasonably secure.
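
A rough sketch of how such per-user positional rules could be generated and checked, in Python. The rule classes, the per-user seed, and every name here are illustrative assumptions rather than a concrete proposal (and compound rules like the alphabet-offset one above are omitted for brevity):

```python
import random
import string

# Simple character classes a positional rule can demand.
CLASSES = [
    ("a number", str.isdigit),
    ("an upper case letter", str.isupper),
    ("a lower case letter", str.islower),
    ("a symbol", lambda c: c in string.punctuation),
]

def make_rules(seed, count=4, max_pos=12):
    """Generate `count` positional rules, deterministic for a given seed.

    The seed would be derived from the user ID plus the current few-day
    rotation window, so the ruleset stays stable between rotations.
    """
    rng = random.Random(seed)
    rules = []
    for pos in rng.sample(range(max_pos), count):  # distinct positions
        desc, pred = rng.choice(CLASSES)
        rules.append((pos, pred, "Character %d must be %s." % (pos + 1, desc)))
    return rules

def check_password(password, rules, min_len=16):
    """Apply the static length rule plus the random positional rules."""
    if len(password) < min_len:
        return False
    return all(pred(password[pos]) for pos, pred, _ in rules)
```

Because `make_rules` is deterministic for a given seed, the server never needs to store the ruleset - it can regenerate it from the user ID and the current rotation window.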

Of course, this assumes you actually have a compelling need for logins and passwords in the first place - if you aren't a financial institution (bank, credit union, credit card company, brokerage, etc.), a healthcare or email provider, or dealing with accounts for use inside your company, then you probably don't, and should encourage the use of Persona or OpenID instead, rather than furthering the proliferation of accounts that users have to keep up with...

Comment Webconverger (Score 1) 572

Webconverger (http://www.webconverger.com/) is a live-CD and USB-stick-bootable Linux distribution for kiosk applications, which also puts it in the same territory as ChromeOS for guest access, except that it will work out of the box on a wider range of hardware.

By design, it gives the user a tightly locked down, full-screen Firefox browser and nothing else, but it's somewhat configurable and even supports printing (http://webconverger.org/printing/). It ships with the Flash and Google Talk voice/video plugins, so most if not all websites will work out of the box, and the user can even do voice calling and Google+ hangouts.

With the exception of the couple of proprietary browser plugins mentioned above, the software appears to be entirely open source, and they offer a free version, a subscription service to customize and manage it for you, or the source code if you're comfortable getting your hands dirty. Overall, this looks like one of the easiest ways to provide a safe, controlled environment for your guests, locking them into a browser window where they can do what they want but nothing will be saved. Given the plethora of cloud apps out there to serve as substitutes for local apps, with a little creativity this should be all anyone who doesn't bring their own computer will need.

Comment Re:increasing signal to noise with business triage (Score 2) 360

This is great, and exactly what one organization I once worked for did. We had "business liaison" positions within every department, and "application owners", which were dual roles - these people were generally business users, with extra training so that they could work effectively to bridge between IT and the userbase. As part of these dual roles, they were included in the IT decisionmaking and change control process, so that they knew what was up before it happened, rather than finding out afterwards, and so that they could advocate for their department's needs ("You can't change payroll the day before we run it!" "Tuesday doesn't work because that's year end closing!")

We also filtered everything IT through the helpdesk, from change controls, to access requests, to outage notification & paging, to trouble tickets and support requests. Problems which weren't reproducible were stopped there. Things that seemed to be user education would either be handled by the helpdesk, or assigned to the appropriate business liaison to see if there really was a problem and gather more details. Remote control sessions were utilized by the helpdesk to gather screenshots. Intermittent problems were weeded out unless they recurred, in which case the previous calls were referenced to verify that this really was recurring. We leveraged our engineering and operations teams for troubleshooting when appropriate to gather logs. Only after we had concrete, conclusive details of a problem did something get passed to developers - and when we did, it was handled quickly, because we'd gathered all the information efficiently and correctly.

As a result, the developer teams were able to focus on fixing bugs, not triaging problems. The helpdesk always had ready access to all the relevant teams via phone, email, and instant messaging, and was well respected within them, because we were their filter, firewall, and front line, as well as their secretaries. (And we had the decisionmaking power as far as who got paged at 3am and who didn't!)

Comment Re:Anyone who asks this question should not be in (Score 4, Insightful) 450

I'm going to call BS partly on this. Most of the business world is using basic productivity software, probably Microsoft Office, with some users needing access to an accounting package or CRM. Thin clients aren't so much about up-front cost as they are about reducing long-term support costs, and using them in an enterprise or small-to-medium business environment brings a lot of long-term benefits to the bottom line.

From a security perspective, you cut the "attack surface" of your network very sharply: from dozens if not hundreds or even thousands of desktops that each need antivirus, security updates, administration, and security monitoring, down to a handful of servers that you can lock down pretty tightly.

From a support perspective, you are no longer managing all those desktops, just a handful of servers. You have all the data for your organization where you can make sure backups are happening, and where you can keep tabs on what data is being stored and where, so you no longer have to worry about that file with a million customer Social Security or credit card numbers sitting on someone's desktop, where you won't find out about it until after it walks out the door. With a good setup, you also ease the pain of patch days a fair bit, since you don't have to chase breakage across all those desktops, just across the app servers. And you remove the expectation of user control, because a thin client is clearly not a desktop (the "but I can do it at home, why can't I do it here" syndrome).

These are damn good reasons to go to thin clients on the desktop, even if the up-front costs are the same or even slightly more, and they apply to most desktop users. Only "high-performance" application demands, like CAD and software development, need fat desktops.
Now, on the laptop side of things, internet connections in the field aren't something you can count on; even with mobile broadband and wifi penetration, connectivity isn't always there, and it isn't always good enough. So thin clients aren't going to make much headway there for a long, long time.

Comment Re:tell ya why, too (Score 1) 766

I would respectfully disagree here. Desktop Linux is a moving target and will be for the foreseeable future. Too many applications that are considered part of the operating system in the Linux world have meaningful upgrades within that time frame - upgrades that even for a fairly basic end user are highly desirable or even mandatory (at least to some users), such as newer browser packages. Highly technical users actually have it easier staying on an LTS release (even though they are the least likely to do so), because they have the know-how to upgrade packages to versions that aren't part of the OS release (via third-party repositories, repackaging the applications themselves, or manual installation).

With this in mind, six months really does seem about right on the desktop, especially when you consider that Ubuntu's regular desktop releases have an 18-month (N+2) support cycle in place. This gives enough time to delay upgrading, or even to skip one release, without losing vendor support. In practical terms, considering that upgrades generally won't happen the day of a new release, the average user will upgrade every 6-14 months - once or twice a year - and the upgrade itself is comparatively painless next to the processes that exist for Windows: even a major upgrade can be done in place, with the system still usable before, during, and after upgrading.
Privacy

Submission + - What Does DHS Know About You? (philosecurity.org)

Sherri Davidoff writes: "Here's a real copy of an American citizen's DHS Travel Record retrieved from U.S. Customs and Border Protection's Automated Targeting System (ATS). This was obtained through a FOIA/Privacy Act request... The document reveals that the DHS is storing the reader's:
  • Credit card number and expiration
  • IP address used to make web travel reservations
  • Hotel information and itinerary
  • Full airline itinerary, including flight numbers and seat numbers
  • Phone numbers, incl. business, home & cell
  • Every frequent flyer and hotel number associated with the subject, even ones not used for the specific reservation
"

Programming

Submission + - The Future of System Administration (standalone-sysadmin.com)

Matt Simmons writes: "System Administration is changing. Where once, we logged into machines to make them work, we've progressed to managing-through-programming, and we're becoming developers in addition to administrators.

This is an interesting layer of abstraction between us and the machines. I've always thought that, regardless of how far the rest of the society was from the cogs of technology, sysadmins would always need to know the underlying mechanisms of how things worked. With the current tools and trends, that's looking less and less like reality. We can automate virtual machines to be created, installed, and configured all by pressing a single button. What happened to the fun of blinkenlights?"

Security

Submission + - Firewall Rulesets Still a Problem? (channelinsider.com)

dasButcher writes: "Security admins used to complain endlessly about the complexity of managing firewall rulesets. Those complaints have diminished as management consoles improved and the firewall has become less important in the grand scheme of IT security. But several new products are coming to market to audit and optimize firewall rulesets. As Larry Walsh writes in his blog (http://blogs.channelinsider.com/secure_channel/content/network_security/firewall_ruleset_management_still_an_issue.html), it's not clear whether this is really a problem or the last vestige of the old perimeter firewall. So Walsh asks, "is firewall ruleset management still a problem?""
America Online

Submission + - AOL "This is Spam" link as email DoS?

alabamatoy writes: "AOL email offers its users a clickable "This is Spam" link in each email message it delivers. Users who click this link cause the sending SMTP server's IP address to be added to some kind of internal AOL spam-source blacklist, which then blocks ALL subsequent email from that server to *ALL* AOL users. For small hosting services, this can be a killer. Attempts to convince AOL to identify the user who reported the message as spam have failed (it's against their internal privacy policy, they say), so the small hosting service is left with no mechanism to remedy the situation other than repeatedly trying to convince AOL support that the site really is not a source of spam and the problem is simply a stupid AOL user. For an entity like (for example) a recreational organization using this small hosting service, email group lists will almost always include one or more AOL users. All that's required to break email connectivity to AOL for ALL customers of the small hosting service is for one AOL user to stupidly click the "This is Spam" button, and all email to AOL grinds to a halt. Does anyone have any insight into how to resolve this problem?"

Comment multiple sound cards and braindead applications (Score 2, Interesting) 427

My chief complaint, on both Windows and Linux, is that probably 99% of applications have no concept of anything other than the default sound card, making multiple cards useless for all but a few niche applications. Apps that use sound need to provide a way to specify which device is used, in case the user wants something other than the default, period. None of the audio solutions so far have really done anything to make this better (or they make it worse in the process); granted, it's mostly an application issue, but control of device selection in the mixer would help as well.

Comment Re:Vote Verification by Internet (Score 1) 507

This is actually very, very, very bad. The reason we have a secret ballot is to make it difficult to obtain votes by coercion. You should be able to tell for sure at the polling booth how your vote was counted - but only at the moment you are standing there should there be any possibility for a vote to be connected to an individual voter. While this seems far-fetched now, if votes were individually traceable, we'd have far greater problems of election fraud to concern ourselves with - which would include the use of violence to force people to vote a certain way.


Even bytes get lonely for a little bit.
