Comment Full-time permanent is required (Score 5, Informative) 159

Having been on the permanent-staff team dealing with contract workers, I can't see permanent staff ever being replaced by "gig" developers. A lot of things depend not just on skill in programming but on familiarity with the business and with prior decisions about the system's design and architecture. You can hire short-term people for specific tasks, but you need people who've been there long-term to work out how to fit new requirements into the system as it exists. Then there's maintenance. Bugs that make it into production tend to be obscure and hard to trace, and someone new who isn't intimately familiar with how things fit together is going to be completely lost trying to troubleshoot a bug that's not in any one component but in the interaction between 3 different components (or worse, a bug caused by all 3 components being absolutely correct and bug-free, but that particular account is so old it has a combination of settings on it that isn't currently legal and that the documentation doesn't mention).

The permanent staff won't be the cheapest in absolute terms, but they'll be the cheapest in terms of dollars spent for results produced. This isn't a guess, it's a prediction based on the outcome of the vast majority of attempts to replace permanent development teams with contract workers and consulting firms.

Comment Makes sense (Score 2) 91

G+ has always had little of the Facebook-style indiscriminate "let the world see everything" of most social media; users have focused more on specific groups or communities, with the conversations going on within that group and not in public view. The changes make sense in focusing on that rather than trying to be another Facebook or Twitter or Snapchat. It makes the pundits feel left out because they're outside those groups and not seeing the interactions, but that's easy enough to solve if they want to. If they don't... Not My Problem, Man.

Comment Do-it-themselves (Score 3, Insightful) 202

Why would any sane terrorist use any sort of service run by someone else? That just makes them vulnerable. Take any sort of PC, install Linux, and set up a private XMPP server: instant fully-encrypted communications without leaving any logs or other traces on anyone else's systems where the authorities could get access to them. And with the authorities' current focus on social media, it adds the additional layer of security of not being where anyone's looking for them to be. Geesh, I think government officials have been reading too many best-seller spy novels and listening to too few tech geeks.

Comment Just government IT outsourcing? (Score 1) 85

IMO it's:
s/US Government //
I haven't seen an outsourcing project yet that's been well-managed. Usually it's because management sees development teams as interchangeable, so they go about managing the outsourced project like they would their in-house devs. Problem is that your in-house devs you can call into the office and threaten with loss of bonuses and/or job if they aren't getting things done right. You can't do that with contractors, though: they don't work for you, they likely aren't even on the same continent, and the contract with the outsourcing firm's usually written without any provision for penalties for failure to deliver a product that works correctly and to spec, leaving you with no leverage.

The problem is that management's been taught to look at efficiency over effectiveness. The two aren't the same thing.

Comment Re:Don't have anything for them to find (Score 1) 324

Yes, but if you're dealing with a situation where they'll hold and interrogate you for an extended period even if they find absolutely no evidence at all, then you have bigger problems than how to keep them from finding anything. In that situation the only way to avoid it is to not go there in the first place, and if you have to go, the question's more along the lines of how you get in and out without them finding out you're you along the way. And that, frankly, is seriously out-of-scope for this kind of forum.

Comment Don't have anything for them to find (Score 4, Insightful) 324

Best bet is simply not to have anything for them to find. Store your data on a thumb drive (that you'll carry or ship separately) or upload it to your own server or a service like Google Drive or Dropbox, encrypting it first or not depending on how sensitive the information is. Delete it, secure-wipe it, or wipe the whole drive and do a complete factory restore on your laptop, depending on how invasive you think the search might be. Then let the cops search all they want; they won't find what isn't there.

NB: Linux makes a better platform for this than Windows. On Windows bits of your files can end up in the oddest places to be found during a scan of the drive. On Linux it's easy to set up a separate partition where all your data will go and be certain it didn't leave traces anywhere else, and that partition can be secure-wiped and reformatted without messing up the OS installation in the process. Plus the cops are less likely to be familiar with Linux, and you can play the dumb-non-techie card of "I dunno, it's whatever the guys in IT put on it. I just follow the instructions to run my programs and everything works.".
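The "secure-wipe" part is roughly this (a minimal sketch, with a made-up function name; note the caveat in the docstring, which is why wiping the whole partition is the safer option described above):

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents in place with random bytes, then delete it.

    Caveat: on journaling filesystems and SSDs with wear leveling this does
    NOT guarantee the old blocks are gone; wiping the whole partition is the
    reliable option.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace the contents with noise
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to disk
    os.remove(path)
```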

Comment Re:Why did they need the certificates? (Score 1) 95

Yep, and I agree with .local, .test, bare names, and stuff like localhost not being allowed for commercial CAs. If I used them locally it'd be with my own internal CA for certificates (I have one set up, but that hodge-podge of shell scripts would make you cry).

*sigh* Dammit, "The Marching Morons" was supposed to be a satire, not a bloody policy document.

Comment Re:They want to shift the problem to someone else. (Score 1) 291

As someone who's written that code, the problem doesn't lie in the timezone code. It lies in the POSIX definition of the time() function, which requires it to return GMT/UTC, which has leap seconds in it. Programmers too often treat that as if it were TAI, which does not include leap seconds, and bugs pop up when leap seconds make UTC jump relative to TAI. If time() returned TAI directly and the timezone code handled leap seconds, everything would be a lot better. I find it amusing that that change wouldn't break much Unix code and would in fact probably fix a lot of subtle bugs by bringing time()'s results into alignment with the assumptions of the code using it. And NTP wouldn't be a problem; conversion from NTP's time back to TAI isn't that difficult. But no, we still have to deal with UTC.
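The UTC-to-TAI conversion really is just a table lookup. A sketch (the table is truncated to recent entries for illustration; the authoritative list is published in IERS Bulletin C):

```python
# TAI-UTC offsets: (first POSIX timestamp the offset applies from, seconds).
# Truncated for illustration -- see IERS Bulletin C for the full table.
LEAP_TABLE = [
    (1341100800, 35),  # 2012-07-01
    (1435708800, 36),  # 2015-07-01
    (1483228800, 37),  # 2017-01-01
]

def utc_to_tai(unix_utc: float) -> float:
    """Convert a POSIX (UTC-based) timestamp to TAI seconds.

    POSIX time() pretends leap seconds don't exist, so converting is just
    adding the TAI-UTC offset in force at that instant.
    """
    offset = 10  # TAI-UTC was 10 s when the leap second scheme began in 1972
    for since, delta in LEAP_TABLE:
        if unix_utc >= since:
            offset = delta
    return unix_utc + offset
```

The point of the rant above is that this lookup belongs next to the timezone tables, not scattered through every program that calls time().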

Comment Re:Why did they need the certificates? (Score 1) 95

True, but at the time that RFC didn't exist. And a lot of software had a hard-coded rule about TLDs: ccTLDs were 2 characters, generic TLDs were 3 characters, and only a few of those were valid. Trying to use a TLD with more than 3 characters would make some software reject it as invalid, but it was easy to pick a 3-character TLD that was guaranteed not to exist in the global DNS.
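The loose length-only version of that hard-coded rule, roughly as it appeared in a lot of validation code (a sketch, not any particular program's actual check):

```python
def tld_looks_valid(hostname: str) -> bool:
    """The old hard-coded heuristic: a TLD is 2 letters (a ccTLD)
    or 3 letters (a gTLD); anything longer is rejected as invalid."""
    tld = hostname.rsplit(".", 1)[-1]
    return tld.isalpha() and len(tld) in (2, 3)
```

A made-up TLD like .ttk sails through that check while being guaranteed never to collide with a real delegated domain, which is exactly why it was handy for internal testing.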

Thankfully we've moved past that stage. Though I would like to see a special-use domain "local" defined for names that aren't for testing but are restricted to the organization's network.

Comment Why did they need the certificates? (Score 4, Insightful) 95

I have to wonder why they needed test certificates at all. For any testing of their systems and software they could use fake domains and organizations located under a domain they own and use just for that purpose (I used the .ttk TLD for that sort of thing for years, back before the gTLD flood). If they were testing the issuing of certificates to specific organizations, there wouldn't be any need for the certificates to ever reach real servers. I can think of no good reason Symantec would need to have certificates issued to Google, and several bad reasons why an antivirus product would want a certificate that'd be accepted as a genuine certificate for a Web site.

Comment Re:How did Google discover this? (Score 4, Informative) 95

No. It means every CA has to keep a log of every EV certificate it's issued, and Chrome checks any purported-EV certificate it sees against the issuing CA's list. If the certificate really is a valid EV certificate, it'll be in the list. I presume that if the certificate isn't a valid EV certificate (i.e. it's not found in the list) and you've got the "Automatically report details of possible security incidents to Google" setting turned on (the default), it sends the error report back to Google for analysis. All of that's perfectly reasonable, and Google only sees information about certificates that're lying about their EV status.
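The lookup logic is simple. A toy sketch (the log structure and names here are made up; Chrome's real mechanism is Certificate Transparency, where the certificate must carry signed timestamps from recognized CT logs rather than being looked up in a flat list):

```python
import hashlib

# Toy stand-in for a CA's public issuance log: issuer name -> set of
# SHA-256 fingerprints of EV certificates it has actually issued.
ISSUANCE_LOG = {
    "Example EV CA": {
        hashlib.sha256(b"legit-ev-cert-der-bytes").hexdigest(),
    },
}

def ev_claim_checks_out(issuer: str, cert_der: bytes) -> bool:
    """An EV claim is valid only if the issuing CA's own log contains
    this exact certificate; a forged cert simply won't be in the list."""
    fingerprint = hashlib.sha256(cert_der).hexdigest()
    return fingerprint in ISSUANCE_LOG.get(issuer, set())
```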

Comment How will it work? Seriously (Score 4, Insightful) 301

It won't stop uploading. Tools like wput and curl don't read the contents of files before uploading, and they wouldn't be modified to support one closed-source feature for one specific file format.

It won't affect Web sites. Web servers don't read the contents of files before serving them, files are just blobs of bytes to the server. The sites of interest to the DRM people are running open-source Web server software too, and I seriously doubt Apache or nginx are going to add closed-source code for one specific file format. IIS would, but it's at best the third-place player in the large-volume-site space.

And finally, it'll be cracked. My bet is that before it becomes widely implemented someone'll crack the system and there'll be browser extensions easily available that simply strip the DRM off the JPEG before uploading, displaying or saving it. Those extensions'll be widely used too, it won't be long before anyone having problems viewing images on Pinterest/Tumblr/Twitter/etc. will just get told to install the extension and it'll fix the problem. Users won't know or care how it fixed it, just that it fixes it.

Comment I'd probably go with Subversion (Score 1) 325

I'd go with Subversion. It's older and has a centralized repository rather than Git's distributed-repositories approach, but that won't be a problem for your team since they aren't spread out across multiple locations. It's got better support for running on Windows (CollabNet sells a supported commercial Windows-based server plus the whole TeamForge line), has Windows clients (both integrated into Explorer and stand-alone) and has supported integration with Visual Studio. Older means that almost every development tool out there for Windows understands how to interact with it. It's also easier for people who aren't familiar with version control to grasp SVN's model and how you interact with it (a commit is a commit, they don't have to understand the differences between their local copy of the repository and the origin copy on the Git server). Finally, SVN offers a degree of centralized control that makes management happy (eg. mandating commit comments in a certain form, controlling individual access to different parts of the directory tree).
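That last point, mandating commit comments in a certain form, is a standard SVN pre-commit hook. A minimal sketch (the ticket-prefix policy here is just an illustration):

```python
import re
import subprocess
import sys

# Example policy: commit message must start with a ticket ID like "PROJ-123: ".
MESSAGE_RE = re.compile(r"^[A-Z]+-\d+: \S")

def message_ok(msg: str) -> bool:
    """True if the commit message matches the mandated form."""
    return bool(MESSAGE_RE.match(msg))

if __name__ == "__main__" and len(sys.argv) == 3:
    # Subversion invokes the hook as: pre-commit REPOS-PATH TXN-NAME
    repos, txn = sys.argv[1], sys.argv[2]
    msg = subprocess.check_output(
        ["svnlook", "log", "-t", txn, repos], text=True)
    if not message_ok(msg):
        sys.stderr.write("Commit message must start with 'TICKET-123: '\n")
        sys.exit(1)  # non-zero exit rejects the commit server-side
```

Because the repository is central, the hook runs server-side and nobody can bypass it, which is exactly the kind of control that's harder to enforce across a swarm of Git clones.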
