Comment Maybe an automatic traffic "shaping" appliance? (Score 2) 50

I used to work for Blizzard Europe customer support. Occasionally during my time there, Virgin customers in the UK would complain of terrible latency, and each time it turned out to be a traffic shaping appliance Virgin was using that was incorrectly categorising WoW traffic as peer-to-peer. This was over three years ago, but it happened at least three times while I was there.

However, as Apple's traffic is likely carried over HTTP(S), it seems less likely to be miscategorised as "questionably legal" peer-to-peer traffic, so this is probably more of a peering issue. Perhaps the CDN servers Apple uses sit on a different ISP's network and a routing/peering problem is sending everything over international routes instead.

Comment Possibly a band aid for poor code organisation (Score 2) 126

The article talks about how this makes them more effective as a team because they suffer less pain from integration and merge conflicts. This makes me think they are working with a very poorly organised code base, or that as a team they don't use interfaces/contracts effectively. If they inherited the code base from someone else, that may be largely unavoidable, but if it has always been their own code, then to my mind it's a bad smell.

I have been working for two years on a project with a team of seven developers; it came in on time, and the client just finished their final testing today with no significant bugs left outstanding. As a team we were almost always able to work independently on tasks, thanks to the early establishment of good patterns and architecture, as well as regular design sessions for upcoming work led by our two most senior team members. Each team member would typically push their SCM (Git) changes up to several times a day, and merge conflicts were rare and practically always easy to resolve. We even switched to better patterns four months into the project, and thanks to the good layout of our code, we could run the new patterns side by side with the old ones.

So how did we largely avoid stepping on each other's toes? The code was arranged so that areas of concern were properly isolated. When working on a front-end use case, the only point of contention was typically the registration of a module; after that, all your code lived in its own independent area that no one else was touching. For a large use case with lots of screens, one developer would spend 20 minutes stubbing out the code layout, commit, and push to Git, then everyone else would pull and go on their merry way. If use cases needed to interact with each other, one developer made sure the interface was correct, which might involve a ten-minute talk with a fellow developer; the interface was immediately updated and pushed to the repository, and again each developer went back along their merry way.

You may have noticed that we push to our repository quickly; the article mentioned developers trying to merge a week of work, and that terrifies me. If code is properly isolated, your work can be incomplete without affecting the existing working code or other people's work in progress, so you should push *at least* once a day. If you need to alter an interface for a colleague but are in the middle of something and not really in a position to push your code, then you git stash, update the interface file, stub out the implementation file if needed, push, then git stash pop, and both you and your colleague can carry on. Additionally, our CI environment (Jenkins) builds and runs the tests on every commit; if someone pushes something that breaks the build, the whole team is notified and it's fixed within 15 minutes.
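
For anyone who hasn't done that stash dance before, a minimal sketch of the sequence (the file names here are hypothetical, just for illustration):

    # Shelve the half-finished work in progress
    git stash
    # Make the agreed interface change and stub the implementation
    #   (edit IFooService.cs and FooService.cs)
    git add IFooService.cs FooService.cs
    git commit -m "Extend IFooService for the upcoming use case"
    git push
    # Restore the work in progress and carry on where you left off
    git stash pop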

Another comment has already mentioned that code quality would be higher with the team approach, but in my experience, as long as the code is reasonably efficient and, more importantly, easily understandable to anyone else looking at it, improving quality beyond that adds little to no real value for our client in most circumstances. Since easy-to-understand code is mandatory, in the event it needs a bug fix, or in the rare case it needs to be improved, perhaps for performance reasons, it's not especially hard for any developer on the team to do so.

Of course, working on our code for ongoing improvements two years from now will be different, but as we have done for the past two years, any time we feel our process isn't working well, we discuss it during our regular retrospectives and adjust the process to cope with the problem. Could that lead to team programming like in the article? Possibly, but I highly doubt it.

Comment Re:For all of you USA haters out there: (Score 1) 378

I agree with you in principle: I would prefer that a small amount of my fees act as a kind of "fraud insurance" so that I don't suffer the loss personally. But it would be nice if all organisations made more of an effort to reduce fraud. What happens instead is that when fraud costs them too much, all they do is raise their customers' fees. I would really prefer not to pay extra fees which do nothing more than go into the pockets of fraudsters.

Comment Re:For all of you USA haters out there: (Score 1) 378

Since the losses due to card fraud are almost entirely borne by the banks, I have to assume it is more cost effective to take the losses than to chip all of the cards.

And do you suppose the banks' employees pay for the fraud out of their own salaries? Of course not! The cost of fraud is paid out of their honest customers' banking fees. Even if the bank refunds you when a fraudulent transaction occurs on your account, the money has to come from somewhere.

Once you realise this, you realise that the banks have no incentive to pay for improved security. Hence, the only reason US banks haven't improved their security is that they would rather raise customers' bank fees than make the effort to eliminate the problem.

Comment Re:Predestination is an incredibly unsatisfying mo (Score 1) 254

Excellent food for thought; this does somewhat redeem the plot in my opinion.

I'm not sure whether the original story did anything to draw the reader's attention to this analogy with the existence of the universe, but I don't recall it happening in the movie. Had the screenplay writers done so, I might have left feeling more thoughtful, rather than mostly frustrated.

Donnie Darko, for example, was weird and arguably paradoxical, but the way it was done left me feeling very thoughtful at the end, which I appreciate even after a few watches.

Comment Re:Perpendoxes, not paradoxes. (Score 1) 254

Very interesting comment; having read it, I would now say it is a stable time loop rather than a conventional paradox.

gurps_npc's comment does bring a minor amount of redemption to the plot, if that was indeed the intent of the story.

I still feel somewhat frustrated that it is nevertheless a scenario that is impossible to have arrived at, but yes, the existence of the universe is no less of a conundrum.

Comment Predestination is an incredibly unsatisfying movie (Score 3, Interesting) 254

I watched this film and followed the plot fine, but was left feeling very unsatisfied at the end of it.

Most movies involving time travel try to avoid paradoxes or major plot holes, but Predestination actively embraces time travel paradoxes, taking them to the extreme.

Maybe someone thought it would make for a "deep" and clever plot. I had no problem following it, but precisely because I understand it completely, I just felt frustrated with it in the end: the science fiction of time travel aside, it's an impossible scenario with no logical resolution.

Anyway, to avoid posting major spoilers, I won't say anything more.

Comment Re:Why not a master password for the PW manager? (Score 2) 113

You just happen to be super vigilant about your security, and if Chrome had implemented a Firefox-style password-protected password manager, it most certainly would not have met your needs either. You are very different from the vast majority of users, and the most worthwhile measure you take beyond Firefox and Chrome is that you compartmentalise your passwords. You are, however, part of a very small number of people who go to those lengths. The vast majority of users keep all their passwords in the same "vault" and would expose all of them within a day, which makes Chrome's strategy of leveraging the Windows API arguably more secure than building its own. And keep in mind that the vast majority of people would be infected for weeks or even months before they noticed.

As for your argument that key loggers are "harder" to develop than other malware, keep in mind that a lot of malware these days is bought as a kit with a tonne of features. The people writing the malware are typically separate from the parties using it, and once a password-stealing module is written, it's available for everyone else to use, regardless of how hard it was to write. Also, who said it has to be a key logger? It could be sniffing unencrypted memory or peeking at forms in the browser window; it could be watching in countless different ways to avoid being detected as a key logger by AV.

And in regards to AV watching for key loggers: if they know to watch for key-logger-type activity, it stands to reason they could also log attempts to read the password management API. In practice it's a cat-and-mouse game: as AV writers work to detect malware activity, malware writers work to avoid detection.

Malware writers are financially incentivised to come up with solutions, so do not assume that the hurdle for key sniffing is substantially different from that for abusing the Windows password management API. If one method takes a couple of weeks longer to write, they might bill their clients more, or perhaps they are forced to include the feature so their clients don't switch to a competing product.

While you are a rare exception who goes to extraordinary lengths to protect your credentials, for the vast majority of people, once they have malware, everything in their user profile is likely compromised, and a single password vault versus the Windows API won't help them one bit; except that the Microsoft-developed password vault is more convenient for users and likely better than the comparatively simple solution that would ship with a browser.

Comment Re:Why not a master password for the PW manager? (Score 1) 113

If you are infected with malware, that malware could just as easily watch the password you type into a password manager. If anything, for Windows users, using the supported, well-tested and proven Microsoft APIs is likely to be much better than Google trying to reinvent the wheel, which at best would still not be quite as convenient for users.

Comment Re:Why not a master password for the PW manager? (Score 2) 113

Once you have any kind of malware on your computer, you have to assume anything you do within the context of that user account is compromised. Any malware that can read your password database could just as easily be watching your activity and record your master password the next time you enter it into a password manager.

As a user who is already used to quickly pressing Win+L to lock my computer each time I leave my desk, I find leveraging the Windows APIs exceptionally convenient, especially as I don't have to manage yet another password independently of my Windows login password.

Also, those of us who recognise that it's no longer the mid-2000s, and that Microsoft has arguably become one of the best examples of how to develop software securely, have confidence that its API for this is thoroughly tested and proven. For Google to even come close, it would need to expend considerable effort and would ultimately achieve, at best, a reinvented wheel that is also less convenient for Windows users.

Comment Re:April Fools! (Score 1) 162

Our project's team decided to move from SVN to Git a few months back. We develop for .NET and were all used to working with TortoiseSVN, with code managed on a server that could control access to different repositories.

We had one guy who had recently joined our team, knew Git, and felt it was worth taking the plunge, acknowledging that we would initially be frustrated at having to learn a new tool.

We use TortoiseGit along with Gitblit to host the repositories, and at this point I have to say I am super happy we made the move. Learning something new is always a little painful, but it was well worth it in this case. If you're used to TortoiseSVN, then TortoiseGit really helps; I personally have not had to use a single Git command.

Git empowers you more as a developer: while SVN essentially forces its changes onto you as you fetch the latest, Git gives you much greater control over how and when you merge your changes with the repository. If you are uneasy about a merge, you can create a branch in just a few seconds and test it there first. The nicest part, though, is that you can commit locally without having to push your changes to other users. This is especially useful when you are doing a refactor and want rollback points every hour, but don't want other developers picking up your not-yet-complete work. You create a branch locally, commit every 30 minutes or every hour, and when the whole task is completed, you can merge your commits into one (if you want), then push to the central repository for the rest of the team.
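
As a rough sketch of that flow on the command line (the branch and commit names are hypothetical):

    # Keep the refactor on a private local branch with frequent rollback points
    git checkout -b big-refactor
    # ...work, then checkpoint every half hour or so...
    git commit -am "WIP: checkpoint"
    # When the task is complete, fold it into master as a single commit and share it
    git checkout master
    git merge --squash big-refactor
    git commit -m "Refactor data access layer"
    git push origin master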

If your refactor takes a week, you can avoid painful merges with other developers' work by regularly pulling their changes into your branch, perhaps every day or even every hour, and every time you merge, you can roll it back if something turns out badly.
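
Again as a sketch, assuming the same hypothetical branch and a remote named "origin":

    # Still on the refactor branch: fold in everyone else's latest work
    git fetch origin
    git merge origin/master
    # If the merge turns out badly, roll straight back and try again later
    git reset --hard ORIG_HEAD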

The thing to understand about Git is that there is no "central" repository authority as with SVN; instead, everyone has their own repository, and Git provides a nice way to selectively pull and push changes between different repositories, with much greater control. In our corporate environment, we do use a central repository, as that's where the backups happen, and it's also much easier than trying to sync with peers. The end result is a process that in practice can work identically to SVN, but which also gives developers greater power on their own computers, if they want it.
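
For example, nothing stops you pulling straight from a colleague's machine; a minimal sketch, where the remote name, URL and branch are all hypothetical:

    # Add a teammate's repository as an additional remote
    git remote add alice ssh://alice-pc/home/alice/project.git
    # Fetch and inspect her in-progress work without involving the central server
    git fetch alice
    git log alice/feature-x
    # Merge it only if and when you choose to
    git merge alice/feature-x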

It really does empower you, but as with anything truly worth doing, there is effort required and you must be prepared to invest it. I also recommend that at least one person on your team already be familiar with Git, as an in-person explanation of any issue you hit is much faster than trying to research it online.

Comment Re:What a Load of Bullcrap! (Score 1) 1199

Your nickname suits you very well.

Cigarette smokers who do not recognise the imposing obnoxiousness of their entirely optional habit and the burden it places on society are, by definition, selfish.

As they have made themselves practically dependent upon their habit, they will of course defend it tooth and nail. The very fact that they made the completely irrational decision to smoke, knowing all of its negative impacts, and then go on to *defend* that irrational decision leads me to conclude they are either plain stupid or otherwise generally irrational, and hence cannot be reasoned with.
