Given that it's widespread across huge numbers of people, presumably of all kinds and intelligence levels, I think that dismissing the problem as people being too lazy or stupid is... well... lazy and stupid.
Remember that people treat their computer like a social being - and a subordinate one at that. Every morning, someone will sit down at his office computer and find it's forgotten who he is, even though it sees him every day. He can walk away for an hour and it'll forget again. It'll fail to recognize him over and over as he uses websites, servers, etc., stopping each time to refuse his instructions and demand that he perform some silly little task purely to help the computer function correctly: remembering an irrelevant string of nonsense. And, very occasionally, the computer will fail outright and do something like send his banking details to someone in Russia, or show his ex-wife his e-mails to his lawyer... even though it's blatantly obvious to even an imbecile that these are the wrong things to do.
We all know that computers are unintelligent tools that are not capable of doing better than this - on slashdot, at least. But it still feels like talking to a forgetful, obstructive, naive, reckless, stupid and insubordinate little shit. Even the most stupid of assistants should be expected to do better most of the time.
People can certainly do better, but we have to accept that humans behave like humans and recognize that we're going to need to improve the technology as well as people's habits. In the short term that could mean things like providing ways to generate secure passphrases and letting people write them down, using authentication devices, and designing UIs that promote better practices... and we need security researchers who stop looking at memory dumps for a while and look for more secure ways to interact with users.
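As a sketch of what "generate secure passphrases" could look like: pick random words from a word list using a cryptographically secure source of randomness. The word list below is a tiny illustrative stand-in; a real generator would use a large standard list such as the EFF's 7,776-word diceware list.

```python
import secrets

# Toy word list, for illustration only. A real passphrase generator
# would use a standard list of several thousand words so that each
# word contributes meaningful entropy.
WORDS = ["correct", "horse", "battery", "staple", "orbit", "lantern",
         "pickle", "mosaic", "trumpet", "glacier", "walnut", "ember"]

def passphrase(n_words=5, wordlist=WORDS):
    # secrets.choice draws from the OS CSPRNG, unlike random.choice,
    # which is what makes this suitable for security purposes.
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

print(passphrase())
```

The point is that a string of random common words is far easier for a human to remember (or copy down) than the "irrelevant string of nonsense" most password policies demand, while still being unguessable if the list is large and the selection is genuinely random.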
It usually works like this: the official exchange rate gives you, say, 10 X per $1 and the black market rate 100 X per $1. (This happens because the official rate is not one which balances supply and demand.) So everyone wants to sell dollars at the black market rate and buy at the official rate. The government then forces everyone it can who has dollars to sell - legal exporters and their customers, for example - to do so at the official exchange rate. This gives it a small supply of official dollars. Most people can't get these - there are too few - so they go to people the government likes: bribe-givers, political allies, ruling politicians and their families, etc. Those lucky people can then immediately sell them on the black market at huge profit, or import goods very cheaply which they can either use or sell at huge profit. You can also use the allocation simply to bankrupt all of the private companies you don't like and support the ones you do (until you run out, of course).
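A quick worked version of the arbitrage, using the example rates above (all numbers illustrative):

```python
# Example rates from the comment above: official rate 10 X per $1,
# black-market rate 100 X per $1.
official_rate = 10       # X per dollar at the official window
black_market_rate = 100  # X per dollar on the street

# Someone allocated $1,000 at the official rate pays this much in X:
cost_in_x = 1000 * official_rate          # 10,000 X
# ...and can immediately resell those dollars on the street for:
proceeds_in_x = 1000 * black_market_rate  # 100,000 X

profit_multiple = proceeds_in_x / cost_in_x
print(profit_multiple)  # 10.0
```

In other words, every official dollar allocation is an instant 10x return for whoever receives it, which is exactly why the allocations become a patronage tool.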
I don't know any specifics about the Venezuelan government, but governments doing this usually have no incentive to fix it. Much easier to use it as a source of power, wealth and patronage.
Both sides assert that the islands are theirs because it's popular. Everyone likes to see their country have shiny things, especially if it means getting one over on another country. And it's a convenient (yuck) patriotic distraction from the governments' other failings. And the war made it much harder for there to be any sort of peaceful settlement... when hundreds of people die defending something, it gets a lot harder for politicians to explain wanting to go back on their earlier position. (I think people should remember, however, that the islands were attacked by a military dictatorship but are now claimed by a democracy(-ish).)
The arguments advanced by the two sides have, I think, little relation to the real reasons and are never going to get anyone anywhere. Argentinians cite geography and the bits of their history they like. The British cite the bits of history they like.
Meanwhile it makes not the slightest difference to the lives of almost anybody in the two countries.
They'll spend at least a week trying to find and set up desks, obtain computers, work out how everything works in the office, set up their development environments, go through 'if there's a fire walk out of the door' training and so on. That's assuming they're not spending all their time urgently finding a new home, new schools and movers (or just a new job).
Then they'll spend another week or two being really pissed off at the disruption. People generally do get pissed off at change, but it's worse if you're suddenly finding your working day is two hours longer with no extra pay because you have to trudge back and forth to the office (which always feels worst at first). That'll be worse for some than others because of different distances, and because no doubt some were using the extra time and flexibility to drive children to school or nursery. Then they'll be pissed off at having to use work computers which aren't set up how they like them, don't have the extra monitor, don't have the special keyboard or mouse they find more comfortable, and come with a cheap, uncomfortable chair they didn't choose carefully for themselves. And, of course, they'll be pissed off at the sudden noise and interruption that kills developer productivity in offices.
People get used to those things - although quality of life will no doubt be persistently lower, and people don't really get used to increased noise. I can't help thinking that some will have reduced working hours because there just aren't enough hours in the day any more, and that some will be looking for a better deal with other employers.
Working in an office has some advantages, of course... but even if it works perfectly there's going to be a big one-off cost, as well as a bunch of ongoing disadvantages.
I think that might be a bit simplistic... apart from anything else, you'd have to ask 'why should they care so much about that?'. The reasons suggested for why religion exists and is held so strongly are quite numerous, and I'd expect quite a few of them to be true. For example:
Religion is evolutionarily useful to humans because it helps a group perform acts of high altruism towards each other without becoming unable to perform acts of extreme warfare on the tribe next door with different beliefs. If you think of anything which becomes hotly debated like evolution vs creationism as a potential group marker, you could consider a battle over it in schools to be a battle over a child's group affiliation.
Religions are like mind-viruses that exploit human mental weaknesses, and the successful ones have evolved to do this better than others. One way to be successful is to co-opt humans' moral sense and transmission mechanism. Humans have an urge to transmit their codes of morality, especially to children, and so religions (like Christianity) which make their followers believe that belief is morally good will produce believers who honestly and fervently try very hard to push an environment on children which will make them believe the same. And, of course, morality involves emotions like disgust and admiration that don't disappear just because you realize they're illogical.
Religions were invented as ways to explain in the absence of a better method: to explain how the world is how it is, and also to explain why we have moral feelings. But as a religion is passed down the generations, the religious come to take it as a reliable source of knowledge, and so a challenge to this method of knowledge-gathering becomes a challenge to the validity of morality (as they see it).
Religion comes from detecting agency where there is none. When humans see something happening or moving, it's safer to assume something is behind it (like a predator) and run, so humans are biased towards seeing agency. Apply this to trees falling, storms, floods, and build from there. This plays to people's fear that there's something huge and dangerous out there you don't want to annoy or challenge. Saying 'you didn't do all this!' in the face of a perceived claim to the contrary is quite a big challenge.