As sibling says, Visa charges this fee to the merchant, not the customer. Visa is in the transaction-processing business; it's banks that loan money.
But C is a low-level language, not the best tool for writing applications.
Higher level languages and managed runtime systems have gained so much traction for a reason. They are very productive to use. They protect you from simple mistakes. They relieve the burden of memory management. GC simplifies library APIs by making the question of who should dispose of what become irrelevant. We could still be programming in assembly language instead of C. Why aren't we? Why aren't OSes written in assembly? Because C is more productive and higher level. Similarly, there are higher level languages than C, and they have their place. C is not the be-all and end-all of abstraction.
GC is not about forgetting to free memory. It's about higher level abstraction removing the need for the programmer to do the bookkeeping that the machine can do. Why don't we still program in assembler? Because it's less productive. It's about productivity. As data structures become extremely complex, and get modified over time, keeping track of the ownership responsibility of who is supposed to dispose of what becomes difficult to impossible, and is the source of memory leak bugs. In complex enough programs, you end up re-inventing a poor GC when you could have used one that is the product of decades of research.
The article fails to understand that you can also run out of memory in a program using GC. Just keep allocating stuff and keeping references to everything you allocate.
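A minimal Java sketch of that failure mode (class and method names are illustrative, and the loop is capped so it terminates instead of actually exhausting the heap). The GC can only reclaim what the program can no longer reach; a reachable collection that only ever grows defeats it completely:

```java
import java.util.ArrayList;
import java.util.List;

// A GC cannot reclaim what the program still references. This "leak"
// pattern grows until the heap is exhausted; capped here so it terminates.
public class GcLeak {
    static final List<byte[]> retained = new ArrayList<>();

    // Allocate `count` one-kilobyte blocks and keep a reference to each.
    // Because `retained` is reachable, none of them are eligible for GC.
    static int allocate(int count) {
        for (int i = 0; i < count; i++) {
            retained.add(new byte[1024]);
        }
        return retained.size();
    }

    public static void main(String[] args) {
        System.out.println(allocate(1000)); // all 1000 blocks stay live
    }
}
```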
Reference counting is not real GC. Cyclic data structures will never get freed using reference counting alone.
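A small Java sketch of the cycle in question (names are illustrative). Under pure reference counting, `a` and `b` hold each other's counts at one forever, so neither is ever freed; the JVM's tracing collector sees the whole pair is unreachable and can reclaim it. Note that `System.gc()` is only a hint, so the final print is usually, but not guaranteed to be, "cycle collected":

```java
import java.lang.ref.WeakReference;

// Two objects that point at each other: under pure reference counting each
// keeps the other's count at >= 1 forever, so neither is ever freed.
// A tracing GC (like the JVM's) sees the pair is unreachable and reclaims it.
public class CycleDemo {
    static class Node { Node other; }

    static WeakReference<Node> makeCycle() {
        Node a = new Node();
        Node b = new Node();
        a.other = b;
        b.other = a;                    // cycle: a -> b -> a
        return new WeakReference<>(a);  // a weak ref doesn't keep `a` alive
    }

    public static void main(String[] args) {
        WeakReference<Node> ref = makeCycle();
        System.gc(); // only a hint, but typically collects the cycle here
        System.out.println(ref.get() == null ? "cycle collected" : "still live");
    }
}
```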
One of the major but under-recognized benefits of GC, which the article fails to mention, is that GC allows much simpler 'contracts' in APIs. No longer is memory management part of the 'contract' of an API. It doesn't matter which library or function created an object; nobody needs to worry about who is responsible for disposing of it. When nobody references the object any more, the GC can gobble it up.
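A tiny illustrative Java sketch of that contract difference (the factory name is made up). A C equivalent such as strdup must document that the caller frees the result; here, ownership simply isn't part of the API:

```java
// With GC, "who frees this?" vanishes from the API contract. The factory
// below hands out objects without documenting ownership; callers simply
// drop references when done.
public class Contracts {
    static StringBuilder makeBuffer(String seed) {
        return new StringBuilder(seed); // no dispose/free obligation on the caller
    }

    public static void main(String[] args) {
        String s = makeBuffer("hello").append(", world").toString();
        System.out.println(s); // the buffer becomes garbage; the GC handles it
    }
}
```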
On the subject of virtual machines, the article could mention some of the highly aggressive compilation techniques used in JIT compilers. Every instance method call in Java is a virtual call. But a JIT compiler knows when there is only one loaded subclass that implements a particular method, and makes all calls to that method non-virtual. If another subclass is loaded (or dynamically created on the fly), the JIT can recompile all methods that call the method so that they are virtual calls again. Even then, the JIT may be able to prove that certain call sites only ever reach a specific subclass, and those can remain non-virtual.
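A minimal Java sketch of the kind of call site this describes (the class and method names are illustrative, and the JIT's decision itself is not observable from plain source code). With only one implementation of the interface loaded, the loop's "virtual" call is monomorphic and a prime candidate for devirtualization and inlining:

```java
// Every instance method call here is virtual at the bytecode level, but
// when the JIT observes only one loaded implementation of Shape, it can
// devirtualize (and even inline) area() at this call site, deoptimizing
// later if a second subclass shows up.
public class Devirt {
    interface Shape { double area(); }

    static class Square implements Shape {
        final double side;
        Square(double side) { this.side = side; }
        public double area() { return side * side; }
    }

    // With only Square loaded, this call site only ever dispatches to
    // Square.area(), which is what lets the JIT make it non-virtual.
    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) sum += s.area();
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(total(new Shape[]{ new Square(2), new Square(3) })); // 13.0
    }
}
```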
The JIT compiler in the JVM can aggressively inline small functions. But if a class gets reloaded on the fly such that the body of an inlined method changes, the JIT will know to recompile every other method that inlined the changed method. Depending on the changes, it may or may not still make sense to inline it - so the inlining decision can change based on actual need.
The HotSpot JVM dynamically profiles code and doesn't waste time and memory compiling methods that do not have any significant effect on the system's overall performance. The profiling can vary depending on factors that vary from system to system, and could not be predicted in advance when using a static compiler. The JIT compiler can compile your method using instructions that happen to exist on the current microprocessor at runtime -- something that could not be determined in advance with a static compiler.
All of this may seem very complex. But it's why big Java systems run so darn fast. Not very many languages can have tens or even hundreds of gigabytes (yes GB) of heap with GC pause times of 10 ms. Yes, it may need six times the amount of memory, but for the overall benefits of speed, the cost of memory is cheap.
SF is the abbreviation for "real science fiction". SciFi is the abbreviation for action/horror movies with futuristic explosions. Harlan Ellison suggested "skiffy" as the pronunciation of the latter, and some have taken to writing it that way too. I hear Edge of Tomorrow was actually good SF, but I haven't seen it yet - one good SF film a year is lucky.
Plus you have films like Gravity, which wasn't even SciFi, but instead a historical period piece. Remember when we had shuttles, and the will to build vehicles that could launch men into space? Good times; good times.
If your costs for a box are $1 production and $1,000 R&D, then your replacement cost for shrinkage, shipping damage, and so on is $1/box. This guy might be 1 lost sale, but he certainly isn't 40.
The customer didn't print special cards here - they're just normal, expired cards.
The store doesn't call the number on the back of the card - the store calls their own merchant bank.
This was just straightforward grift (a con game), not some glaring flaw in the banking system. The sales clerks got suckered, perhaps due to lack of training by Apple, or perhaps the con-man was just that good.
The truth is that credit card interest is the highest profit gig in the whole world. Because of this, Visa/MasterCard...
Visa/MasterCard make $0 off of interest. They charge a fee for the convenience of not having to use cash. They're not in the "loaning money" business at all, and of course TFS talks about debit cards, not credit cards.
Vendors are not even allowed to do things like require an ID (I know they do, but it is against the vendor agreement), even though it would make purchases a lot more secure, because EASY trumps everything - EASY makes billions.
Easy is what the customers want. For normal fraud with actual credit cards (nothing to do with this story, of course), it's the merchant who eats the fraud for ID theft. But merchants sign up for that, because they'll have less business if they're inconvenient for their customers.
Security is not the primary goal here, nor should it be. The only goal of any security here is to limit losses system-wide to something manageable. And it does that just fine.
Even when the existence of the UK was largely no longer in question, the British firebombed Dresden punitively.
You could even say copyright enforcement is mutually exclusive with justice and proportionality.
Don't believe me? Just ask anyone who has been hit over the head with a computer.
Yeah, the good ol' times. When it was other users that trolled you, not the page owners themselves...
WiFi vs powerline networking is very much house-specific. Modern WiFi works great in most places these days, but it matters what's in the walls between here and there. Powerline networking works great in some places, but others are wired so that there's no signal at all between certain rooms, because of deliberate isolation (I've only heard of that in newer houses). One or the other is pretty likely to work for you if you can't get GbE wired where you need it. At least try to wire up your most-used TV.
I use the Handbrake command line to rip these days, because of the DVD/BR trickery, language choices, and so on. With a few scripts I get the audio and subtitle tracks I want, reliably. I have a little program to scan the DVD (using handbrake) and generate a ripping script, needing as input only whether it's an animated title, which 90% of the time I can just run. It's only the stupid BluRays where there are many bogus titles that it takes me more than a few seconds of effort these days.
BTW, Disney isn't the main villain when it comes to the fake BR tracks, that's LionsGate. I haven't seen that from Disney for some time now (Disney cartoons do typically have 1 title choice per language, since they localize on-screen writing and sometimes the credits, but that's just a few titles to sort out, not the 9-title puzzle box).
Apple didn't come from behind in the smartphone market. They created the market. Microsoft and Blackberry had the bulk of the market share, both based on old OSes that had been stagnant for quite a while with no real innovation. Blackberry didn't even offer a touch screen device yet, and Microsoft's devices could hardly even be used without a stylus.
Apple introduced revolutionary new hardware - capacitive multitouch - which IMO was one of the primary reasons for the success of the iPhone. The other was an OS UI built from the ground up for a touch interface. That was a knockout combination.
So no, Tizen doesn't have much chance unless they can bring revolutionary advancements, either hardware or software, like Apple did (and they brought both at the same time).
Gold notes were just as virtual as anything else. Physical gold coins, or barter for consumables, is the only way to avoid virtuality, and there were many practical reasons we went away from that. Nothing, of course, will prevent a government from debasing a currency - it's what they do, it's all they do.