I'm talking unnecessary use of tech that gets us nothing in return.
The major source for this exploit is going to be remote updates of PLC firmware/logic. By remote, I don't mean "internet" but simply physically remote. This does give you something from a plant-operation point of view: it's one more task that's automated, and potentially a very expensive task given the dollars per hour for the people involved. So the tech does get you something, at the cost of additional risk.

These risks should be mitigable (security into the network, security of the network, security of the device), but it is certainly true that PLC-level security is not what it should be. I expect we'll see things rapidly improve on this front as mass IoT proliferation leads to many more problems, hence more focus on the inherent weaknesses in this area, and hence pressure forcing the big players to invest the necessary dollars in making their devices securable. SCADA and industrial automation have simply benefited from obscurity for a very long time and thus haven't had the selection pressure forcing the required evolution.
Let's say Theranos created a really slick USB device that lets a user do a blood test from their computer (stop laughing, it could happen).
Who's laughing? This already exists: I would hazard a guess that there are all kinds of patents around this tech. Seeing as the real innovation looks to be in the cool analog silicon interpreting DNA thing, I'd agree that the "driver" part is pretty much irrelevant.
...is my motivation to work in such a system?
If I do nothing, but am guaranteed a minimum basic income that lets me live, why should I work?
The motivation to work is about much more than simple survival. Now granted, when survival is at stake, motivation is going to be very high, and you can get all kinds of people to do unpleasant things in exchange for continuing to exist. But this is not the reality that we (the general
Yes, there are trade-offs in both directions. Apple made the choice for ref-counting so that you get a consistent UI experience and generally lower memory overhead. For phone-based apps I think this is the right choice. By the way, I don't think you can say that a GC is faster, just that it can be faster; in the real world it can also be slower depending on how it's being used.
Personally I find GC to be a pain in the rear, because it's generally a lot harder to resolve the GC performance issues affecting you than the equivalent RC problems. I think that for strongly typed languages a GC is the wrong choice, but it's certainly not a black-and-white issue, and most of the time it really doesn't matter.
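To make the trade-off concrete, here's a minimal sketch in Python (whose runtime happens to combine reference counting with a cycle-collecting GC; the `Node` class and the use of `weakref` are just illustration, not anything from the discussion above). It shows the RC strength, deterministic reclamation, and the classic RC weakness, cycles, which only a tracing collector can break:

```python
import gc
import weakref

gc.disable()  # isolate pure reference counting from the cycle collector

class Node:
    """Illustrative object that can point at another object."""
    def __init__(self):
        self.ref = None

# Acyclic case: the object dies the instant its refcount hits zero,
# which is what gives RC its deterministic, pause-free behaviour.
a = Node()
r = weakref.ref(a)
del a
print(r() is None)   # True: freed immediately, no collector involved

# Cyclic case: the two refcounts never reach zero, so only a tracing
# collection pass can reclaim the pair.
x, y = Node(), Node()
x.ref, y.ref = y, x
rx = weakref.ref(x)
del x, y
print(rx() is None)  # False: leaked until the GC runs
gc.collect()         # force a tracing pass (works even while disabled)
print(rx() is None)  # True: the cycle collector reclaimed it
```

This is also why ARC-style systems make you reach for weak references to break cycles by hand: there is no tracing pass coming along later to save you.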
Attitudes like yours really piss me off. Who is trying to push an agenda here? You're calling me a "wowser", whatever the *fuck* that is, because I "dared" to call out the bullshit that is your happy worldview that if we just got the "man" out of everything then it would all be better. Well, news flash, dickhead: the world and the universe do not give a fuck about what you think. Societies are too fucking complicated to model, so simplistic solutions such as "oh, if it were legal then everything would be better" don't always work out as planned. Smart people then look for something else.
It is better to give than to lend, and it costs about the same.