Maybe he was just blinded by science.
Distributed ledgers have some value, but there are not many applications where the cost of the bitcoin approach is justified. All this talk of the blockchain in the finance industry is interesting but frankly smacks a bit too much of "me too" bandwagonism for my liking. I really struggle to understand the benefits of a distributed ledger in most financial transactions. I certainly can't see the value given the latency and volume constraints of the current bitcoin implementation.
I think public key cryptography is _vastly_ more important than the blockchain to name just one.
It's a complicated question, and one that presents me with difficulties. Let us assume that we live in a country with separate Executive, Judicial and Legislative powers. Despite failings, this is largely true of the UK. If the executive (police etc.) want to spy on someone they need legislative authority, and I would like them to have a second, independent step, by which someone evaluates whether the purpose of their spying is within that legislative authority. That would be a judge. I am not convinced that the Home Secretary (an Executive position) is the right institution to be conducting this evaluation. Judicial oversight would be more comforting, methinks.
I don't have a problem with the state hacking for the purposes of investigation. Placing the existence of this capability into the public domain certainly impacts the probative value of information found on a device (the planting of false evidence becoming likewise easier). This is akin to weighing the finding of physical evidence against the probability that false physical evidence was planted during warranted access to a suspect's property or person. Corruption is the problem here, not the means by which it is effected.
What concerns me most of all is the creation of legal processes which are not subject to the scrutiny of public view. It is this issue that should be at the top of all the agitation about the progress of these courses of action. Secret courts, or injunctions whose existence cannot be mentioned, are frightening, and indeed so Kafkaesque as to be worthy of a new round of parable fiction.
(51% of the water in CA is given to animal agriculture.)
Are you sure? That number seems well out of whack with my understanding of how water is used in most agricultural water systems. First, you probably mean that as a percentage of the water _consumed_, because it is unlikely that more than 50% of the water in California is consumed; most of it will be used to manage the system itself (checking facts.... yep... http://www.scpr.org/news/2015/...). Once you correct for that detail and turn to agriculture: fixed plantings and cropping are metered and use gigalitres per annum, but livestock water is such an insignificant amount that it's not even metered (as long as the pipe is small enough). Perhaps in the US (and the big valley in particular) feed is a big part of that cropping.... rudimentary googling suggests it is nearer to 25% than 50%, and that includes alfalfa, or nearer to 10% if you are measuring irrigated pastures. It's a bit different where I am from, since we don't usually irrigate pasture except for dairy use.
I wholeheartedly disagree with almost everything you say, but if you are going to run the argument you may as well use facts a little closer to reality. Who knows, your argument might even hold water for some folk under those conditions, if you will excuse the pun.
I am an econometrician (well sort of), which is probably worse, but at least we know that. But economics, independent of any data set availability or actual method problems, is broadly handicapped by the generally unobservable nature of the actual data that would enable the verification (or refutation) of a hypothesis. That is, much of the data is quite noisy with many variables mixed in with each other, and as such a big part of the work is trying to determine the extent to which the data itself is a useful measure of the thing being tested. Sometimes getting to a useful dataset is dependent on some awkward assumptions. As such, one of the biggest faults of Economic Theory is assuming a can opener (https://en.wikipedia.org/wiki/Assume_a_can_opener).
As an outsider who writes software for a living (proper, highly available transactional systems; finance industry, but I do know some general stuff), this amount of money is simply staggering. Even if we assume the published number ($4.3B), 3% inflation, a relatively aggressive annualised ROI, and 10 years over which to apply the costs, that turns into between 80 and 160 million dollars per year in costs [20% ROI to 10% _annual_ ROI]. IN COSTS. Even if you margin those costs at 33% (profit is already accounted for, so the margin is on costs and risk), that's still 50 - 100 million dollars a year of costs to develop and support this system, every year, for 10 years. WTF kind of project are they planning? People have written software that changed the freaking world with a fraction of that amount of money.
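For concreteness, one way to annualise a lump sum like $4.3B is a sinking-fund calculation: what constant annual cost, compounded at the assumed ROI, accumulates to the total over the period? The comment doesn't spell out exactly how its inflation and ROI figures were combined, so treat this sketch as an order-of-magnitude exercise rather than a reproduction of the 80-160M numbers:

```python
# Sinking-fund sketch: the constant annual cost that, compounded at rate r,
# accumulates to `total` over `years`. Assumed method, for illustration only;
# the original comment does not state how its figures were derived.

def annual_cost(total, r, years):
    """Annual payment whose future value at rate r equals `total`."""
    return total * r / ((1 + r) ** years - 1)

total = 4.3e9  # published project figure
for r in (0.10, 0.20):
    print(f"ROI {r:.0%}: ~${annual_cost(total, r, 10) / 1e6:,.0f}M per year")
# ROI 10%: ~$270M per year
# ROI 20%: ~$166M per year
```

Either way, the annual run rate implied by the headline figure is enormous for a records system.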
Now, having said all that, I have a little window on the way a different government developed their budget for an IT project. They knew that the new project would make 60 people redundant, so they looked at the cost of those people, multiplied it by some number of years for the scope of the new system, and went: "There you go, 30 million dollars."
There is something very, very, very wrong with government.
BTW, there are about 20M veterans in the USA; give 'em all 200 bucks and let them keep scans of their own records on a freaking thumb drive, backed up to S3 or something. That might even actually work!
Medallion owners bought the medallions with the understanding that they were buying into a limited monopoly.
Maybe it should be clarified here: it's not the government charging $200,000 for a taxi medallion; that's just the going price on the secondary market. You know, good old capitalism, where people bid up the price of an __un__necessarily limited commodity.
The taxi authority looks at population, traffic flow and transportation needs and comes up with a number of taxis that they think should be on the street. Every year they add new medallions into the system, usually with a lottery. The idea is not so much to protect the cab drivers (cities don't care about cab drivers; if they did, they wouldn't make minor traffic fines, like your cab being 10 inches over the line of a designated taxi waiting zone, as much as $500, which practically wipes out the cab driver's week), but to keep the number of taxis from getting so crazy that you have cabs clogging up city centers, fighting for fares.
There you go, I fixed that for you.
If the regulator's approach to the problem described was the correct one, then why can't I get a fucking cab when I want one? There are many more solutions to the problem of oversupply that you identify; indeed, one can quite happily argue that Uber actually have one.
I don't want my fourth choice getting in. At some point I might want to waste my vote instead of having it count towards the lesser of two evils. But the system in Australia doesn't allow for that (anymore).
Even under the old regime your empty vote was still a vote for the ones you don't want, because once your paper expired it was removed from the pool of votes, making everyone else's vote count a little bit more from that point onwards.
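The mechanics of that "expired paper" effect are simple arithmetic; a toy sketch (numbers invented for illustration):

```python
# Toy illustration of ballot exhaustion under preferential voting:
# once a paper runs out of preferences it leaves the pool, so the
# majority threshold is computed over fewer live ballots.

total_ballots = 100
exhausted = 10                      # papers with no further preferences
live = total_ballots - exhausted    # 90 ballots still in the count

majority_of_all = total_ballots // 2 + 1   # threshold if exhausted papers still counted
majority_of_live = live // 2 + 1           # threshold once they are removed

print(majority_of_all, majority_of_live)   # 51 46
```

Removing the 10 exhausted papers lowers the winning post from 51 to 46, which is exactly how an "empty" vote ends up helping whoever the remaining voters prefer.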
Except that it only works because, where there is no compulsory voting, there is no reason to record who voted, and as such this fraud is trivial. If everyone has to vote, then the mechanism that checks your compliance also checks others' fraud. Now, in most places it is not perfect (where I am from we don't even have to give ID of any kind, just a name), and so an attempted fraudulent vote will almost certainly get past the initial hurdle of getting the ballot paper into the box. However, the system has a number of natural checks that detect the fraud at later stages of counting and reconciliation of the rolls: the total number of ballots cannot be greater than the number of registered voters; the rolls from the multiple voting centres are collated and checked for duplicates; in places like India, they stain the finger of a voter to ensure they do not vote more than once; etc., etc.
Compulsory voting (CV) does not guarantee the absence of fraud (mostly it's old people who forget they have already voted), but fraud is _vastly_ reduced simply because of the nature of what CV means for the election as a whole.
One big problem with this plan for democrats: Voters would have to present ID to get credit for voting.
Nope, not a problem. I live in a compulsory voting regime. I show up in my electorate (now there's the trick) and give my name (no ID) to the person who will give me my ballot paper. They cross it off in a big book (well, a set of books, organised by family name). If everyone votes, then proving who you are is less of an issue, because if you go to vote and your name has already been crossed off, then there is a problem. At the end of the process they check (probably scan) the books for the absent voters, check those against the postal/absentee votes, and then proceed with enforcement (such as it is).
They probably don't even do any of that until the result for that electorate is within the tolerance of the missing/absentee/questionable votes.
With 200m electors in a presidential election (even given the electoral colleges) you might do better with something a little more electronic. But the key is that you don't need ID if everyone votes, because everyone with suffrage is just in the book and you only care about double-ups and no-shows.
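The reconciliation checks described above are easy to sketch; a toy version (all names invented):

```python
# Toy reconciliation of electoral-roll mark-offs collated from several
# voting centres, as described above. All names are invented.
from collections import Counter

registered = {"Ada Lovelace", "Alan Turing", "Grace Hopper", "Edsger Dijkstra"}

# Names crossed off, collated across centres; a duplicate suggests fraud.
mark_offs = ["Ada Lovelace", "Alan Turing", "Alan Turing", "Grace Hopper"]

counts = Counter(mark_offs)
duplicates = {name for name, n in counts.items() if n > 1}  # crossed off twice
no_shows = registered - counts.keys()                       # enforcement list
unknown = counts.keys() - registered                        # not on the roll

# Natural check: ballots in the boxes can't exceed registered voters.
ballots_cast = len(mark_offs)
assert ballots_cast <= len(registered)

print(duplicates, no_shows, unknown)  # {'Alan Turing'} {'Edsger Dijkstra'} set()
```

The point is that once the roll itself is the compliance record, double-ups and no-shows fall out of the same collation pass.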
I am a big fan of consumption taxes; however, the tax is of itself a regressive tax. The prebate approach is the same as is used in the VAT world: it's just that the household is allowed to treat its "necessities" as inputs and as such claim back the CT, GST, VAT, whatever you want to call it, on those inputs. It's an interesting one. I have mixed feelings about the approach, but it is certainly one way of addressing the inequities of flat consumption taxes.
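As a sketch of how a prebate softens the regressiveness (rate and allowance invented for illustration):

```python
# Toy prebate arithmetic: everyone pays the flat rate at the till, and the
# tax on a fixed "necessities" allowance is rebated back. The rate and
# allowance here are invented for illustration.

RATE = 0.10
NECESSITY_ALLOWANCE = 20_000  # annual spend treated as necessities

def net_tax(spend):
    """Flat tax on all spending, minus the prebate on the allowance."""
    prebate = min(spend, NECESSITY_ALLOWANCE) * RATE
    return spend * RATE - prebate

# The effective rate now rises with spending instead of being flat:
for spend in (20_000, 50_000, 100_000):
    print(f"{spend}: pays {net_tax(spend):,.0f} (effective {net_tax(spend) / spend:.0%})")
# 20000: pays 0 (effective 0%)
# 50000: pays 3,000 (effective 6%)
# 100000: pays 8,000 (effective 8%)
```

Same flat rate at the register, but the rebate makes the effective burden progressive, which is the whole point of the prebate.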
And everyone in the UK-speaking world dies laughing/retching.
Is what I have used. I have found it quite adequate. Hooked up a graphics tablet to my machine and I could draw like a 3-year-old. With practice I might get better.
The content providers license the program to a distributor in the region for big dollars. In Oz, for example, it is the pay-TV operator, which then uses that "desirable" program as a drawcard for subscribers and hence advertising dollars and, hopefully, profit. If people can legitimately buy it from Netflix they don't need the pay-TV intermediary.
So the content providers (HBO et al.) won't license, say, "Trade of Toilets" or "Zombie Apocalypse series 13" to Netflix in Australia, since they are already contractually bound to FoxTel (the provider). They will probably always get a better price from the network distributor than the sum of the paid views from Netflix*.
*How soon before that changes? I expect that "unbundling" and IPTV will be the death of these deals so perhaps this is all a shortish term issue.
More than 20 years ago I had a full and frank exchange with a macweenie friend of mine, in which I posited that, in the vast majority of cases, the core "functionality" of the work we were doing was already within the capacity of the processors available at the time, and that the advances in speed to come would all be about enhancing the user experience of that core.
What I meant was things like the calculating of the spreadsheet cells, or the redrawing of the document window, and so on.
The parent poster is absolutely right: this trend is ongoing, and the amount of "work" that I can get my compute resources to do via more and more sophisticated interactions is only going to increase. The more encompassing that work becomes, the more it can be broken down into smaller, discrete, and hence parallelizable tasks.
Having said all that.... my professional expertise is in quite high-performance transactional software, and Linus's statement is absolutely true. I'll take cache size/control over a proliferation of cores any day; given a certain number of cores, and within that all the goodness of branch prediction and out-of-order execution, four sounds about right. So much so that we find situations where adding cores actually reduces our performance, we suspect due to caching issues.
So in essence there are two trends. From Linus's perspective he is right: the time spent on parallelism is not worth it. At a more macro level it is. Perhaps that macro level is an application-software level rather than a system-software level, and hence the difference in viewpoint.
"You're a creature of the night, Michael. Wait'll Mom hears about this." -- from the movie "The Lost Boys"