This seems to be quite typical for government consultations. There's very little in the way of rigorous process. I remember years ago in the UK there was a poll showing that people were worried about anti-money laundering laws and their effect on freedom and civil liberties (it was a poll about risks to civil liberties, I think). So the British government said they'd respond by ordering a consultation on how best to improve Britain's AML laws. They invited public comments, etc. Six months later the consultation was published, and it recommended making the laws even stricter. There was absolutely no evidence-based approach used at all.
I invented the "One Click" wheel.
Pfft. I invented the wheel with round corners.
And I was writing a few BASIC programs shortly thereafter. But today I would call them TRIVIAL: things I would do in a single method of a modern language, with much better style, correctness, comprehensibility and maintainability.
Having just learned programming myself didn't mean I was by any means an expert ready to work on big commercial problems worth lots of money. It took years more to learn a lot of important things. Structured programming (aka giving up GOTO). Encapsulation. Information hiding. Data structures and dynamic memory. Algorithms. Understanding the performance classification of algorithms. Understanding how the machine works at the low level. Writing toy or elementary compilers. Learning a LISP language (pick any one; they will teach you the same important and valuable lessons). Learning databases -- how they work as well as how to use them. Reading a few good books on human interface design before building a complex GUI program. I could go on and on.
> You can't learn how to build a highly optimised, always available, secure e-commerce trading platform in 8 hours.
Correct. The point, I think, is that acquiring all of the valuable skills that make you good at something, and fast at it, and apparently able to recognize solutions to problems very quickly takes lots and lots of study and practice. Years of learning. Failures (hopefully on some of your own toy problems first rather than commercial ones). Figuring out how to debug complex systems -- without, or prior to, the existence of source-level debuggers.
I don't have a lot of sympathy for those who cry because employers want skilled programmers. Well, professional sports teams want skilled players. And modelling agencies want beautiful people. These things come with some combination of luck of the draw and effort to take advantage of it. (Those models don't eat donuts, for example.) I also think computer geeks should be able to cry and whine that humanities studies are unfair.
How much of the grueling training is done simply to be grueling and exclude people based on their lack of stamina? Think of law school assignments where they throw a 100 page brief at you Friday to be handed in Monday that requires analyzing dozens of circuit, appeals and Supreme Court decisions, maybe a few hundred pages of congressional record to determine intent and then some history for context? Or the marathon race of medical residency where 100 hours is a normal week and 36 hours straight is a standard shift?
I think in some sense these kinds of things are done not because they make the profession any better but because they are exclusionary and keep the pool of competitors smaller. If you look at less exclusive jobs that need to be done right in organizations that depend on them being done right you see training done for results in a saner fashion vs. some kind of weird torture test.
> In the old days there was a respected profession of application programming.
> There was a minority of elite system programmers who built infrastructure and tools
> that empowered the majority of application programmers.
I think it is still that way. But now there is a third class who think that breaking into application programming is some kind of godlike elite skill because it requires you to actually know more than the mere syntax of a language. Programming is racist and sexist because it requires you to even learn the syntax of a programming language. Why can't the computer just do what they say? Why do they need a special language? Why should it be necessary to learn to design complex databases, and understand in-memory data structures and algorithms? Why focus on gaining lots of insight in order to come up with vastly superior algorithms?
In short, from what I see on some programming boards, what some people seem to want is a high paying position where an untrained monkey could get a computer to do what the boss wants, and then collect a paycheck -- um, no. Direct deposit.
They want to be able to be a programming superstar by reading a book such as:
* Learn Programming in 24 Hours!
* Learn Brain Surgery in 24 Hours!
* Learn Rocket Science in 24 Hours!
* Learn To Be A Concert Pianist in 10 EASY Lessons!
Various programming boards are flooded with people who want to know how to break into programming for big bucks, quick, overnight, but don't want to actually do the hard learning.
The next eruption, if it happens within the next couple of years, will be blamed on this experiment. This will happen regardless of any scientific support for such blame.
What do these systems cost without the inbuilt subsidies that monetize your information?
I presume they seem attractive to people generally because they seem inexpensive. Some of this low cost is due to the ever-decreasing cost of hardware, both on-site devices (e.g., cameras, sensors) and the back-end "cloud services" that enable end-user analytics and web connectivity. But a lot of this cheapness seems to involve subsidies provided by monetizing the information the systems gather and selling it to third parties.
I'm curious what these services would cost if they were offered without any monetization. Would they be cheap enough to be appealing?
I'm mostly thinking of turnkey solutions, not DIY systems where people cobble together their own collection of hardware and software. These may be cheap in dollar outlay, but once you factor in the cost of labor, time and expertise, they are pretty expensive and not available to most people.
I won't knock what you're doing but I'm curious what you get out of it that you couldn't get out of a Rand McNally trucker's road atlas and a dedicated GPS.
The dedicated GPS would give you turn-by-turn directions without any data service, and the atlas would give you decent printed maps for most highway planning.
As kids in the 70s we covered most of the Deep South and Eastern Seaboard in an RV with just a paper map. I don't remember us getting lost and we sure seemed to spend a lot of time off the beaten path.
I suppose the trip planning part would be OK if you were really compulsive about it, but it seems like a lot of work.
Grandparent post still stands. These reviewers will gradually be replaced by others who never had any experience scheduling the work.
I hadn't caught that, thanks. This makes the system a lot more interesting.
That's backwards. Philosophy only solves its nagging questions by *resorting* to math or logic or science.
The trend suggests that in the limiting case, humans are not intelligent.
When Netflix was just a DVD service, keeping up with the star ratings of movies you had watched wasn't hard. You'd log into the web site to manage your queue anyway and clicking on the ratings was simple.
Now so many people watch things via streaming that it's easy not to rate at all (and so many STBs make rating difficult or awkward anyway). Plus I'd bet that much of the streaming viewing is series, where rating kind of falls apart: you might watch a single show for a couple of weeks, losing the chance to rate many different titles, since a whole series gets only a single rating.
It makes me wonder if the suggestion algorithm ever included the critical quality of the movie, or just the user ratings. If critical quality was never a factor, skewing the catalog with bad titles makes the algorithm seem less effective, especially to a user who has already taken general critical reviews into account and just sees Netflix pushing bad direct-to-video titles.
If users are spending more time watching series, not rating because streaming has changed their interaction with the rating system, and the recommendation engine isn't taking movie quality into account, it's even easier to see how recommendations become increasingly useless.
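To make the "critical quality as a factor" idea concrete, here is a minimal sketch of blending a predicted user rating with an external critic score. Everything here is hypothetical: the function name, the 0.3 weight, and the 0-100 critic scale are all invented for illustration; Netflix's actual recommendation algorithm is not public.

```python
def recommend_score(user_pred, critic_score=None, critic_weight=0.3):
    """Blend a predicted user rating (1-5 stars) with an optional
    critic score (0-100, rescaled to the same 1-5 range).

    With no critic score, the system is ratings-only, so a poorly
    reviewed direct-to-video title can still rank highly.
    """
    if critic_score is None:
        return user_pred
    rescaled = 1 + 4 * critic_score / 100  # map 0-100 onto 1-5
    return (1 - critic_weight) * user_pred + critic_weight * rescaled

# A title users rate 4.2 but critics pan (20/100) gets pulled down:
# 0.7 * 4.2 + 0.3 * 1.8 = 3.48
blended = recommend_score(4.2, critic_score=20)
ratings_only = recommend_score(4.2)  # stays at 4.2
```

The point of the sketch is just that even a crude fixed-weight blend would demote critically panned titles, which a pure ratings predictor cannot do when users stop rating.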
I remember people said the same about smartphones. Waah, the battery only lasts a day, I'll never use one of those. Somehow smartphones still took over the world. People do go to sleep every night - a nice cordless charging stand seems like a relatively small issue if the devices are genuinely useful.