What? I had to read this a couple of times. The historic norm was for a single operating system to serve multiple applications. Only with the advent of distributed computing did it become feasible, and only with commodity hardware did it become cost-effective, to dedicate a system instance to a single application. Specialized systems for special purposes came into use first, but the phenomenon didn't really begin to take off in a general way until around 1995.
Perhaps expertise in education is like any other field. There are operators trying to work the system, and there are pretenders, and there are also many sincerely dedicated people, plus a few geniuses and leaders, and there is a lot of deadwood. Some fields are so exacting that deadwood has little place. So it is in computer science and software engineering, for example. Perhaps in education it's naturally more forgiving, more relaxed. Given a choice between the two, teachers would rather cultivate a learning environment than bring out the rule book, wouldn't they? Perhaps that makes it relatively easy to hop on board and get a free ride, I don't know. How would one fix that?
Agreed. And this is exactly what I mean by "indoctrination", including all the creepy implications that go with it.
Am I maybe channeling a bit of old resentment here? Sure, could be. I'm one of those many people who moved around internationally on occasion during their school years. As a newcomer, you're often reminded of the disadvantages of not having an established place in the order, not knowing the code. On the other hand, you get to be the one who points out that the emperor has no clothes, and you also come to understand that these strange patterns of shared assumption, while arbitrary, are generally innocent. The egregious exceptions are the ones that are deliberately set up by people in power. That's indoctrination - a suppression of critical thought - and we should be highly alert to it.
Okay, it's been half a century since I took a test intended for children entering elementary school. I recognize a few of the sentence forms. Somebody has a certain number of guitar picks and gives some away, no problem. But the bizarre pennies-to-coffee-cup equivalence, what the fuck is up with that? Who thought it was a good idea to assume that young children would know that the sentence in "number sentence" means what the rest of the world generally calls an "equation", or that a "subtraction story" conversely means a word problem? What is a "related subtraction sentence" and how does it differ from an ordinary subtraction sentence? Why are you using passive voice to ask questions of a five-year-old? Why do you think we need cubes to solve a linear equation?
What's meant by the fragmentary term "part I know"? Dude, I have no idea what you know. Try speaking in full sentences, like we're taught in school. Oh, right.
In short, this seems substantially to be a test of cultural indoctrination whose arithmetic pales in comparison to the challenge of getting inside the parochial mind of whoever developed the test. I'd be proud if my child failed this test. It's beyond absurd; I find it positively bigoted. These people need to get out and see more of the world.
I wonder whether the "walkabout" rituals in Aboriginal cultures aren't specifically intended to address this phenomenon. According to Joseph Campbell, the ritual often involves a scene in which the men of the community theatrically come to capture the boy and drag him away. He instinctively hides or runs to his mother for protection, but theatrically she is unable to protect him. So off he goes to make the terrifying and irreversible transition to adulthood.
What happens in modern urban cultures where we don't have any such ritual, indeed where the transition to adulthood is deferred until graduation from university or is completely indefinite? The status quo psychological attachment to childhood is sustained for much longer. Perhaps with long familiarity it becomes more difficult to break. But I think that the complex social norms and risk/reward pressures of modern life - acutely evident in Japan - are the biggest factor. No child in his right mind would want to sign on to them.
Length: 31 m
Width: 15 m
Height: 6.30 m
Draft: 1.55 m
Weight: 89 t
Average speed: 5 knots (9.25 km/h)
Surface area of solar modules: 516 m2
PV panel efficiency: 18.8%
Installed PV power: 93.5 kW (127.0 HP)
Maximum engine power: 120 kW
Average engine consumption: 20 kW (26.8 HP)
Your figure of 89 t refers to the total weight of the ship, not the weight of its batteries, so your calculations are out by an order of magnitude. The claimed recharge time is two days.
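Here's a rough sanity check of the order-of-magnitude point. Only the 93.5 kW installed PV power and the two-day recharge claim come from the spec sheet above; the peak-sun-hours and battery energy density figures are my own illustrative assumptions, so treat the result as a ballpark, not a measurement.

```python
# Back-of-envelope: how heavy would a battery be if the full PV array
# recharges it in the claimed two days?
PV_POWER_KW = 93.5              # installed PV power (from the spec sheet)
PEAK_SUN_HOURS_PER_DAY = 6      # ASSUMED equivalent full-power sun hours/day
RECHARGE_DAYS = 2               # claimed recharge time
ENERGY_DENSITY_WH_PER_KG = 100  # ASSUMED lithium-ion pack energy density

# Implied battery capacity in kWh, then pack mass in tonnes
capacity_kwh = PV_POWER_KW * PEAK_SUN_HOURS_PER_DAY * RECHARGE_DAYS
battery_tonnes = capacity_kwh * 1000 / ENERGY_DENSITY_WH_PER_KG / 1000

print(f"Implied capacity: {capacity_kwh:.0f} kWh")
print(f"Implied battery mass: {battery_tonnes:.1f} t")
```

Under these assumptions the pack comes out around 11 t, roughly an order of magnitude below the 89 t total displacement, which is consistent with 89 t being the whole ship rather than the battery.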
We have to look critically at any promising new technology, if only because the odds are strongly in favor of it failing to take root. Sometimes the underlying ideas are not entirely sustainable, sometimes a competing technology takes the lead (whether or not due to merit, the result is the same), sometimes there isn't enough market uptake to sustain it.
I'm happy to learn that most of my technology bets have paid off, but that's at least in part due to betting conservatively. This isn't cynicism, it's a recognition that when faced with an abundance of choice, one way to reduce the flood to manageable levels is to have good rejection heuristics. What's left can then be examined in depth.
The point you make about the inherent cost of change is valid as well, though if anything you understate it. There are secondary effects that make technology change particularly expensive. Technology rarely stands alone; more often it has to be integrated with current systems and practices as well as emerging ones. The transition therefore brings many small but pervasive changes to accommodate both the old and the new technology. If your existing app has required a lot of Apache customization, it will take time to transition it to nginx. Meanwhile the requests for customization don't suddenly come to an end, so now you find yourself tracking twice the number of changes while also managing the platform transition. The net effect is combinatorial.
I managed an AI research lab for a dozen years. Our goals were always pragmatic and our methodology as rigorous as we could make it. We didn't need to engage with the "AI of the gaps" problem because, in the first place, nobody was trying to formulate grand claims, and in the second place, there was plenty of useful work to keep us busy. The main philosophical shift I saw in my time there was toward situated intelligence, but in practical terms there was no big shift for us, because most of what we were doing was situated in sensing and action anyway.
From that experience I've taken away a sense that it makes no more sense to talk about AI in the abstract than it did for the classical philosophers to debate the origins of the universe. Presumably there are answers to these questions, since we have a universe and we have something which we're prepared to agree is intelligence. But it can take a very long time to build up enough empirical knowledge to demonstrate an answer convincingly. It's funny that when that finally happens, all the attendant philosophical paradoxes and conundrums, charming and elegant though they may be, simply fall away. There's a bit of a philosophy of the gaps as well.
Randomness is not a sufficient mechanism for free will in any case. Lots of heuristics in AI are pointedly nondeterministic, but designing them in this way doesn't suddenly confer free will onto them. For example, I think that the process of "simulated annealing" for heuristic search is quite elegant - it always terminates once its cooling schedule runs out, with shorter schedules yielding solutions that are less likely to be optimal - but the process is completely explicit and nowhere draws upon something we could reasonably call free will.
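The point about explicit nondeterminism can be made concrete. Below is a minimal simulated-annealing sketch (the function names, linear cooling schedule, and toy objective are my own, for illustration): every "choice" the algorithm makes traces back to an explicit pseudorandom draw, and seeding the generator makes the whole run reproducible - nothing here resembles free will.

```python
import math
import random

def simulated_annealing(cost, start, steps=10_000, t0=10.0, seed=42):
    """Minimize `cost` by annealing; every decision is an explicit RNG draw."""
    rng = random.Random(seed)  # fully deterministic given the seed
    x, best = start, start
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9      # temperature: linear cooling
        candidate = x + rng.choice([-1, 1])  # propose a random neighbor
        delta = cost(candidate) - cost(x)
        # Accept downhill moves always; uphill moves with probability e^(-delta/t),
        # which shrinks as the temperature falls
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = candidate
        if cost(x) < cost(best):
            best = x
    return best

# Toy objective: a parabola with its minimum at x = 17
result = simulated_annealing(lambda x: (x - 17) ** 2, start=100)
print(result)
```

The run is "nondeterministic" only in the sense that it consults a pseudorandom number generator; fix the seed and the same sequence of accept/reject decisions replays exactly.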