You haven't succeeded in establishing a common language. You have to do that first.
As for airbrushing, there was an interesting story run by the BBC a month or two back. Apparently Queen Elizabeth I signed into law the right for theatrical companies to kidnap children off the streets without limit or redress in law. It was apparently boasted that not even the children of nobility were safe.
This isn't the sort of thing that gets a lot of mention in productions set in that period.
Norse-style grave goods were also still deposited at the time; apparently some Londoners considered themselves the true descendants. Probably a bit like modern Wiccans, except with big, sharp, pointy swords and large quantities of booze.
Other cults doubtless existed before, then and after.
As for medicines, the oldest actual medical book (not a reprint or modernized version) is from 1772. It isn't old enough to get a feel for really old cures, but it is certainly old enough to get a feel for what people were trying. Hemlock (as a drink and a poultice) is well-known, as was the use of mercury. Tea made from ground ivy doesn't sound good and the treatment that calls for the patient to down half a dram of antimony probably wasn't one that encouraged people to call on the doctor very often.
Fireworks listed in a book from 1776 could not be used today - the compounds and gases produced are too carcinogenic. Companies hiring pyrotechnic experts back then presumably didn't have to worry about sick leave or pensions.
Head lice, tropical diseases brought in by sailors, personal zoos of deadly animals, cholera epidemics, the occasional plague, religious extremists... No, wait, that's modern life, isn't it? When was the Children's Crusade, anyway?
Just take a survey of all Indian government software licenses. Given the expense and the insanity involved in tracking MS licenses, I'm sure they could be found to owe at least $3.4 billion in licensing and penalty costs.
I'm working on it. Seriously. They dug up an Iron Age settlement not far from where I used to live. My father was one of the scientists on the team, providing magnetometry, ground penetrating radar and mass spectrometry. One of his colleagues from the university provided geological analysis and a colleague from another university handled conservation of things like amber artefacts.
I have all that data (not all of which was released to the public) plus the archaeological reports and a decade of photographs of the site from over a dozen people, plus photographs of very nearby contemporary monuments.
My plan, from the very start, has been to turn this into something educational. My thought has been to construct a virtual reality, much like Virtual Rome (dunno if that is still being run), so that people could see a reasonable reconstruction of what the site looked like - not just in one era, but in each of the eras for which sufficient data exists.
Admittedly, this has turned out to be a very difficult project - well beyond my artistic skills and a very tough challenge for the virtual reality software that has been open sourced. Any help - any at all - would be gratefully received. I tried to get kickstarter funding, to hire the necessary talent, but kickstarter rejected the project outright as too freakish.
If that project fails, I am looking to see what other educational uses I can put the data to.
What you have is two different world views that lack a single frame of reference for an honest dialog. Doing anything other than trying to establish such a frame of reference (which is what Dawkins et al. do) is fruitless.
Science doesn't need anything; it's science. The last thing I'd hope anyone would try to use it for is proving an unprovable statement. That would seem to be the atheist version of heresy.
I would concur with that. What I have proposed is at the upper limits* of what can be achieved by a single entity running a single system. If you need finer granularity, more dimensions or greater timelines to give everyone a fair chance in life, no single entity (corporate or government) could do it.
*It may actually be beyond them - not financially, but organizationally. No entity has shown the capacity to predict the optimum path for each student individually, track it, and correct it at a moment's notice. Then there is performing a travelling-salesman-style heuristic for that many people, remembering that exactly ten people with identical requirements in a subject can enter a location simultaneously, that people with different needs should never enter the same location simultaneously at all, and all the other constraints...
It is possible that the problem is too big, that it must be distributed somehow. The internet is a powerful tool for that, but it has to be used correctly.
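The packing constraint described above - locations hold exactly ten people with identical requirements, and requirements are never mixed in one location - can be sketched as a greedy bucket-filling pass. This is a toy model, not anyone's real system; the profile strings and the helper names are invented for illustration:

```python
from collections import defaultdict

BUCKET_SIZE = 10  # "exactly ten people with identical requirements" per location

def assign_locations(students):
    """Greedily pack students into locations: one requirement profile
    per location, at most BUCKET_SIZE students in each."""
    by_profile = defaultdict(list)
    for name, profile in students:
        by_profile[profile].append(name)

    locations = []
    for profile, names in by_profile.items():
        # Split each requirement group into full-as-possible buckets.
        for i in range(0, len(names), BUCKET_SIZE):
            locations.append((profile, names[i:i + BUCKET_SIZE]))
    return locations

# 12 students with one requirement profile, 7 with another (made-up labels).
students = [(f"s{i}", "maths-fast") for i in range(12)] + \
           [(f"t{i}", "maths-slow") for i in range(7)]
for profile, group in assign_locations(students):
    print(profile, len(group))
```

The greedy pass is trivial; the hard part the post points at is doing this across every subject, timeslot and student simultaneously, which is where the distribution problem really bites.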
I have worked with computer aided learning, in the sense of designing it and experimenting with its limits, back in the 90s. It was grossly underutilized; people looked at it only as a book with clickable images and audio. Internet whiteboards, collaborative tools, shared documents - all existed back then. So did multi-way videoconferencing, telerobotics and all kinds of other nifty teaching aids. Almost none were used then.
Today, some of these are used, but the technology has not stood still. Not just data but entire applications can be pushed from machine to machine. Sensors can track hand motions, allowing instructors in music, sculpture, painting or, indeed, archaeology to know precisely what is or is not happening, instant by instant. Simulators can compare expected results with the actual, long before anything is finished. In science, DIY spectrometers can tell chemistry lecturers everything they need to know.
There will be ideas I haven't even stumbled upon. My knowledge is broad, but technology is broader by far.
But these aren't being used. Computer Aided Learning remains 20 years behind the curve at best, 40 years behind at worst (80 if you include YouTube videos of lectures - the Open University was providing that sort of material a long time ago). If you want a revolution at the level of individuals running the show, that is where to start. You need between a quarter and half a century of development to be factored in. That is a lot, and inertia is high. If MIT can't be more original than a video camera, an achievement sci-fi conventions could boast of even in the 80s, the people with the knowledge will not adapt to new methods by choice.
Dark chocolate is kind of healthy, and can be organic as well as fairly traded.
I'd suggest the following replacement:
I mean, it's for kids! In Africa! How better to not be evil than by actively doing not evil?
If he's stuck on Access, he's more or less stuck on Windows. He'd need some manpower to convert those ancient DB files; it would be worth it in the long run, but he might have a tough time convincing management of that. I'd suggest a skunkworks program done in secrecy over a period of time, then proposing the solution when you already have it up and running in a back office.
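One low-risk way to start that skunkworks conversion, assuming the Access tables have already been dumped to CSV (mdb-export from mdbtools can do that on Linux), is to load the exports into SQLite as a proof of concept. The table name, file contents and helper below are all made up for illustration:

```python
import csv
import io
import sqlite3

def load_csv_into_sqlite(conn, table, csv_file):
    """Create `table` from a CSV export's header row and bulk-insert its rows."""
    reader = csv.reader(csv_file)
    header = next(reader)
    cols = ", ".join(f'"{c}"' for c in header)
    conn.execute(f'CREATE TABLE "{table}" ({cols})')
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', reader)
    conn.commit()

# Stand-in for a file produced by something like:
#   mdb-export legacy.mdb Customers > customers.csv
sample = io.StringIO("id,name\n1,Alice\n2,Bob\n")
conn = sqlite3.connect(":memory:")
load_csv_into_sqlite(conn, "Customers", sample)
print(conn.execute("SELECT COUNT(*) FROM Customers").fetchone()[0])
```

Everything lands as untyped text here, which is fine for a demo; a real migration would add proper column types and constraints before showing it to management.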
Thank you for the link. It looks intriguing.
Basically, my assumption has been that you can treat education as a problem in multi-variable space; that it cannot be reduced until all the variables have been identified, together with their relationships and interactions; but that once reduced to the simplest elements, those elements should naturally form a very simple pattern or weave. If my reduction is inaccurate, the weave I have produced will be flawed. Threads will tangle, patterns will become disjoint or incoherent. The same is true if any of those three core assumptions are wrong.
This book you pointed me to, along with any others I find, will definitely give me different perspectives. If my three assumptions are correct, all perspectives should reduce to exactly the same atomic components. If they do not, the idea of atomic components must be wrong.
It is also the case that a different perspective might lead me to reject utterly my entire line of reasoning (won't be the first time, won't be the last) and adopt an entirely new outlook. That can be a very good thing. Never be afraid to learn. I have an interesting mind but far better ones exist for this sort of work. It would be foolish to ignore the ideas of others.
And if it kinda goes along with my thinking? My ideas evolve constantly, even during a post. The very worst that can happen is that I'm inspired and correct mistakes in my thinking. As tragedies go, that seems acceptable.
I look forward to the book, and hope I find many more.
MS Access, really? I'd like to think those aren't used by anyone for anything serious. I haven't had anyone ask me to do anything with Access in a long time. I hope that means it's really dead.
Care to share the distro of choice on those Linux-based non-Chromebook machines? Is it a free employee option? Are there a set number of pre-approved distros? Is there a top-secret Google GNU/Linux distro that dispenses chocolates on the half hour?
I absolutely agree. My idea of having a symmetrical arrangement for speed and creativity is that there will be brilliantly creative people who need a lot of time, and amazingly fast people who have the creativity of a lettuce leaf.
In terms of +/-, because this is 2D, I would describe these as -3 + i and +3 - i.
Now, because everything is done per subject, you can be -3 -i in absolutely everything but basket weaving, where you might be +3 +i. Would you be a success or failure? Broad society would probably say failure, but I can absolutely guarantee you would have a very successful, highly profitable business and international acclaim in the art world. That sounds like a more interesting measure.
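Treating speed as the real axis and creativity as the imaginary axis, the per-subject scores above map directly onto complex numbers. A purely illustrative sketch (the subject names are invented; the ±3 and ±i scale comes from the post), which also shows why a broad average is a poor measure:

```python
# Speed on the real axis, creativity on the imaginary axis, per subject.
profile = {
    "mathematics":    -3 - 1j,   # slow and uncreative here...
    "basket_weaving": +3 + 1j,   # ...but fast and creative here
}

def overall(p):
    """Naive aggregate: mean position across all subjects."""
    return sum(p.values()) / len(p)

print(overall(profile))
```

The average comes out at the origin, which is exactly how "broad society" ends up calling this person a failure while completely missing the basket-weaving talent.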
Ok, what about those who are truly doomed, negative in every aspect in every subject, learning slowly no matter what you do?
Well, one aspect of streaming is that nobody holds anyone else up, so such people aren't inhibited further by even slower people. In turn, they slow nobody up - a significant cause of classroom disruption that hurts those who are struggling even further. So these people certainly exist, but should fare a lot better.
With customization of style as well as speed, those "not getting it" because the presentation is wrong for them, rather than through any lack of ability, should be running much closer to their natural speed. You have to introduce a third dimension to the streaming to tune the style better, but a mere 3 styles takes us from 15 streams to 45, and the total cost jumps from 2 billion to 6 billion. You might be able to do this - the law of diminishing returns won't kick in until classes (or buckets, if you like queueing theory) cannot be kept full, OR your research division (the actual end product as far as economists are concerned) has saturated the market and there just isn't any way to absorb the extra products.
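The arithmetic in that paragraph, under the assumption (mine, not stated outright in the post) that cost scales linearly with the number of streams, is just:

```python
base_streams = 15           # speed/creativity streams from earlier in the thread
styles = 3                  # presentation styles added as a third dimension
base_cost = 2_000_000_000   # the 2 billion figure quoted in the post

streams = base_streams * styles   # 45 streams
cost = base_cost * styles         # 6 billion, assuming cost tracks stream count

print(streams, cost)
```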
Ok, but even when you have siphoned off all those doing badly for extraneous reasons, and got them where they can progress naturally, there will still be people who do badly. Some may even become politicians. The system I have outlined allows any one of these people to change gears. (In fact, it allows anyone to. Shifting down can help avoid burnout, shifting across can help avoid getting into a rut, shifting up can really stretch the mind.) So those who circumvent neural challenges (I'm one, so I know it can be done) can experiment. They can test any or all adjacent streams, without risk. They should be encouraged to, the differences in perspective may help solidify the person's methodology and lead to ever-increasing confidence and ability beyond the genetic baseline. Again, this is true of everyone.
Those who cannot beat the limitations should not, as often happens, be dumped by the system. Those who are slow should be allowed to continue schooling even to degree level or beyond, just at a pace they can manage. Those who have difficulties that mean they cannot learn certain skills at all, ever, at any pace, should nonetheless be encouraged to master what they can, as far as they can. I am stuck for ideas on how to help them further; suggestions are welcome. But what I am proposing is a definite improvement on what they have. Nobody gives up on them, as happens so often today, and because it's ok to suddenly "get it" at any time, nobody feels like they are necessarily marked for failure.
Added to which, you don't need to be a genius to be a lab assistant, and lab assistants are just as entitled to flashes of insight and inspiration. Such insights may lead to even better solutions for struggling minds. After all, rote memorization has little to do with ability. Indeed, very little memorization is needed. Those with poor memories but brilliant idea engines need a way to offload the part they will never be able to do, to nurture and turbocharge the things they are great at.
Ok, this pushes us into computer augmented learning. This isn't a new axis, since these are prosthetic aids that let you take advantage of your strengths. These aids should never be such that natural ability ever decays through under-use, but they should supplement those abilities so that people aren't disadvantaged by the irrelevant. Anyone can - and should - look up facts, because human memory isn't reliable. Memory is useful only as a temporary work buffer to learn skills. So if a computer provided that temporary work buffer, index and, indeed, knowledgebase, it does not reflect one whit on your skills or ability to utilize them.
There is currently no way to plug a memory expansion pack into the brain. It should be possible, though. Once it is, that kind of neurological disadvantage can be eliminated. Would it provide an unfair advantage? No. Because you can train a decision tree with facts, and train a rat brain to respond to stimuli to operate controls, a rat wired to an expert system could pass a traditional exam even though it could do nothing else. (It does prove that we are in a rat race, though.) Ergo, you need an exam that tests current understanding and usage of that understanding.
What is the purpose of an exam? In most schools, it is a barrier to the next level, where parrots and cyber-rats have all the best seats reserved. Here, the purpose is to see if a student is matched up correctly to a stream. There are no quantum leaps, no years in the classic sense, but an approximation to a continuum, enough gear ratios that you can slide smoothly around with minimal repetition and minimal synchronization issues.
Since there are no years, exams cannot be at the end of them. I picture exams as being an educational version of dynamic probes - if a bug in the process is detected, you test to see if the bug can be patched (extra help) or if there is a mismatch between student and environment (ie: stream). The student can then be matched better, re-learning only the bits not quite grasped.
The "final" exam, when the student transfers out of the system (regardless of when) would be intended to translate the level of ability in each subject into terms that can be understood outside. Thus, a person with the knowledge, proficiency, experience and demonstrated skills equal to a doctorate would have a doctorate. Time spent in the system would not matter, nor would a formal designation of being in a doctoral program. The exam would not be "for" something, in the ordinary sense, it would merely establish a level of competence, where the label is decided by what that level is.
Open source is free. Saying anything else is crazy FUD talk. Opportunity costs apply to everything you do or use. Only a good-faith examination of all technologies' strengths and weaknesses will allow you to determine the right solution.
ESR was only looking at the negative side of Linux back in the day. How many people spent time learning Linux only to have it lead to a promising career? Far from costing anything for these people, the time spent setting up Linux was money *earned*.
Or it was a terrible misquote of him in the slashdot summary.
His real quote was:
“The current system for Universal Credit is a conventional system being developed on a waterfall approach. When you look at digital [the enhanced system], it’s very different – it relies not on large amounts of tin, black boxes, but uses open source and mechanisms on the web to store and access data,” Shiplee told MPs.
When asked why he didn’t adopt this approach two and a half years ago at the start of the project, Shiplee said: “Technology is moving very rapidly, such things weren’t available as they are today.”
So he might not have meant that open source wasn't available, but that the "mechanisms on the web to store and access data" weren't *as* available as they are today. Without knowing what technologies he's using, he could be right. They might not have existed, or might not have been as mature as they are now.
This is clearly another case of too many mad scientists, and not enough hunchbacks.