This claim is absurd.
No, for any given statement, either it or its negation is true.
Were that not the case, you still would have no way to determine whether an objective basis exists, yet unidentified.
It is not the case that all facts can be quantitatively measured.
It is not the case that all opinions are merely qualitative.
You are confusing your definitions with metaphysical truisms. But, welcome to epistemology, and why Logical Positivism is insufficient, according to either of us!
Wrong. Science requires faith in quite a few unprovable axioms, right at its core. Identity, that things are what they are, and are so consistently, being one.
No science proceeds without starting from hypotheses, and at that point the plausibility of those hypotheses is supported only by the equivalent of faith. To head off a common misrepresentation: "faith" does not mean "belief without evidence". That is an intentionally false account of what theists mean by the word, made by atheists to fit a pre-built argument. "Confidence in the face of incomplete information" is an accurate rendering of what theists actually mean.
That's an opinion. The theist makes a factual claim.
Either "rock is good" or "rock is not good" is a factual claim. One or the other is true, neither is provable.
"The theist" can certainly hold the position that his belief is a matter of opinion rather than fact. In reality, that is the predominant stance.
The former says "since this is what science can investigate, we will proceed with that stipulation". The latter is the stance that only things science can act upon (e.g. physical matter) exist.
It's an important distinction, and well worth reading up on in terms of the nuances--googling either term will yield plenty of material.
A few points here.
Occam was a theist. As presumably the best-placed person to apply his own Razor correctly, he reached theism as his conclusion.
Occam's Razor says absolutely nothing about likelihood.
Occam's Razor states that the simplest model should be used, -all else being equal-, for the purposes of conceptual economy. That is the only correct inference to what "winning" per Occam's Razor means.
Any difference in evidence whatsoever invalidates the use of Occam's Razor in selecting between two or more models. I assume you consider your model to have greater evidence.
Just to note, it is your position that is the "extraordinary" one: theism is held by a significant majority, which makes it the "ordinary" position.
"Extraordinary" does not mean "things I personally find really improbable".
Nice post. Reasoned, articulate. Precisely the reason you were downvoted by atheists who know this in their very own minds as they downvote you. Welcome to Slashdot.
My logic is generally very good. I presume, for this case, you need no evidence for your claim otherwise.
And no, I do not "need" to do anything. You have no significance to anything, nor any possible theoretical significance. You are a Random Guy on the Internet, and your worldview allows for nothing more.
A few questions you should ask yourself, though: which genre of music is provably the best? Which political stance is provably best? What things -cannot- happen (things one might consider "magic" or "miraculous") given the permutations of possibility allowed by quantum behavior? You have an extremely weak understanding of even the basics of epistemology, or of physics for that matter.
No, this is completely false.
Peer-reviewed, and its evidence supports a rather restricted subset of religions: those whose predictions about the after-death experience match what is reported. Strong relative differences in success at prediction further narrow down which are most plausible.
You will claim this is "not evidence". It will remain evidence after you do so.
As for airbrushing, there was an interesting story run by the BBC a month or two back. Apparently Queen Elizabeth I signed into law the right for theatrical companies to kidnap children off the streets without limit or redress in law. It was apparently boasted that not even the children of nobility were safe.
This isn't the sort of thing that gets a lot of mention in productions set in that period.
Norse-style grave goods were also still deposited at the time, apparently some Londoners considered themselves the true descendants. Probably a bit like modern Wiccans, except with big, sharp pointy swords and large quantities of booze.
Other cults doubtless existed before, then and after.
As for medicines, the oldest actual medical book (not a reprint or modernized version) is from 1772. It isn't old enough to get a feel for really old cures, but it is certainly old enough to get a feel for what people were trying. Hemlock (as a drink and a poultice) is well-known, as was the use of mercury. Tea made from ground ivy doesn't sound good and the treatment that calls for the patient to down half a dram of antimony probably wasn't one that encouraged people to call on the doctor very often.
Fireworks listed in a book from 1776 could not be used today - the compounds and gases produced are too carcinogenic. Companies hiring pyrotechnic experts back then presumably didn't have to worry about sick leave or pensions.
Head lice, tropical diseases brought in by sailors, personal zoos of deadly animals, cholera epidemics, the occasional plague, religious extremists... No, wait, that's modern life, isn't it? When was the Children's Crusade, anyway?
I'm working on it. Seriously. They dug up an Iron Age settlement not far from where I used to live. My father was one of the scientists on the team, providing magnetometry, ground penetrating radar and mass spectrometry. One of his colleagues from the university provided geological analysis and a colleague from another university handled conservation of things like amber artefacts.
I have all that data (not all of which was released to the public) plus the archaeological reports and a decade of photographs of the site from over a dozen people, plus photographs of very nearby contemporary monuments.
My plan, from the very start, has been to turn this into something educational. My thought has been to construct a virtual reality, much like Virtual Rome (dunno if that is still being run), so that people could see a reasonable reconstruction of what the site looked like - not just in one era, but in each of the eras for which sufficient data exists.
Admittedly, this has turned out to be a very difficult project - well beyond my artistic skills and a very tough challenge for the virtual reality software that has been open sourced. Any help - any at all - would be gratefully received. I tried to get kickstarter funding, to hire the necessary talent, but kickstarter rejected the project outright as too freakish.
If that project fails, I am looking to see what other educational uses I can put the data to.
The first thing one should focus on when learning to reason is logical fallacies, and the False Dichotomy - for example, "Reason versus religion" - is right up toward the top.
What Dawkins et al are selling isn't reason, it's Logical Positivism, which has rather thoroughly run aground as of about 30 years ago. Not all questions are resolvable by empiricism and scientific method. Epistemology is far wider than that. Is rock music good? Prove it.
I'll get into the Reification Fallacy, that "not-X" is not something, it is nothing, regardless of what "X" is--including theism--another day.
I would concur with that. What I have proposed is at the upper limits* of what can be achieved by a single entity running a single system. If you need finer granularity, more dimensions or longer timelines to give everyone a fair chance in life, no single entity (corporate or government) could do it.
*It may actually be beyond them. Not financially, but organizationally: no entity has shown the capacity to predict the optimum path for each student individually, track it, and correct it at a moment's notice. Add to that a travelling-salesman-style heuristic over that many people, remembering that exactly ten people with identical requirements in a subject can enter a location simultaneously, that people with different needs should never enter the same location simultaneously at all, and all the other constraints...
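The bucket-filling part of that constraint problem can at least be sketched. This is a toy illustration under my own naming (the `schedule` function and the student tuples are mine, not from any real system): group students by subject and requirement level, then split each homogeneous group into sessions of at most ten, so no session ever mixes different needs.

```python
from collections import defaultdict

def schedule(students, capacity=10):
    """students: list of (name, subject, need_level) tuples.

    Returns sessions where every student shares the same subject and
    need level, and no session exceeds `capacity` students.
    """
    groups = defaultdict(list)
    for name, subject, need in students:
        groups[(subject, need)].append(name)

    sessions = []
    for (subject, need), names in groups.items():
        # Split each homogeneous group into sessions of at most `capacity`.
        for i in range(0, len(names), capacity):
            sessions.append({
                "subject": subject,
                "need": need,
                "students": names[i:i + capacity],
            })
    return sessions

# 25 hypothetical students in one subject, alternating between two need levels.
demo = [(f"s{i}", "maths", i % 2) for i in range(25)]
```

Of course, this ignores timetabling, locations and every other real constraint - it only shows why homogeneous grouping makes the capacity rule trivial to enforce, while the full multi-constraint routing is where the travelling-salesman-scale difficulty lives.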
It is possible that the problem is too big, that it must be distributed somehow. The internet is a powerful tool for that, but it has to be used correctly.
I have worked with computer aided learning, in the sense of designing it and experimenting with its limits, back in the 90s. It was grossly underutilized, people looked at it only as a book with clickable images and audio. Internet whiteboards, collaborative tools, shared documents - all existed back then. So did multi-way videoconferencing, telerobotics and all kinds of other nifty teaching aids. Almost none were used then.
Today, some of these are used, but the technology has not stood still. Not just data but entire applications can be pushed from machine to machine. Sensors can track hand motions, allowing instructors in music, sculpture, painting or, indeed, archaeology to know precisely what is or is not happening, instant by instant. Simulators can compare expected results with the actual, long before anything is finished. In science, DIY spectrometers can tell chemistry lecturers everything they need to know.
There will be ideas I haven't even stumbled upon. My knowledge is broad, but technology is broader by far.
But these aren't being used. Computer Aided Learning remains 20 years behind the curve at best, 40 years behind at worst. (80, if you include YouTube videos of lectures; the Open University was providing that sort of material a long time ago.) If you want a revolution at the level of individuals running the show, that is where to start. You need between a quarter and half a century of development to be factored in. That is a lot, and inertia is high. If MIT can't be more original than a video camera, an achievement sci-fi conventions could boast of even in the 80s, the people with the knowledge will not adapt to new methods by choice.
Thank you for the link. It looks intriguing.
Basically, my assumption has been that you can treat education as being a problem in multi-variable space, that it cannot be reduced until all the variables have been identified together with their relationships and interactions, but that once reduced to the simplest elements, those elements should naturally form a very simple pattern or weave. If my reduction is inaccurate, the weave I have produced will be flawed. Threads will tangle, patterns will become disjoint or incoherent. The same is true if any of those three core assumptions are wrong.
This book you pointed me to, along with any others I find, will definitely give me different perspectives. If my three assumptions are correct, all perspectives should reduce to exactly the same atomic components. If they do not, the idea of atomic components must be wrong.
It is also the case that a different perspective might lead me to reject utterly my entire line of reasoning (won't be the first time, won't be the last) and adopt an entirely new outlook. That can be a very good thing. Never be afraid to learn. I have an interesting mind but far better ones exist for this sort of work. It would be foolish to ignore the ideas of others.
And if it kinda goes along with my thinking? My ideas evolve constantly, even during a post. The very worst that can happen is that I'm inspired and correct mistakes in my thinking. As tragedies go, that seems acceptable.
I look forward to the book, and hope I find many more.
I absolutely agree. My idea of having a symmetrical arrangement for speed and creativity is that there will be brilliantly creative people who need a lot of time, and amazingly fast people who have the creativity of a lettuce leaf.
In terms of +/-, because this is 2D, I would describe these as -3 + i and +3 - i.
Now, because everything is done per subject, you can be -3 -i in absolutely everything but basket weaving, where you might be +3 +i. Would you be a success or failure? Broad society would probably say failure, but I can absolutely guarantee you would have a very successful, highly profitable business and international acclaim in the art world. That sounds like a more interesting measure.
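The 2D scoring above maps naturally onto complex numbers, with speed on the real axis and creativity on the imaginary axis. A small illustration (the subjects, scores and the distance-based summary are my own hypothetical choices, just one way to rank such scores):

```python
# Speed on the real axis, creativity on the imaginary axis,
# matching the -3 + i / +3 - i notation above.
scores = {
    "mathematics":    complex(-3, -1),  # slow and uninventive here
    "history":        complex(-3, -1),
    "basket_weaving": complex(3, 1),    # fast and creative here
}

def best_subject(scores):
    # One possible single-number summary: distance from the worst
    # corner (-3 - i). Bigger means stronger overall in that subject.
    worst = complex(-3, -1)
    return max(scores, key=lambda s: abs(scores[s] - worst))

print(best_subject(scores))  # basket_weaving
```

The point of keeping the two axes separate until the last moment is exactly the one made above: a person can sit at the bottom of the scale in everything except one subject and still be the clear standout there.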
Ok, what about those who are truly doomed, negative in every aspect in every subject, learning slowly no matter what you do?
Well, one aspect of streaming is that nobody holds anyone else up, so such people aren't inhibited further by even slower people. In turn, they slow nobody else down - removing a significant cause of classroom disruption, one that hurts those who are struggling even more. So these people certainly exist, but should fare a lot better.
With customization of style as well as speed, those "not getting it" because the presentation is wrong for them, rather than through any lack of ability, should be running much closer to their natural speed. You have to introduce a third dimension to the streaming to tune the style, but a mere 3 styles takes us from 15 streams to 45, and the total cost jumps from 2 billion to 6 billion. You might be able to do this - the law of diminishing returns won't kick in until classes (or buckets, if you like queueing theory) cannot be kept full, OR your research division (the actual end product as far as economists are concerned) has saturated the market and there just isn't any way to absorb the extra products.
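The scaling here is simply multiplicative: each added style dimension multiplies the stream count, and (assuming cost scales linearly with streams, which is my assumption, not a given) the cost with it.

```python
# Checking the figures above: 3 styles multiply both the stream count
# and the (assumed linear) cost by 3.
base_streams = 15
base_cost = 2_000_000_000  # the "2 billion" figure from the text
styles = 3

streams = base_streams * styles   # 45
cost = base_cost * styles         # 6 billion
```

Any further dimension (a fourth style, a second presentation axis) compounds the same way, which is why the "keep the buckets full" condition becomes the binding constraint so quickly.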
Ok, but even when you have siphoned off all those doing badly for extraneous reasons, and got them where they can progress naturally, there will still be people who do badly. Some may even become politicians. The system I have outlined allows any one of these people to change gears. (In fact, it allows anyone to. Shifting down can help avoid burnout, shifting across can help avoid getting into a rut, shifting up can really stretch the mind.) So those who circumvent neural challenges (I'm one, so I know it can be done) can experiment. They can test any or all adjacent streams, without risk. They should be encouraged to, the differences in perspective may help solidify the person's methodology and lead to ever-increasing confidence and ability beyond the genetic baseline. Again, this is true of everyone.
Those who cannot beat the limitations should not, as often happens, be dumped by the system. Those who are slow should be allowed to continue schooling even to degree level or beyond, just at a pace they can manage. Those who have difficulties that mean they cannot learn certain skills at all, ever, at any pace, should nonetheless be encouraged to master what they can, as far as they can. I am stuck for ideas on how to help them further; suggestions are welcome. But what I am proposing is a definite improvement on what they have. Nobody gives up on them, as happens so often today, and because it's ok to suddenly "get it" at any time, nobody feels like they are necessarily marked for failure.
Added to which, you don't need to be a genius to be a lab assistant, and lab assistants are just as entitled to flashes of insight and inspiration. Such insights may lead to even better solutions for struggling minds. After all, rote memorization has little to do with ability. Indeed, very little memorization is needed. Those with poor memories but brilliant idea engines need a way to offload the part they will never be able to do, to nurture and turbocharge the things they are great at.
Ok, this pushes us into computer augmented learning. This isn't a new axis, since these are prosthetic aids that let you take advantage of your strengths. These aids should never be such that natural ability ever decays through under-use, but they should supplement those abilities so that people aren't disadvantaged by the irrelevant. Anyone can - and should - look up facts, because human memory isn't reliable. Memory is useful only as a temporary work buffer to learn skills. So if a computer provided that temporary work buffer, index and, indeed, knowledgebase, it does not reflect one whit on your skills or ability to utilize them.
There is currently no way to plug a memory expansion pack into the brain. It should be possible, though. Once it is, that kind of neurological disadvantage can be eliminated. Would it provide an unfair advantage? No. Because you can train a decision tree with facts, and train a rat brain to respond to stimuli and operate controls, a rat wired to an expert system could pass a traditional exam even though it could do nothing else. (It does prove that we are in a rat race, though.) Ergo, you need an exam that tests current understanding and usage of that understanding.
What is the purpose of an exam? In most schools, it is a barrier to the next level, where parrots and cyber-rats have all the best seats reserved. Here, the purpose is to see if a student is matched up correctly to a stream. There are no quantum leaps, no years in the classic sense, but an approximation to a continuum, enough gear ratios that you can slide smoothly around with minimal repetition and minimal synchronization issues.
Since there are no years, exams cannot be at the end of them. I picture exams as being an educational version of dynamic probes - if a bug in the process is detected, you test to see if the bug can be patched (extra help) or if there is a mismatch between student and environment (ie: stream). The student can then be matched better, re-learning only the bits not quite grasped.
The "final" exam, when the student transfers out of the system (regardless of when) would be intended to translate the level of ability in each subject into terms that can be understood outside. Thus, a person with the knowledge, proficiency, experience and demonstrated skills equal to a doctorate would have a doctorate. Time spent in the system would not matter, nor would a formal designation of being in a doctoral program. The exam would not be "for" something, in the ordinary sense, it would merely establish a level of competence, where the label is decided by what that level is.