
Comment Scholar of open access estimates up to 96% savings (Score 2) 63

If more of scholarship turned toward open access, libraries could shift money from paying for subscriptions to supporting journals or journal mirrors. They'd likely save considerable cash doing so.

Heather Morrison, a colleague of mine, researched this. She estimated savings as high as 96%. The details are in her dissertation, Freedom for scholarship in the internet age - which is, of course, open access. The cost estimates are on page 86 (the 98th page of the PDF).
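
To get a feel for what that figure means, here is a toy calculation. The budget number is invented purely for illustration; only the 96% rate comes from Morrison's estimate.

    # Toy illustration of Morrison's upper-bound estimate: the library
    # budget below is hypothetical; only the 96% savings rate is hers.
    subscription_budget = 1_000_000      # hypothetical annual spend, USD
    savings_rate = 0.96                  # Morrison's upper-bound estimate

    remaining = subscription_budget * (1 - savings_rate)
    print(f"annual spend under open access: ${remaining:,.0f}")  # $40,000

At that rate, a million-dollar subscription budget shrinks to $40,000, leaving the rest free to support open journals directly.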

Comment Unnecessary binary forks the tool base (Score 1) 566

I agree that it's a stupid idea, but there is almost nothing about text that makes it special. If the HTTP/2.0 standard is actually a standard, then it will be pretty easy to make an app or a plugin that translates it.

There's almost nothing about English that makes it special, but I wouldn't recommend documenting the spec in Klingon. (To be fair, you agree that binary is a bad idea.) Or hey, we could use APL for the reference implementation!

Intrinsically, you are correct: there's nothing much special about text. But the significance of text isn't intrinsic: it is extrinsic. Text is special because it's standard. We have a rich selection and history of tools, techniques, idioms and sober experience to draw on when it comes to dealing with text. With a new binary format we would have none of that. Your imagined translator has a lot of work to do before it can match up to the existing capabilities of text-oriented tools, let alone exceed them. Unnecessary binary formats effectively fork the development infrastructure.
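
To make this concrete, here is a minimal Python sketch. The HTTP/1.1 request is plain ASCII you can read, grep, or type into telnet; the binary frame below it uses a made-up length-prefixed layout (not the real HTTP/2.0 framing, which I'm not attempting here) and is opaque without a dedicated parser.

    import socket
    import struct

    # Text protocol: a complete HTTP/1.1 request is just lines of ASCII.
    request = b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n"
    with socket.create_connection(("example.com", 80)) as s:
        s.sendall(request)
        reply = s.recv(4096)
    print(reply.decode("latin-1").splitlines()[0])  # e.g. "HTTP/1.1 200 OK"

    # Binary protocol: the same request in an invented length-prefixed frame.
    # Type byte, two-byte length, payload -- unreadable without the spec.
    frame = struct.pack("!BH", 0x01, len(b"/")) + b"/"
    print(frame)  # b'\x01\x00\x01/' -- opaque to grep, sed, and telnet

Every generic text tool works on the first form out of the box; the second needs new tooling before a human can even see what is happening.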

If you ask me, it makes about as much sense as replacing the Roman alphabet with Chinese ideograms for English. Chinese characters are far more information-dense: you can fit more on a page and you can read it faster. In many respects the system is more efficient (it has even proven effective as a shared written form for diverse spoken dialects). That doesn't make it a good idea, regardless of the availability of translators.

We have plenty of historical examples of text protocols providing advantages over their binary equivalents. Do we have any good examples of the opposite: in which a binary protocol replaced a text protocol and proved superior by virtue of being binary?

Comment Some of her words and his (Score 5, Informative) 666

From her blog (her post is long and detailed):

I don’t want to write this. I don’t want to get caught up in anything to do with this women in infosec bit. Everyone who does get lambasted so badly at this point I’d rather avoid it entirely. You can’t say anything about sexism without getting lumped in with the creeper cards or the talk canceling at Bsides SF. . . . I’m bogged down in book edits. I’m teaching a lot of new classes this summer and fall. Needless to say, I don’t have time to process this much less write about it. Plus I’ve gotten enough pushback already. People I thought were my friends and colleagues have said things to me about this that have cut deeper than the actual assault ever could. I don’t want to deal with more of that. I don’t want to see the comments for this post. But I feel like I have to do this. I weighed my options. If I shut up and do nothing and later hear he did this to someone else, I will feel personally responsible. I have to do everything I can to make sure another speaker or attendee doesn’t get worse than I got.

This wasn’t like any of those grey areas that make anybody question the validity of any rape claim. . . . . We talk for a little bit about nothing consequential. Guy jumps on me and pins me down. . . . Perhaps I was not making myself clear, “No!” “Stop!” “I don’t want to do this!” . . . Once he had my pants down and his pants down and was completely ignoring my shouting for him to stop, it suddenly became clear to me what was about to go down. He was holding my arms down of course, so I leaned up and bit him on the arm as hard as I could, at which point he started swearing and punched me in the face. . . . I managed to lunge up towards the table and grab hold of a coffee cup. I knew I only had one shot. So I hit him with everything I had, and I got him right in the temple. And guess what, he let me go.

This is the last thing I have to say about all this. My duty is done. I don’t want to be the poster girl for infosec feminism. I want to be a researcher, and a trainer, and a speaker, and an icon.

From his blog (he wrote very little):

It was brought to my attention a recent flood of Twitter messages containing a number of accusations (ranging from "horrible", to "very horrible") against my person. The accusations were originated by someone who happened to be a speaker at the same Conference . . . and, for reasons that I didn't and don't understand, has been repeating blatant lies, every time magnifying it a bit more -- which nobody in their right mind could believe. . . . think about events that happened in the last decade based on "assumptions", or the kind of anti-humanitarian scenarios this world has experienced simply because some mentally-disordered person came up with a blatant lie that everyone followed with questioning. I will personally not contribute to the existing drama, since it someone else's game to get attention at any price.

What disturbs me here is the knee-jerk suggestion that she invented the story for some unspecified reason. Statistically, only a very small number of rape accusations turn out to be fabricated. Of course I don't know for sure what happened. I've never even heard of these people before. But based on the little evidence I have seen, I know who I believe.

Submission + - The MOOCs Continue, This Time in SciFi/Fantasy Writing.... 3

An anonymous reader writes: Inexplicably, the MOOC era shows no signs of abating. Two MOOCs in Science Fiction and Fantasy begin June 3. The first, from well-known MOOC provider Coursera, will be taught by University of Michigan professor Eric Rabkin and will focus on a historical and psychological analysis of the genre, while the second will come from the university creative writing class of NYT bestselling author Brandon Sanderson, best known for completing the Wheel of Time series. If this trend keeps up, maybe we can cross our fingers for a MOOC on screenwriting from Joss Whedon soon...

Comment I went the other way, OS X -> Linux (Score 5, Interesting) 815

I went the other way about two and a half years ago. I'm sure someone will tell me I was doing it wrong; I wouldn't be surprised if they're right. But I found the FOSS package managers for OS X incredibly painful to work with. I remember it taking at least a day of mucking around with compiling and pre-built binaries just to get the tools I needed for web development. It took me ten minutes to get the same thing working in Ubuntu.

Still, there were plenty of headaches: sleep mode, hybrid graphics, and the synaptics touchpad driver. Even though I had been avoiding dependence on proprietary software since product activation chased me away from Windows, I had to give up really useful Mac tools like Scrivener, Tinderbox and ScreenFlow (I still boot the Mac when I need to do a screencast). I used to be a programmer; now I'm a social scientist. These days I do mostly reading and writing, not programming, so the loss of Scrivener was a hard blow. I smoothed the way by writing my own tool.

OS X was significantly better for all but the most ordinary end-user applications. My area of research is the online commons - copyright, FOSS, Creative Commons - stuff like that. I could make my peace with Apple when they were only a pipsqueak tyrant. When they released the iPad and it was locked down, I simply couldn't stomach it anymore: I was tying myself to an ecosystem that could be progressively enclosed by Apple. A friend of mine - a social scientist, not a programmer - switched to Mint, proving it was finally doable. Also, XMonad is pretty cool, and my search for a decent editor finally led me to take vim seriously.

Linux isn't perfect, but it's come a long way since I first used it for development in 1993. It really is usable - and sometimes excellent - for everyday work. Using a platform is supporting that platform. I wouldn't tell anyone else what to do, but I'm content to use this one.

Comment Space != place, e.g. mathematical spaces (Score 3, Insightful) 292

Lind treats countries and legal jurisdictions as "real," but says "there is no such place as cyberspace." It's a specious argument. A space is not the same as a place. I don't think mathematicians confuse the spaces they talk about with places. The sociologist Manuel Castells, in his influential book series The Information Age, contrasted the "space of flows" - networked finance, data and communication - with the "space of places" - the physical locations where people live.

To be fair, Lind seems to be arguing that cyberspace is not like a physical territory, and that the metaphor of cyberspace, by implying that it is, supports misleading conclusions. This is a reasonable argument. Metaphors are useful for description, but they are not predictive - though many people, journalists among them, take them too far. Lind is right that governments already have jurisdiction over people acting on the Internet, though as other Slashdot commenters point out, the Internet has raised numerous jurisdictional questions. However, we could not name or understand anything new if we did not compare it with something already known. I don't think the imperfection of a metaphor is sufficient grounds for discarding it.

But the motivation of the argument, it seems to me, is political. He writes: "it makes no sense to say that California and the U.S. are extending their jurisdiction 'into' cyberspace . . . The idea that corporations are 'invading' a mythical Oz-like kingdom called cyberspace is . . . dopey." I don't know about that. Scholars often use the language of colonization in cases like this. We could talk about government "invasion" into private or family space. Canada's then-justice minister Pierre Trudeau used a spatial metaphor when he said in 1967, upon the introduction of a bill to decriminalize homosexuality, "there's no place for the state in the bedrooms of the nation."

. . . try substituting “fax” or “telephone” or “telegraph” for “cyber” in words and sentences. The results will be comical. “Activists denounced government criminal surveillance policies for colonizing Fax Space.” “Should Telephone Space be commercialized?” . . . the point is not that telecommunications should not be structured and governed in the public interest, but rather that the debate about the public interest is not well served by the Land of Oz metaphor.

He takes it for granted that these comparisons are reasonable. I don't think they are. I don't hear anyone talking about a "fax" community or a "telephone" community. But people do talk about an "Internet community," an "Internet generation" (more questionable in my mind) - even of belonging to an online "tribe". The Aaron Swartz memorial site is full of such statements. Note also the big-I: the Internet is a proper noun (even he spells it that way).

Benedict Anderson wrote Imagined Communities about the formation of new nations, such as Indonesia, following decolonization. He explains how arbitrary colonial lines drawn on a map gave rise to real feelings of national identity among people who did not know each other, but who sensed that they had shared histories and experiences. I think the Internet has produced some similar feelings. Lind's argument isn't entirely clear, but it seems to me this is what he is really arguing against. That is a judgment of what should be; disguising it as an objective description of the Internet is problematic.

Or hey, just read China Miéville's The City and the City.

Comment That is not the only legitimate approach (Score 2) 175

I have argued before that this is only one kind of civil disobedience. The context of MLK's quote and actions is important: he is laying out a strategy and criticizing the actions of his opponents.

In no sense do I advocate evading or defying the law, as would the rabid segregationist. That would lead to anarchy. One who breaks an unjust law must do so openly, lovingly, and with a willingness to accept the penalty. I submit that an individual who breaks a law that conscience tells him is unjust, and who willingly accepts the penalty of imprisonment in order to arouse the conscience of the community over its injustice, is in reality expressing the highest respect for law.

He's not kidding about anarchy. The torture and lynchings carried out by rabid segregationists were truly barbaric. The Civil Rights movement depended on the defense of law. While protesters fought local and state laws, they appealed to friendly rulings from the Supreme Court. Their aim was to draw in the federal government to affirm the existing legal rights of blacks. The quote you have chosen, and indeed the letter as a whole, is an effort to walk a delicate line, defending the right to civil disobedience while reaffirming respect for the laws that the movement depended on for success.

You can't simply take this quote out of context and treat it as a universal claim about all law-breaking. In the same letter he gives the example of the Boston Tea Party: but the people who participated in that event disguised themselves to avoid being caught. Then there's this:

We should never forget that everything Adolf Hitler did in Germany was "legal" and everything the Hungarian freedom fighters did in Hungary was "illegal." It was "illegal" to aid and comfort a Jew in Hitler's Germany.

The freedom fighters in Hungary in 1956 actually fought. People died. Nor do I think for a moment that MLK would say all resistance to Nazi laws must be open, loving and done with a willingness to accept the consequences. More relevant to Civil Rights, of course, was the previous history of slavery. No-one on the Underground Railroad broke the law openly. When escaping slaves got cold feet, Harriet Tubman would force them to continue at gunpoint lest they reveal the identities of others. Not only was such lawbreaking justified: I would suggest that inaction in the face of such great injustice was wrong.

The matter of civil disobedience cannot be resolved without considering context. Is the tactic effective? Is it likely to produce bad outcomes (e.g. anarchy)? Is the law just in its intent, its consequences, and its application? Is it politically legitimate? Are there better alternatives for opposing it? MLK chose what he believed was the most lawful way to achieve a just end. Looking at the state of copyright law and politics in the U.S. and internationally (Swartz's manifesto explicitly discusses access to knowledge and the developing world) and the outcomes (his actions were hardly likely to provoke anarchy), I think Swartz may have done likewise.

I have written more previously, which I won't repeat here.

Comment This is about defining/defending "the profession" (Score 3, Insightful) 248

This isn't really about Al-Khabaz. It's about policing the boundaries of the profession. The problem - the reason there is a culture clash - is that despite more than 40 years of attempts, no-one has succeeded in transforming computer programming into a profession. To be more precise, whether programming has professionalized - or should - remains a serious question for debate.

Look at the quotes from Simonelis, Dawson, and the ACM:

behavior that is unacceptable in a computing professional (Simonelis)

no longer suited for the profession (Dawson)

The Code and its supplemented Guidelines are intended to serve as a basis for ethical decision making in the conduct of professional work. Secondarily, they may serve as a basis for judging the merit of a formal complaint pertaining to violation of professional ethical standards. (ACM code of ethics)

If programming were a profession like medicine or law or engineering, programmers would acquire higher status, as would organizations like the ACM. From the point of view of managers, programmers are often seen as unmanageable craftspeople with little respect for the standard practices of business. For them, professionalization is about controlling and assessing programmers and their work. The rise of computer science, the creation of software engineering, and the founding of the ACM were all driven in large part by efforts to professionalize the field: sometimes more in the interests of programmers, sometimes more in the interests of management.

This comes up again and again on Slashdot. Should there be a standard curriculum or test or other criteria that all programmers should meet? Should we have to belong to professional associations? Should programmers be obliged to follow codes or take legal responsibility for flaws in software? How much should formal education and credentials be valued? Should self-taught programmers be excluded?

These are contentious issues. Clearly Dawson College and Mr Simonelis have an interest in defining and policing the boundaries of the profession; doing so would enhance their status. But as nearly half a century of debate and the ongoing discussion here demonstrate, there is no professional consensus for them to uphold. This is a real cultural divide. Al-Khabaz got caught in the middle, used by Dawson in its efforts to define the profession and its own status. I think that's terribly unfortunate.

For an excellent book on the history of programming and efforts to professionalize it, see The Computer Boys Take Over by Nathan Ensmenger. He argues that programmers are more like technicians than professionals. Like other technicians, their work is often threatening to the organizations that depend on them. And despite the best attempts of computer science and software engineering, much of it is guided more by craft principles than by rigorous scientific or engineering methods.

Comment Mod parent up: no need to accept punishment (Score 1) 243

Many people argue that if you break the law on principle and don't accept the punishment you're doing it wrong. This is incorrect. There are, as you so nicely illustrate, other approaches. Indeed, the reason for acquiescing to punishment is precisely to highlight the law's abuse! Arguing that protest is unethical if it does not accept punishment is a neat trick. In effect, it is often little different from arguing that the law is right because it is the law.

The problem is that the American civil rights movement has been taken as the standard for protest. But it was an unusual case. The protesters knew that they were in fact acting in accordance with their legal rights, and could appeal to the federal government for support. This is hardly a universal illustration of how to defy the law.

The strategy of the civil rights movement began with a legal agenda pursued by the National Association for the Advancement of Colored People (NAACP), resulting in a number of Supreme Court decisions in the 1940s and 1950s affirming the civil rights of African Americans. Activists then attempted to nonviolently assert those rights, knowing that segregationists would respond with violence. The ensuing crisis would compel the federal government to enforce rights upheld by the courts.

The other standard for civil disobedience is Gandhi. But like the civil rights movement, he used it because it was an effective tactic:

. . . where there is only a choice between cowardice and violence, I would advise violence. . . . I would rather have India resort to arms in order to defend her honour than that she should, in a cowardly manner, become or remain a helpless witness to her own dishonor. But I believe that nonviolence is infinitely superior to violence, forgiveness is more manly than punishment.

The Underground Railroad is an example that makes clear that the ideal of submitting to punishment can be antithetical to principled, legitimate defiance of the law. On the Volokh Conspiracy, a commenter named Mark Nelson makes the point:

I'm rather confused by the widespread misconception (repeated here) that civil disobedience is primarily about being willing to serve jail time. That's one possible tactic, to draw attention to a cause and the injustice of a law by being arrested amidst much publicity. But it's not the only or historically the main tactic. Another major reason for civil disobedience is to render a law unenforceable by flouting it. That may (depending on the person/situation) be intended either to eventually get the law changed by demonstrating to the public that it's manifestly unenforceable, or simply to directly circumvent it, effectively nullifying it whether it gets repealed or not.

The tactic can actually be enhanced by not being caught in some cases. One famous American example: the Boston Tea Party was an act of civil disobedience performed by people who took some care to ensure they would not be caught. It was mostly an act of symbolic politics, but did not involve anyone getting arrested as part of the symbolism: they disguised themselves and escaped with impunity. Anon Y. Mous also mentions the Underground Railroad, another prominent example of civil disobedience explicitly aimed at violating the law without being caught, in that case of the direct-circumvention variety.

In Swartz's case, the goal was simply to release academic papers to the public, producing an actual "fact on the ground", not to make a symbolic protest against intellectual property by going to jail.

The idea that one cannot legitimately protest the law without suffering for it is an oddly puritanical myth that needs to be debunked.

Comment You misunderstand the article (Score 4, Insightful) 265

the article complains at great length that the social sciences are a mistake: they're really veiled branches of philosophy

The article says no such thing:

Value judgments are always at the core of the social sciences. “In the end,” wrote Irving Kristol, “the only authentic criterion for judging any economic or political system, or any set of social institutions, is this: what kind of people emerge from them?” And precisely because we differ on what kind of people should emerge from our institutions, our scientific judgments about them are inevitably tied to our value commitments. But this is not to say that those values, or the scientific work that rests on them, cannot be publicly debated according to recognized standards. . . .

The lasting value of Kuhn’s thesis in The Structure of Scientific Revolutions is that it reminds us that any science, however apparently purified of the taint of philosophical speculation, is nevertheless embedded in a philosophical framework — and that the great success of physics and biology is due not to their actual independence from philosophy but rather to physicists’ and biologists’ dismissal of it.

In other words, physics and biology, just like social science, rely on philosophy: but their normal functioning - what Kuhn calls "normal science" - depends on their disregarding this dependence. When a crisis is reached, however, philosophy becomes central. (I had to read that and the following text a few times to appreciate the important distinction between independence and dismissal.)

Here is Kuhn in the book itself, explaining why competing paradigms are incommensurable. Arguing against Popper's idea of falsification, his point is that the scientific method cannot provide a foolproof way of deciding between them:

No process yet disclosed by the historical study of scientific development at all resembles the methodological stereotype of falsification in direct comparison with nature. . . . anomalous experiences may not be identified with falsifying ones. Indeed, I doubt that the latter exist. As has repeatedly been emphasized before, no theory ever solves all the puzzles with which it is confronted at a given time; nor are the solutions already achieved often perfect. On the contrary, it is just the incompleteness and imperfection of the existing data-theory fit that, at any time, define many of the puzzles that characterize normal science. If any and every failure to fit were ground for theory rejection, all theories ought to be rejected at all times. On the other hand, if only severe failure to fit justifies theory rejection, then the Popperians will require some criterion of “improbability” or of “degree of falsification”.

(Frankly, this is probably a little unfair. Perhaps no falsifying test can be absolutely perfect, but some can come awfully close.) Ultimately, when a paradigm shift takes place it can only be resolved through consensus, not scientific objectivity. Thus the character of a scientific community is central to his inquiry and his theory:

. . . the choice between . . . competing paradigms proves to be a choice between incompatible modes of community life. Because it has that character, the choice is not and cannot be determined merely by the evaluative procedures characteristic of normal science, for those depend in part upon a particular paradigm, and that paradigm is at issue. . . . Each group uses its own paradigm to argue in that paradigm's defense.

The philosopher Jürgen Habermas has also explored the nature of science. He argues that scientific questions are decided on the basis of evidence: but that no objective method can determine what counts as evidence. It is the consensus of the community of scientists that makes this judgement. Thus the fundamental basis for science is not itself scientific. Look at climate science: the existence of a consensus is not incidental to the question of climate change: it is an essential part of how science actually works. Just as science produces theories, not Truths, it depends on consensus about measurements, not objective facts.

On this basis, the social sciences are science. They are not thinly-veiled philosophy: they are empirical. Kuhn suggests, however, that they lack paradigms and are therefore different from the natural sciences. One of the dangers pointed out by the article, and by Popper in particular, is that many social scientists have claimed a scientific objectivity akin to that of physics or biology, ignoring the philosophical foundations of their fields. In that sense you are right (just look at the current state of neoclassical economics - though that is clearly a field subscribing to a very influential paradigm). But the social sciences are not exceptional in this - all sciences have philosophy at their base. Unfortunately, in the rush to extreme postmodernism this was taken by many to mean that science is just another way of knowing, with no special claim to knowledge. That extreme interpretation has abated, however; I suspect climate change is one major reason why.

Comment Article is biased (Score 0) 201

The Liebowitz and Margolis article only considers typing speed. On that basis, it finds a lack of evidence that Dvorak is significantly faster, and substantial evidence that it is only slightly faster (on the order of 5%). More importantly, the article claims that the costs of switching would likely wipe out any gains:

There are several versions of the claim that a switch to Dvorak would not be worthwhile. The strongest, which we do not make, is that Qwerty is proven to be the best imaginable keyboard. Neither can we claim that Dvorak is proven to be inferior to Qwerty. Our claim is that there is no scientifically acceptable evidence that Dvorak offers any real advantage over Qwerty.

However, the article makes no mention of accuracy or repetitive strain. It does claim that Dvorak typists move their fingers shorter distances, which would seem likely to reduce strain (see the toy illustration below).
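
As a rough sanity check on the distance claim, here is a toy Python metric: count how many keystrokes in a sample sentence fall off the home row on each layout. The grid is simplified (no stagger, no finger assignments), so treat the numbers as illustrative only.

    # Toy finger-travel metric: off-home-row keystrokes for a sample
    # sentence on Qwerty vs. Dvorak. Rows are a uniform 3-row grid;
    # stagger and finger assignments are ignored.
    QWERTY = ["qwertyuiop", "asdfghjkl;", "zxcvbnm,./"]
    DVORAK = ["',.pyfgcrl", "aoeuidhtns", ";qjkxbmwvz"]

    def travel(layout, text):
        pos = {ch: row for row, keys in enumerate(layout) for ch in keys}
        # Distance from the home row (row 1) for every typed character.
        return sum(abs(pos[ch] - 1) for ch in text.lower() if ch in pos)

    sample = "the quick brown fox jumps over the lazy dog"
    print("qwerty:", travel(QWERTY, sample))  # 25 off-home keystrokes
    print("dvorak:", travel(DVORAK, sample))  # 17 off-home keystrokes

On this crude measure Dvorak keeps noticeably more keystrokes on the home row, which is consistent with the shorter-travel claim, though it proves nothing about strain.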

In the absence of anything more substantial, I'll fall back on personal experience. I switched from Qwerty to Dvorak 20 years ago on a bet, and have typed Dvorak ever since. I agree with the article's assessment that it isn't a whole lot faster, probably less than 10%. It's probably also slightly more accurate, but I'm really not sure. However, I am convinced that it is much easier on the fingers. I simply don't suffer from the strain I used to with Qwerty. When I have had to be bilingual, as it were, at a client site (sometimes for weeks at a time), I have recovered my speed with Qwerty - and the increased strain along with it.

Liebowitz and Margolis's article is motivated by an economic argument that market entrepreneurs will tend to converge on superior technologies and standards. I am not an economist: but I am a social scientist with some expertise in how innovation is socially shaped, and I don't buy their larger argument. As a scholar, I would point to Trevor Pinch and Wiebe Bijker's classic work on the development of the bicycle, and to philosopher Andrew Feenberg's assessment that technologies do not succeed because they are efficient: they are efficient because they succeed. One of the best examples of this that I know of is the IBM PC, which even as it took over the market was in many ways technically inferior to its competition.

A big problem I see with the Liebowitz and Margolis argument is that they assume typing speed is the measure of technical superiority. In reality, technical debates are often precisely about which criteria are relevant. It may well be that when Qwerty and Dvorak were developed, market actors also took for granted that speed was the correct criterion. But this is exactly the kind of assumption that locks technology into path dependence. Is it more important to maximize speed, or to minimize stress and injury? There is no single objective answer to such questions. One can only claim market efficiency by assuming an answer. Saying "the aggregate choices of market actors decide" is circular logic that avoids the issue - in which case, the evidence Liebowitz and Margolis present about speed is irrelevant anyway.

Comment Actual data: wage disparity is real (Score 5, Informative) 467

The key is comparing apples to apples: not just comparing people doing the same job, but comparing people with the same number of years of full-time experience of comparable quality.

A study that took education, hours worked, and skill into account found that:

Earnings are a function of skill and effort as well as gender. But even after we control for these factors, a relatively large earnings gap between men and women remains. The gender wage gap across the major creative class occupations ranges from $20,000-plus on the high end ($23,400 for management, $24,300 for law, and $26,600 for healthcare occupations), to around $8,000-$10,000 on the low end ($8,700 for education, $9,800 for life, physical, and social science, and $9,900 for architecture and engineering).

Keep in mind that skill is not entirely an independent variable. People who are promoted to more responsible positions have the opportunity to learn from the experience, whereas those who are not promoted don't. In other words, the effects of bias are likely to compound.

So the statistics above may understate the problem. The unadjusted numbers are truly horrendous. For law, men get paid more than twice as much ($138k vs $66k), which seems dramatically out of proportion to slightly more schooling (17.5 years vs 15.6 years) and a significant but not huge gap in hours worked (46.6 vs 40.9 hours - I don't know about you, but I personally find a dramatic drop-off in marginal productivity as hours increase).
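
A quick back-of-the-envelope check in Python, using only the figures quoted above (and assuming the salaries are annual and the hours weekly over a 52-week year), shows the gap survives normalizing for hours:

    # Normalize the quoted law salaries by the quoted weekly hours.
    # Assumes annual salaries and a 52-week working year - illustrative only.
    men_salary, women_salary = 138_000, 66_000   # $/year, from the study
    men_hours, women_hours = 46.6, 40.9          # hours/week, from the study

    men_hourly = men_salary / (men_hours * 52)        # ~ $57/hour
    women_hourly = women_salary / (women_hours * 52)  # ~ $31/hour
    print(f"hourly ratio: {men_hourly / women_hourly:.2f}")  # ~ 1.84

Even per hour worked, men in law earn roughly 1.8 times what women do.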

Notice also the gap in education. Some comments here suggest that education is a domain of reverse discrimination, but that's not the story told by the wage gap.

I must echo the request of others here: if you have evidence to the contrary, please provide it.

Comment You are missing the point (Score 1) 333

Science is based on the belief that there is a real world out there that has properties anyone can discover. What made this world "real" was that these properties did not depend on anybody's opinion, so you didn't have to give a damn about anybody else's opinion of your research either; you could discover the truth yourself, and be right even if everybody in the world disagreed with you.

Now we have social science. It's based on the belief that reality is defined by majority opinion. Naturally, one man's opinion is worthless, and only when a consensus is reached can you state that you know anything.

I'm afraid you completely misrepresent both science and what you call social science (but which isn't). The problem is not whether the world is real: the problem is how we can know what its properties are.

Truth is not self-evident, as you imply. In fact, science does not produce "truths" at all: it produces theories. Scientists gather evidence and construct theories to explain the evidence. This is inductive reasoning: it can never be 100% certain.

Science isn't something "anyone" can do, as you imply: in many cases it takes a lifetime of expert training to be able to assess scientific evidence - and even then, there are honest disagreements and mistakes. Take your field of expertise. Can anyone make sound judgements? Is the common sense of the amateur dependable? I'll wager not.

So, we have scientists evaluating evidence, but they don't all agree. There is always evidence that doesn't quite fit. A scientific theory is never perfect. (If they did agree, if everything fit, then they would move on to something else, because that particular problem would no longer be interesting!) With these scientific experts disagreeing, how are we to decide who is correct?

Consensus. Communication. Agreement does not make things true in the world, but it is the best method we have for trying to judge whose truth is the right one. And it is imperfect.

You have fallen into two errors: First, of believing that once Truth is found that fact can be known and reliably communicated. Second, of believing that the only alternative is to believe nothing is true and reality is the invention of majority opinion. You are wrong on both counts.

Such misunderstandings lie at the root of anti-evolutionary belief, and sustain the conviction that climate change science is a fraud. A non-expert believes he has found the one critical piece of evidence that disproves the consensus, and becomes convinced that this overturns the science. Science isn't calculus. It doesn't work like that.

The debate over evidence and whether it is possible to know Truth is an ancient one, reaching right back to Plato. One of the most important and influential scholarly works of the 20th century (and the source of the term "paradigm shift") is The Structure of Scientific Revolutions by Thomas Kuhn. I highly recommend reading the whole book: every scientist should read it. There is a pretty good recent overview at The Guardian, of all places. (Though the last bit about science being data- rather than theory-driven is bunk. It is both.)

As for social science, fifty years ago it was caught up in the belief that it could discover scientific laws of society akin to Newton's laws of physics. Then in the 1980s and 1990s there was a widespread rejection of this position, which in many cases resulted in an extreme postmodern rejection of science as a special way of acquiring knowledge. Thankfully, both extreme positions have now been widely rejected.
