There, fixed that wonky headline for you. I suspect you were posting with the new OS X Yosemite and just couldn't read what you were typing?
The hasty closures and haphazard deaccessioning of collections that represent decades of substantial taxpayer investment are entirely the opposite of what conservatives claim to value: careful custody of a nation's heritage and of citizens' investment. (Canada's federal government is in the control of the Conservative party, hard at work muzzling the scientists supported by our tax dollars.)
From The Tyee's December 23 story on the topic, "What's Driving Chaotic Dismantling of Canada's Science Libraries?": Moreover, records on library usage were overtly biased because they were based on who asked for help, said Burton Ayles, a retired director general for DFO who lives in Winnipeg and has used the Freshwater Institute library frequently.
"Most people that come in to the library don't have to request help. They just use the material. Just look at any regular library."
The catch-22 is that blogs, board posts and the like are rarely taken seriously. Students often fall back into the bad habits of their personal writing or rush to push out the mandated word count at the last second, with little reflection. Their posts can be incoherent and are commonly littered with typos. They hardly appear to be the work of an aspiring professional, but getting students to understand and care about those distinctions isn't easy. Does your assignment give any sense that the posts will be assessed as compositions and not just as tasks to complete?
Not every professor puts a priority on building writing skills. Maybe yours doesn't in this course but you can still, in your own posts, model a clear and professional communication style. See if there are courses or workshops in disciplinary writing. Talk to faculty in your field about whose writing they admire or what models are useful for aspiring practitioners. (Hint: it's often not the most esteemed journals!)
There's also no evidence that human societies are comparable enough for the same prediction (or solution) to work from one to another. Some of the great attempts at comparative history that I mentioned above, such as the Tillys' work on the Rebellious Century or Skocpol's comparison of France, Russia and China, have fallen apart when subjected to closer scrutiny. Maybe if we had all the data ever created and the supercomputers to crunch it all, we could reduce human experience to a mathematical model, but that sounds a bit ridiculous. Most historians would suggest that humanity would benefit a lot more from some other endeavours. Like preserving, restoring and studying some of those degrading documents...
So all of those lovely charts about crime rates over time start to look a lot less reliable when you poke into the data at the base level. When the definition of a particular crime changes, that has to be taken into account. What do we mean by infanticide? What did people in 1800 mean by the term? How many people really WERE in that gathering, assembly, march or riot? Were they there for the reasons that the official accounts claim or was it something different? How do you know?
I don't think any historian has a big objection to analysts drawing direct parallels between what they see today and certain events in the past. You're right that those kinds of insights are useful (and they're the fruit of a lot of sociological, political science and economics research that's highly respected for good reason). It's when someone starts to talk about inevitable highs and lows or natural rules of society that we get antsy, because that puts a lot of faith in data that we've been trained to approach with caution and in broad pictures of humanity that we're always diving into and reworking at more basic levels of analysis.
Second is the problem of complexity. There's so much data that's not recorded or doesn't survive, even in our modern age, and more so the farther back you go. Researchers who try to connect two seemingly related phenomena in different circumstances end up tripping over the differences they've discounted, ignored or simply don't see until someone points out the flaw. For current-day society, historians would ask what events and developments aren't even appearing on the radar of contemporary analysis.
Historians still generalize, but under limited terms (comparing, say, the decline of industrial production in similar cities/cultures, or pointing out similarities between the adoption of or opposition to Reformed religion in early modern European states). Even then, they often get hammered by colleagues who point out key differences they argue have been overlooked in the analysis. Given that critics can shoot holes in any comparison, especially as it broadens, historians have learned to be very careful about generalizing from the past. We'd obviously be very skeptical of someone who claims to have drawn a universal lesson of human behaviour (and nothing in Turchin's work has really made an impact on historical scholarship, even though his book has been out for almost a decade and he's been promoting his cliodynamic theories for even longer).
This skepticism towards positing rules of human social behaviour is probably why the other social scientists don't always feel we fit in with their disciplines and keep putting us back over into the humanities.
Historians tend to say "You're cherry-picking your data" when the cliometricians or cliodynamists come to town. They're taking material from one set of circumstances and missing another, conflating events that aren't equivalent or sometimes simply misreading things that have changed dramatically over time. (Did you know that up until the 19th century in Europe, it was an established belief that women were sexually insatiable and utterly physical creatures, incapable of ruling their baser emotions, while men were more spiritual and disinterested in sexual or physical matters? Nowadays, of course, western societies pretty much put it the other way around. But if you read something about woman's 'nature' in a fifteenth-century source, it's coming from a wildly different assumption than a 21st-century individual might expect. The same kind of 'false friends' exist in everything from laws to agricultural practice to religious activity. Nothing stays the same, even when people are trying to conserve practices!)
Historians also tend to emphasize how behaviours are socially constructed and how those constructions change over time. So, for us, the claim that the same rule would hold in 1830, 1930 and 2030 is hard to believe. That it would hold for all societies in some cycle of development? Again, tough to swallow. I've looked at some of the material they're putting forward and it's not rocking my world yet, nor is it making a big impact on other historians, even though Turchin's book has been out for years now.
With regard to Turchin, Dewey and Kondratiev and their long-cycle models, I'd have to say that most historians would side with the doubters. Too little is attributed to human agency when matters are described as cycling up and down in response to "natural laws". Never underestimate the power of a few well-placed individuals to screw everything up at unpredictable moments!
It's not inevitability, it's incidence. Social, technological or political change takes a long time to shake out. While we can talk about patterns, it's never a good idea to believe they're natural forces dictating the future. Instead, we'd talk about patterns in terms of parallels: see, when the printing press was developed and popularized, these are some of the effects that it had, direct and indirect. How can that inform our understanding of how the internet is changing modern economies, culture and education, say? But nobody who tells me something like, see, it took seventy years from the advent of printing in Germany for Luther's 95 theses to be circulated, so there's a cycle for you!, is going to convince me they have a useful scheme. I'd say that such pie-in-the-sky claims are counterproductive.
Turchin hasn't engaged historians nearly as much as he has economists, and his interest in predictive modelling fits better with that field than with history, whose chief concern remains accurately documenting the past rather than claiming we can predict the future. That didn't work so well for Fukuyama, now, did it?
In the mid-twentieth century, cliometrics (ah, look how much it reads like cliodynamics!) was going to save us all from the loosey-goosey styles of history that just weren't as good as honest-to-gosh social science. (This is why many mid-twentieth-century universities placed history in their social science faculties rather than in the humanities, where it was categorized in older university systems.) Certainly, learning how to handle large data sets and tackle questions of change over time with accurate analysis has been good, but stats wasn't a silver bullet for settling historical debates. Look how hard some of the great works of cliometrics crashed and burned when they tried to assert a grand rule of human behaviour. Two examples off the top of my head: the Tillys' "The Rebellious Century, 1830-1930", which tried to unify the study of European revolutions over a century, and Theda Skocpol's "States and Social Revolutions: A Comparative Analysis of France, Russia, and China", which claimed you could produce a universalizing analysis of authoritarian state collapse. Both are interesting and ambitious books but ultimately unconvincing in their attempts to assert a general rule-set for history.
Now we're told that cliodynamics is going to solve the problem. Again, as the original article notes, most trained historians are skeptical. It's not just that we like futzing around with old documents, it's that we're aware of the weaknesses in ongoing research, holes in observations and the biases in the data. You want to point to huge amounts of populist violence in the U.S. circa 1920 as proof that it was a high in a fifty year cycle? I and other historians can point to stunning outbreaks a decade earlier related to the anarchist movements and a decade later with the unrest regarding the Great Depression. It's not so much cherry-picking counter examples: it's the wrongheaded concept of seeing people as pawns of historical forces. Asimov was fun to read, I'll grant you, but I'd hope that people can see that human agency has an awful lot more to do with historical change than the rules of psychohistory.
Stop looking for general rules of what's going to come next and consider, instead, clear-sighted analysis of how we've come to where we are and what that tells us about problems we've had and continue to experience.
If you're going to pedant, pedant all the way!
I'll likely follow your suggestion for my own class. I have six weeks in which to get 60 history majors excited about and reasonably adept at statistical analysis; a capstone project in which they present their research with their own visualization of the data would be a great way to finish the unit.
I know - I teach a stats module as part of my sophomore course for majors. They learn how to read, interpret and critique statistics in articles in their field of study. Did you know that most of them don't know how to read and interpret statistics? The number of students at the start of the course who tell me they don't stop to read the charts because "they'll never understand them" is staggering. Statistical literacy should be the bedrock skill you inculcate. Show them good and bad uses of statistics. Teach them to figure out when someone's playing fast and loose with figures, hoping to fool readers. That will build their confidence and their thirst for knowledge.
My students go on to create their own time series and other statistical outputs from a dataset that they all find fascinating. (I use the Old Bailey Online for this, a website with material in statistically manipulable format for almost 200,000 trials at London's major criminal court: almost everyone finds the history of crime at least a little bit intriguing, and so they will persevere a bit more when they run up against problems or roadblocks.) Don't waste a lot of time throwing new theories at them - make sure that every new concept you introduce is tied to something they'll want to, and be able to, explore.
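To give a flavour of the exercise, here's a minimal sketch of turning per-trial records into a yearly time series. The records below are invented stand-ins for illustration; the real Old Bailey Online export has a different layout and far more fields.

```python
# Minimal sketch: from per-trial records to a yearly time series.
# The data here is invented, not actual Old Bailey Online material.
from collections import Counter
from datetime import date

# One tuple per trial: (trial date, offence category).
trials = [
    (date(1750, 3, 1), "theft"),
    (date(1750, 7, 15), "theft"),
    (date(1751, 1, 20), "assault"),
    (date(1751, 6, 5), "theft"),
    (date(1751, 11, 30), "fraud"),
    (date(1752, 2, 14), "theft"),
]

# Trials per year: the raw material for a time-series chart.
per_year = Counter(d.year for d, _ in trials)

# Share of theft prosecutions per year -- exactly the sort of figure
# whose meaning shifts if the legal definition of "theft" changes.
theft_share = {
    year: sum(1 for d, o in trials if d.year == year and o == "theft") / n
    for year, n in per_year.items()
}

for year in sorted(per_year):
    print(year, per_year[year], round(theft_share[year], 2))
```

Even a toy like this lets students see both the mechanics (counting, normalizing) and the interpretive trap: the second series is only meaningful if "theft" meant the same thing across the whole period.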
Sure, some won't want to try. They'll find the work too hard or uninteresting no matter what you do. But others will be able to master this if you make it clear both why they need to learn certain techniques and how to apply them, and give them some clear, jargon-free walk-throughs. Exercises they can tackle, tied to the fields they already find interesting, are a great way to keep them motivated.
Look at some of the stats textbooks directed at your university's social science fields - see what elements they emphasize as important for psych, poli sci, etc., and then decide how you want to incorporate those key elements into your own teaching. Avoid getting too tied to teaching a particular software package - make sure they understand the concepts well enough to apply them in any tool.
Good luck - you're tackling what many consider a thankless course but one which can help to change students from math-phobic and fearful to at least statistically literate and confident that they can understand and apply some basic skills in the field as they go on in life.
No, they haven't, but they're mandated by law to administer these tests, and the law then uses the results to justify firings and closings. If you put a piece of cheese into a maze and deposit a very hungry mouse at the start, are you surprised when it gets through as fast as possible to reach the food? The same goes for underfunded, or even adequately funded, schools whose staff know their future rests upon the test.
Every time you hear a politician demanding new types of accountability and more evidence of outcomes in schools, colleges and universities, know that what they're really saying is that they're putting yet another unfunded or underfunded mandate upon the education system. Good educators, and there are plenty out there, aren't seeking to hide their achievements. But every time you go along with political schemes like "No Child Left Behind", you add a new standardized test (created and assessed by a for-profit company that will also sell your schools the textbooks and prep materials needed to ensure student success).