That's a beautiful, passionate, almost poetic response. Sadly, it is completely wrong. None of that stuff ever happens at any significant rate outside of your fantasy life. In fact, I can't recall having heard of any student who ever made any of those claims, although I'm sure that there must be some, somewhere, who have (personality disorders being as prevalent as they are in the general population).

Actually, students have a pretty finely honed sense of fair play. If you flunk them and they deserve to flunk, you rarely get more than whimpering and sometimes begging. Ditto for most other grades and grade boundaries. If you flunk them on a technicality, of course, they will be resentful -- and with some reason. But no, students do not in general argue over grades claiming that the professor is biased against them due to race, creed, color, etc., except possibly in the very rare cases where there is some reason to make the argument.

As for learning analytics -- I have to say that it is (also sadly) mostly bullshit. I don't know about soft subjects, but in things like math and physics:

a) It is painfully, oppressively difficult to find a good objective instrument to measure "learning" at the college level. And I say this as somebody who has used what there is for upwards of a decade. The instruments themselves are badly flawed, and it is impossible to prevent an instructor from teaching to the test if they so desire (indeed, it is difficult NOT to teach to the test if you know what is on it and what weight will be assigned to outcomes in terms of "ranking" teaching/learning performance in the course).

b) There is nothing like standardization of the courses at the level required to build a uniform instrument that might be of some use. In Europe they have such a thing, supposedly, and too bad for them! If Joe gives a wussy, "physics lite" algebraic physics course but Suzie gives a tough, full calculus course covering exactly the same chapters, how do you even compare them? Now imagine comparing them and developing performance analytics when they don't even cover the same chapters from the same book in the same order and with the same basic understanding of the material they are teaching...

Here's a single example of the problems we really do face. I give all of my entering physics students an assessment to determine how much they remember of basic math. A page of algebra. A page of simultaneous equations. A page of differential calculus. A page of integral calculus. A bit of vectors and trig. Nothing difficult as far as calculus goes -- one can manage a typical intro physics course with five -- that's right, only **five** -- integral/derivative rules on board, plus the chain rule/u-substitution, plus the product rule/integration by parts.
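The post doesn't say which five rules those are, but to make the claim concrete, here is one plausible set (written in LaTeX; the specific choice of rules is my guess, not the author's list) along with the two composition rules mentioned:

```latex
% One plausible set of five derivative/integral pairs sufficient for
% intro physics (the choice of rules here is an assumption):
\begin{align*}
  \frac{d}{dx}\,x^n    &= n x^{n-1}, & \int x^n\,dx          &= \frac{x^{n+1}}{n+1} + C \quad (n \neq -1)\\
  \frac{d}{dx}\,e^x    &= e^x,       & \int e^x\,dx          &= e^x + C\\
  \frac{d}{dx}\,\sin x &= \cos x,    & \int \cos x\,dx       &= \sin x + C\\
  \frac{d}{dx}\,\cos x &= -\sin x,   & \int \sin x\,dx       &= -\cos x + C\\
  \frac{d}{dx}\,\ln x  &= \frac{1}{x}, & \int \frac{dx}{x}   &= \ln|x| + C
\end{align*}
% Plus the two composition rules named in the text:
\begin{align*}
  \frac{d}{dx}\,f(g(x)) &= f'(g(x))\,g'(x)
    && \text{(chain rule / $u$-substitution)}\\
  (fg)' = f'g + fg' \;&\Longleftrightarrow\; \int f\,dg = fg - \int g\,df
    && \text{(product rule / integration by parts)}
\end{align*}
```

With just these on board, essentially every integral and derivative in a standard intro mechanics and E&M sequence is reachable, which is the point: the bar being missed is low.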

Every student entering the class is supposed to have passed two full semester college calculus courses. Yet the mean score on the assessment is around 50%, with plenty of students scoring as low as 15 to 25%. And these are bright students at a very good university.

Forget "analytics". The problem is deep, not shallow. It isn't going to be solved by improved statistics on more tests.

rgb