The post by "ennuiner" is the only post that makes any sense.
The real issue (original post notwithstanding)
is not lectures, but learning. Many posters point out
that lectures aren't the best or only way to learn.
They're right, but that doesn't rule out lectures as a valid
mode for many. Instructors should include other modes as well.
Google "millenial teaching styles" for a start, eg.
http://www.elearningmag.com/ltimagazine/article/ar ticleDetail.jsp?id=262368The good news is that funding agencies like NSF are on the
bandwagon, and some profs are struggling to change (eg, me.)
This discussion shows how skewed Slashdot's contributors
are toward nerdy, arrogant know-it-alls, missing the point
of a college education. I guess most of them SHOULD have
"gone to" the U of Phoenix, or just sat at their computers
and downloaded an "education."
(Isn't this what they call "autism"?)
Passive listeners in lectures often don't learn much;
if you do, you're lucky.
If you have a chance to talk with other students
and your profs about ideas and specific problems,
you'll likely learn a lot more.
Back in the day, I had to go to the "computer lab"
to use a terminal, and being able to ask somebody
a question when I was stuck made a big difference.
If the college admins would just get a clue,
the profs wouldn't be forced to spend so much
effort "lecturing", and could get on with actually
teaching people. The real learners are the students
involved in research. If all students could be
in the position of a research assistant, they'd learn
something. Those who aren't up for it should maybe
consider a trade, or "business school."
One-size-fits-all (i.e., the traditional lecture format)
resonated with the industrial age and social Darwinism.
We need narrowcasting of tailored content with
interaction for most people. The autistic nerds
will find their own way. We are leaving behind
a huge population by traveling the well-worn path.