The latter approach is the better one in the long term.
You are expected to learn, by yourself, any programming language you
actually need to use. The university gives you the basics, a broad
overview, and the tools you need to learn those languages.
It is also a common misconception that you'll be employed immediately
after finishing university. No, really not. University does not prepare
you for the workplace. You need more real-life experience than that,
which is why universities (I don't know about America, but over here
they do) require at least one internship to be passed before the
diploma can be earned. You will learn real-life work skills there, and
(in the case of a programming job) get real-life programming experience,
which is different from merely learning a programming language.
Trust me when I say that, with some shell languages such as mksh, you
can both _script_ and _program_ in shell; that is how big the
difference is. Most people never reach the latter level.
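To illustrate what I mean (this is just my own sketch; the function
name and the log file paths are made up for the example): a throwaway
script is a single pipeline, whereas programming in mksh means
functions, typeset-local variables, arrays and explicit error handling:

  #!/bin/mksh

  # "Scripting": a quick throwaway one-liner, no structure at all.
  grep -c error /var/log/messages

  # "Programming": reusable functions, local variables, arrays,
  # and explicit error handling.
  function count_matches {
          # count the lines in file $2 that contain the pattern $1
          typeset pattern=$1 file=$2 line
          typeset -i n=0
          [[ -r $file ]] || { print -ru2 "cannot read $file"; return 1; }
          while IFS= read -r line; do
                  [[ $line = *"$pattern"* ]] && (( n++ ))
          done <"$file"
          print -r -- "$n"
  }

  # hypothetical log files, purely for illustration
  set -A logs -- /var/log/messages /var/log/syslog
  for f in "${logs[@]}"; do
          print -r -- "$f: $(count_matches error "$f") matching lines"
  done

The point is not the example itself but the shift in style: once you
use functions, locals and arrays, you are designing a program, not
gluing commands together.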
I'd suggest you use the university time to _really_ get to know the
basics of as many languages as you can, including functional and
other "weird" languages. Gwydion Dylan, Haskell, LISP, you name it.
Then, do your assignments in various languages for play; you'll find
out which ones you like/dislike and which ones are better/worse suited
for the task at hand. Bonus points (to you only) if you do some of the
assignments in two languages (probably using *different* algorithms:
tailor the algorithm to the language used, not to the theory related
to the assignment, as academics would have you do).
Note that I know both the academic world and that of "craftsmanship";
I've seen both sides.