It's been about ten years since my TAs and I taught the lab section of the advanced digital logic design course at my university. I agree that, generally speaking, VHDL is a better teaching language than Verilog. Part of the reason is that Verilog, being much like C, is inherently procedural. You don't want to think procedurally about digital logic except in the specific case of state machine design, and even then you have to account for concurrency. That fundamental concurrency of HDLs is the key to designing effectively. I can define twenty clocks driving twenty counters, just as I can wear twenty watches on my arm and have them all keep time independently and at different speeds. You can't really do that in a procedural language unless you turn it into a thread-scheduling exercise, and even then you will never match the speed of digital logic, because instruction fetch, instruction decode, and so on introduce latency that cannot be eliminated even on a multi-core CPU. Not thinking procedurally will help, and in my opinion the strong typing of VHDL over Verilog will help greatly. The Karnaugh maps you mention are fine to learn, but in VHDL a case statement makes state machine design trivial, especially when you have more than 8 states.
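To make the case-statement point concrete, here's a minimal sketch of that style: a hypothetical three-state controller (the entity, ports, and state names are all illustrative, not from any real design). Note that the two processes execute concurrently, which is exactly the non-procedural mindset described above.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity traffic_fsm is
  port (
    clk   : in  std_logic;
    rst   : in  std_logic;
    go    : in  std_logic;
    light : out std_logic_vector(2 downto 0)  -- one bit each: red/yellow/green
  );
end entity;

architecture rtl of traffic_fsm is
  type state_t is (S_RED, S_GREEN, S_YELLOW);
  signal state, next_state : state_t;
begin
  -- Sequential process: the state register (the only clocked part).
  seq : process (clk, rst)
  begin
    if rst = '1' then
      state <= S_RED;
    elsif rising_edge(clk) then
      state <= next_state;
    end if;
  end process;

  -- Combinational process: next-state and output logic via a case
  -- statement. No Karnaugh maps needed; synthesis derives the logic.
  comb : process (state, go)
  begin
    next_state <= state;  -- default: hold state
    case state is
      when S_RED =>
        light <= "100";
        if go = '1' then
          next_state <= S_GREEN;
        end if;
      when S_GREEN =>
        light <= "001";
        if go = '0' then
          next_state <= S_YELLOW;
        end if;
      when S_YELLOW =>
        light <= "010";
        next_state <= S_RED;
    end case;
  end process;
end architecture;
```

Adding a fourth or fourteenth state is just another enumeration literal and another `when` branch; the synthesizer re-derives the logic each time, which is why the technique scales where hand K-maps don't.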
Beyond HDLs, however, are FPGAs and ASICs (and I've designed with both). Setting aside the differences between FPGAs and ASICs, an FPGA has some very specific ties to its vendor because of the way the device is architected. I/O assignment, synthesis, and above all the timing constraints that guide the FPGA "map, place, and route" tools are things you won't learn from VHDL alone: clock domain frequencies, max/min delays, input/output delays, false and multicycle paths, setup and hold times, and the worst-case timing paths in the design. These are essential to digital design, but they are not part of the HDL at all (see the Synopsys SDC format for more info).

In fact, shell scripts, sed/awk, Perl, TCL, Scheme, and Python are also essential to know, because scripting glues the various tools together; processing text files, tailing log files, and batching jobs can be critical to being efficient, as is being thorough about log-file warnings, errors, and timing reports. Electronic Design Automation (EDA) tools also have their own idiosyncrasies, and you'll need to develop a stable reference front-end and back-end design flow if you haven't already. Do you use an Altera or Xilinx reference board, or an add-on PCI-based FPGA card? How do you analyze what's coming and going at the interface? All of these questions need answers before you really get going on FPGAs. ASICs are an order of magnitude more complicated, for reasons I won't even get into; it just gets harder. And those state machines you created without K-maps will carry synthesis pragmas that direct the compiler to generate the appropriate encoding (e.g. one-hot for performance, Gray code for lower power, etc.).
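For a flavor of what those constraints look like, here's a hypothetical SDC fragment (SDC files are Tcl-based; every port, pin, and clock name below is made up for illustration, and exact command support varies by tool):

```tcl
# Two clock domains at 100 MHz and 25 MHz
create_clock -name sys_clk  -period 10.0 [get_ports clk]
create_clock -name slow_clk -period 40.0 [get_ports clk_slow]

# Board-level I/O timing relative to sys_clk
set_input_delay  -clock sys_clk 2.0 [get_ports data_in]
set_output_delay -clock sys_clk 1.5 [get_ports data_out]

# Paths between unrelated domains: don't time them
set_false_path -from [get_clocks sys_clk] -to [get_clocks slow_clk]

# A path that is allowed two clock cycles instead of one
set_multicycle_path 2 -from [get_pins u_div/result_reg/Q]
```

None of this lives in your VHDL; it rides alongside the design and steers synthesis and place-and-route, which is exactly why the HDL alone doesn't teach it.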
Finally, there's the work world. As other posters have mentioned, North America is primarily focused on Verilog while much of the rest of the world uses VHDL. Most synthesizable IP cores for various functions come as Verilog. So the truth is you should know both major HDLs, but you would be better off being proficient in Verilog in the real world, for the simple reason that it (or at least its successors, such as SystemVerilog) is the present and the future. Also, in the work world it's critical to know the major EDA vendors' software and to put it on your resume (i.e. Mentor Graphics, Synplicity (for FPGA), and Synopsys, in roughly that order, plus Cadence and Magma for ASIC), as well as all those scripting and other languages like Perl and TCL that I mentioned. Don't completely ignore VHDL, however.
As an ironic point, there are SystemC compilers for hardware that are becoming more and more important in large-scale development for video algorithms and the like. The results tend to be difficult to formally verify (i.e. to prove the C input matches the Verilog output) and are often inefficient, or even not realizable in physical design, so you may need to go in and modify pieces heavily to close timing. At your level, and given the class of devices you're working on, it's quite premature to consider this, and I would strongly recommend focusing on the HDLs first. In fact, I don't even know whether any university program teaches that software.
My best advice here is to focus on learning digital design and VHDL first, then Verilog, then see how it all applies to an FPGA kit from a major vendor (e.g. an Altera Cyclone, or maybe even a MAX CPLD, or a Xilinx Spartan), then learn some of the other industry-standard EDA tools that work with the vendor's own (e.g. the Synplicity FPGA compiler; the Mentor ModelSim simulator for test benches and back-annotated, i.e. timing-aware, simulation; and possibly some formal verification tools as well). This topic is way too big for even this post, and I feel like I'm swatting at a cloud of flies trying to get rid of them, but knowing FPGAs will greatly help your career, since advances in process technology are making FPGAs more SoC-like and cheaper all the time.
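Since test benches come up in that flow, here's a sketch of a minimal self-checking bench of the kind you'd run in ModelSim. It assumes a hypothetical DUT entity named `traffic_fsm` with `clk`/`rst`/`go`/`light` ports; substitute your own design.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity tb is
end entity;

architecture sim of tb is
  signal clk   : std_logic := '0';
  signal rst   : std_logic := '1';
  signal go    : std_logic := '0';
  signal light : std_logic_vector(2 downto 0);
begin
  -- Free-running 100 MHz clock; runs concurrently with everything else.
  clk <= not clk after 5 ns;

  -- Hypothetical device under test.
  dut : entity work.traffic_fsm
    port map (clk => clk, rst => rst, go => go, light => light);

  stim : process
  begin
    wait for 20 ns;                -- hold reset for a couple of cycles
    rst <= '0';
    wait until rising_edge(clk);
    wait for 1 ns;                 -- let combinational outputs settle
    assert light = "100"
      report "expected red after reset" severity error;

    go <= '1';
    wait until rising_edge(clk);
    wait for 1 ns;
    assert light = "001"
      report "expected green after go" severity error;

    wait;                          -- stop; end the run from the simulator
  end process;
end architecture;
```

The same bench drives a behavioral simulation first and a back-annotated (timing-aware) one later; only the netlist under `dut` changes, which is what makes a stable reference flow so valuable.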