Comment Re:What's old is new (Score 1) 409
"Manage your own onsite cloud!"
You mean.. a server?
"It's the clouuuuuuud"
"Manage your own onsite cloud!"
You mean.. a server?
"It's the clouuuuuuud"
Clearly it depends on what your specific development interests are, and on how your professor presents the material.
In my course, solving linear systems was maybe the first week, and then on to bigger and better things.
The lecture I heard on eigenvectors was so ludicrously inspiring that I almost couldn't sit through the whole thing without running back to my dorm to write Python. To think that it lets you define an "anchor" in an arbitrary transformation: the transform can only slide points along that line, never off it, so you can interpolate at any point along the eigenvector and create an animation! Also, regarding computer vision / OCR? LA is HUGE.
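For what it's worth, here's a minimal numpy sketch of that anchor idea (numpy and the specific matrix are my picks, not anything from the lecture): a point on an eigenvector stays on the same line under the transform, which is why interpolating along it animates so cleanly.

import numpy as np

# A simple symmetric transform; its eigenvectors are the "anchor" lines.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors
v = eigvecs[:, 0]                     # pick one eigenvector as the anchor

for t in np.linspace(0.0, 1.0, 5):    # interpolate along the anchor line
    p = t * v                         # a point on the eigenvector
    q = A @ p                         # its image under the transform
    # q is just p scaled by the eigenvalue: the transform never
    # pushes the point off the anchor line.
    print(t, p, q)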
Thinking in terms of matrices also enables writing parallel code where another developer would miss the opportunity. If your code isn't performance-intensive then this doesn't really matter, but it's the foundation of why there's a huge population of interest behind CUDA/GPGPUs. And all the hype about functional programming / map -> reduce etc.? The whole point is that it lets you process enormous baskets of vectors (i.e., matrices).
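A quick sketch of what I mean, again in numpy (the shapes and the 10,000-vector basket are made up for illustration): the loop and the matrix product compute the same thing, but only the matrix form hands the whole basket to something a BLAS library or a GPU can parallelize.

import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3))            # some linear transform
vectors = rng.standard_normal((10000, 3))  # a basket of 10,000 vectors

# Element-at-a-time: the version a developer writes when they
# miss the matrix structure.
out_loop = np.array([W @ v for v in vectors])

# Thinking in matrices: the entire basket as one matrix product.
out_batch = vectors @ W.T

assert np.allclose(out_loop, out_batch)    # same answer, parallel-friendly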
The fringes of what I know about computer science always seem to sit adjacent to a linear algebra problem. I'm sorry you had a poor experience with it.
Linear Algebra > all other math classes. I feel like it should be the grade school progression: immediately after basic algebra, learn matrices. Then use matrices for everything after that. Teach the linear algebra method of a thing before you teach any other method.
But then again I'm a zealot.
I agree that what I learned in calculus has been meh. I wouldn't have lost anything if, instead of taking the course, someone had sat me down and said "a derivative is a function that represents the rate of change (slope) of another function, and an integral is effectively the reverse operation."
Maybe it would be different if I were a quant or something?
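To be fair to my own point, that one derivative/integral sentence really is demo-able in a few lines (the function and step size below are arbitrary picks): differentiate numerically, integrate the result, and you recover the original function up to a constant.

import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 10001)
f = np.sin(x)                        # the original function

df = np.gradient(f, x)               # derivative: rate of change (slope)
F = np.cumsum(df) * (x[1] - x[0])    # integral: the reverse operation

# F recovers f up to the integration constant f(0).
assert np.allclose(F + f[0], f, atol=1e-3)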
I never understood the massive emphasis placed on recursion in college. Was it because it's somewhat unintuitive? That must be it, because it certainly isn't taught for being useful.
Using it immediately opens you up to stack overflows, unless your language decides your method is eligible for tail call optimization. Even then, maintainability suffers immensely: if one guy makes a change to your data-processing recursor that disqualifies it from tail call optimization, you suddenly get stack overflows where you didn't before. I have solved thousands of problems where I could have used recursion. Iteration is always better, unless you're going for a solution that lets you stand up in your cubicle and yell "SHAZAM!" after it builds and works.
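Case in point, in Python, which has no tail call optimization at all, the failure takes about ten lines to reproduce (the toy summation is mine):

import sys

def total_recursive(values, acc=0):
    # Even written in tail-call shape, CPython grows the stack per call.
    if not values:
        return acc
    return total_recursive(values[1:], acc + values[0])

def total_iterative(values):
    acc = 0
    for v in values:
        acc += v
    return acc

data = list(range(100000))
print(total_iterative(data))      # fine at any size

try:
    print(total_recursive(data))  # blows past the ~1000-frame default limit
except RecursionError:
    print("stack overflow, as promised", file=sys.stderr)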
Once you fully understand recursion, you also understand why you should hardly ever (epsilon% of the time) use it. It's just a parlor trick colleges use as a tool of discrimination. It works well enough at that, I suppose, but I'd consider using it often a bad habit, like shooting pennies and confetti out of your sleeves during dinner.
Executive Enterprise Super Government contracting. It's the same up here.
Switch to a different provider? Not where I live.
Also, the established players have been colluding in the environment you're describing for over a decade now. If Comcast can't survive "the onslaught that is high speed internet at affordable prices," then they're terrible at business.
"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein