Comment Re:Troll... (Score 2) 361
No one is allowed a personal database without IT approval?
If someone needs to do "real work" then I think MS Office is the wrong tool. Word is ok for doing your homework but it struggles when trying to deal with 1000+ page documents.
Yes, I would not get MS Office for home if I had to pay for it. It's occasionally nice to have but not worth the money. I don't write documents, don't use spreadsheets, and certainly no presentations. Once every few years I update the resume, and if I didn't have open office I'd still be updating it in raw postscript.
Hmm, programming for a product is generally managed by the software group or the engineering group or the product group. Even if we're just talking about a web site I usually see those developed separately from the IT group.
No I don't. Police could call in missile strikes now if they could and it was practical. Drones add nothing new.
Ted, is that you?
In the US we don't divide the curriculum based on what we think the students will do in the future. Some countries do, however: they may have a pre-college high school separate from a trade-oriented high school. In the US, though, we give the same education to everyone, rich or poor, with educated or uneducated parents alike. So we do want to teach good math and science to everyone, because you can never predict who will need it in the future. Over time, perhaps, the student decides on their own that they can't handle the college track.
I think this is in conflict with many corporate leaders who would prefer that schools just churn out a compliant and viable work force.
"IT" is not "programming", so "IT Project Management" should have nothing to do with managing programming projects. IT is generally service-level work keeping a computer information infrastructure running, which may involve some programming. But programming itself is very often done outside the IT group. The iPhone was not designed and programmed by an IT department; an IT department did not program the robotic controls for the Curiosity rover. People should stop using "IT" synonymously with computing and programming.
I think mismatched skill sets are often a matter of perception. If someone wants more money than someone else, and at the same time needs some tiny amount of adjustment, then the employer says that person is not a match for the job. A good programmer can program in ANY language! If you can do C++ you can pick up C very quickly even if the resume doesn't show it, and you can most likely get up to speed with Java quickly as well. HR won't believe it, though; they want to see keyword matches. A person with actual EXPERIENCE will adapt to whatever you throw at them. After all, most jobs today are changing constantly, so the employee needs to change and adapt too. But the person with experience is also the person likely to have the higher salary requirements or moving expenses, so HR is much more likely to say "not an exact fit".
And almost everyone needs retraining. No one is ever a perfect fit for a job unless they were a former employee. Everyone needs to learn the processes of the group, everyone needs to figure out the corporate culture, everyone needs to learn the specific code base or tools involved, etc. The jobs that involve little retraining tend to be cookie-cutter jobs (i.e., anything where the requirements involve a Microsoft certificate), which unsurprisingly turn out to have a higher percentage of H1B workers.
I have seen some positions go to H1B workers when others with essentially the exact match to the job descriptions were denied the jobs. I have seen unqualified H1B workers being hired. This is being actively abused.
If you use an abacus, a very ancient device, these ideas come naturally. Even though the abacus is just a calculator it does not hide things from the user. With a digital calculator you essentially have a function that turns two numbers into a third number. Similarly with slide rules, you could do multiplication by using logarithms and it wasn't hidden from you. Further with the slide rule you got a very good feeling for the scale of the inputs and outputs, how you got more precision on one end and low precision on the other end. Whereas today I see programmers not understanding the basic concepts of precision and scaling and they treat floating point numbers as magic entities.
Are we going to have problems with calculators that are unable to display how many digits of their floating-point results are accurate, so that the student thinks all 10 digits printed out are precise? I see programmers who don't understand this stuff.
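To make the precision point concrete, here's a minimal sketch (Python chosen purely for illustration) showing that the digits a machine prints are not all meaningful:

```python
import math

# 0.1 has no exact binary representation, so small errors creep in
print(0.1 + 0.2)              # 0.30000000000000004
print(0.1 + 0.2 == 0.3)       # False

# Naive summation drifts; compensated summation (math.fsum) does not.
# Every printed digit *looks* authoritative either way.
print(sum([0.1] * 10))        # 0.9999999999999999
print(math.fsum([0.1] * 10))  # 1.0
```

The calculator (or interpreter) happily prints all those digits with no hint of which ones are trustworthy.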
Yes, at a certain point you can stop learning the arithmetic. In college you don't have to do long division anymore. But in college you should understand the concepts behind long division well enough that it's internalized.
I think students need all of it. Learn how to do the arithmetic by hand, plus learn the formulas, plus learn what the formulas mean. If you focus on just one area the student will lose out.
We need both. If you can't calculate by hand you don't actually learn many vital concepts. For example, how does a computer do division? The same way a person does division. The person who designed the division circuit in a chip couldn't have done so without first knowing how to do it the long way. Even if you're not designing a chip, you still learn some fundamental concepts by understanding the process.
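For illustration only, here is a sketch in Python of binary long division: the same "bring down a digit, compare, subtract" procedure taught by hand, and essentially what a simple hardware divider does (the function name is my own):

```python
def long_divide(dividend, divisor):
    """Schoolbook long division, one binary digit at a time."""
    assert divisor > 0 and dividend >= 0
    quotient, remainder = 0, 0
    for bit in reversed(range(dividend.bit_length())):
        # Bring down the next digit of the dividend
        remainder = (remainder << 1) | ((dividend >> bit) & 1)
        quotient <<= 1
        # Does the divisor "go into" the partial remainder?
        if remainder >= divisor:
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

print(long_divide(57, 7))  # (8, 1): 57 = 7 * 8 + 1
```

The loop body is exactly the pencil-and-paper step, just in base 2 instead of base 10.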
Do you even understand what 7% tax means if all you know is how to plug numbers into a calculator? If you just use a calculator, maybe everything ends up being opaque magic numbers that have no meaning. Do we want future workers to all be like the dysfunctional programmers of today, who can't write a basic algorithm without copying it from somewhere else and can't do any programming more involved than gluing pre-existing modules together?
How do students even know when the calculator or computer is wrong because it got bad input? If they punch in a wrong number and the computer says 7% tax on $57 is $8, will they recognize that it is wrong?
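The check itself is trivial; here is a sketch of the mental estimate a student should be able to make, written out in Python just to spell out the reasoning:

```python
price = 57.00
rate = 0.07
tax = price * rate
print(f"7% tax on ${price:.2f} is ${tax:.2f}")  # $3.99

# Rough mental bound: 7% is less than 10%, and 10% of $57 is $5.70,
# so an answer of $8 should immediately look wrong.
assert tax < 0.10 * price
```

No calculator needed to spot the bad answer, just a feel for what 7% means.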
In the past people used slide rules, and I think those had a big advantage: you got a feel for what logarithms were. I think calculators have taken that away, so you have a much vaguer idea of what logarithms mean or how they relate to multiplication and so forth. And that's for an advanced idea like logarithms; if we simplify away even basic arithmetic, how much more do you lose?
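The slide-rule trick is just the identity log(a*b) = log(a) + log(b); a small Python illustration of the principle:

```python
import math

# Multiplying on a slide rule means adding lengths proportional to logs.
a, b = 3.0, 7.0
length = math.log10(a) + math.log10(b)  # "slide" the two scales together
product = 10 ** length                  # read the result back off the scale
print(product)  # ~21, to within the scale's (here, floating-point) precision
```

On a physical rule the precision is limited by how finely you can read the scale, which is exactly where that feel for significant digits came from.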
And yes, pragmatically speaking not everyone needs to know the hard stuff, and most of these "let's do education differently" programs are really about making better factory-floor workers or IT service workers. That's not necessarily what we want from education.
I've been at a computer store where the network was down, and the staff was barely functional, unable to do basic arithmetic or bookkeeping. They had one person with a calculator, one person reading instructions, and one person reading off the price of each item. Today I very often find people who cannot do arithmetic giving me the wrong amount of change.
It's a mix. Most students ultimately just do the minimum necessary to get through college. They'll never really stand out in their jobs, though sometimes they get promoted to management, where they do the least harm. However, there are students who get quite a lot out of their education and who do learn the material. Even if it turns out they never use combinatorics on the job, the fact that they took the time to learn it well has helped them train their brains (this applies to just about every class where the bottom half of the students say "this stuff is useless").
Yes, students occasionally get the class taught by a poor professor who's just droning from a book. But that does not happen in every class, not even most classes, and yet people still point to it as an example of why things are broken. A student must put in their own effort, especially in upper-division classes, and not rely on the professor to spoon-feed entertaining education.
The old college system is still just fine for new tech; after all it was people who were educated in the old college system who created that new tech. Now if you just need grunts to keep the new tech working then DeVry or ITT Tech is fine for churning those out in large numbers. Most IT jobs are just grunt jobs anyway. If you don't want to do IT though and instead want to create something new or expand the field, then you will need a good education (and no naive pointing to Gates and Zuckerberg as your dropout role models).
I started school in 81, and there were a LOT of people there expecting it to be the high paying easy-retirement job of the future (well, their parents thought that anyway). There were students in the first programming class, which was not that hard and was even self-study, who were clearly not suited to the profession. But almost all of them insisted that they had to pass the class and that they could not change majors. That attitude just has not changed much.
The real weed-out courses didn't come until the upper-division classes, really. I don't think even assembly got rid of too many people, since you could work on that in teams, and intro to numerical analysis wasn't a big hurdle for those already taking tons of science and math classes. It's a much bigger shock to the system when your weed-out courses come a couple of years after starting school, and at that point changing majors really is a major upheaval.
I always liked the MIT approach of putting this up front in their Scheme curriculum that covers some theory and data structures and algorithms all in the first year.
Things are not as simple as they seem at first. - Edward Thorp