
Comment Re:This is what the polls say (Score 1) 739

I've heard this before, but the problem is that the items referred to, and somewhat spun in a positive light, are cherry-picked. In particular, they don't talk about the "items" of how this all gets paid for (more taxes, more deficit, money taken out of Medicare, etc...) or the general downstream ramifications of the law. Sure, there is partisanship going on, but it's not what this study suggests.

Comment Agile requires certain assumptions... (Score 2) 133

...to be met, like team members being completely interchangeable, and having enough prior art to reasonably predict every aspect of the effort. If that's not the case (say, in an R&D project where certain people are specialists in certain areas), this method does more harm than good. My best suggestion is to let the nature of the project choose the method, rather than the other way around.

Comment was in your exact position.. (Score 2) 479

..several years ago, without the C++ experience. I was applying for a good 5+ months. I was fortunate to get hooked up with a research institute associated with the university for a year doing more grunt-work-type stuff while I finished up my dissertation. It gave me some experience in image processing, which IMO is one of the most in-demand fields to get into if you want to stick with industry research. That experience, on top of the Ph.D., is what got me my current position.

Comment Not only this... (Score 1) 348

...but it also puts more and more pressure on principal investigators to color their conclusions in the direction of whatever is currently trendy in the eyes of the grant reviewers in that field in order to get future grants. It's not good.

Submission + - NSA Phone Program Likely Unconstitutional, Federal Judge Rules (huffingtonpost.com) 3

schwit1 writes: A federal judge ruled Monday that the National Security Agency's phone surveillance program is likely unconstitutional, Politico reports.

U.S. District Court Judge Richard Leon said that the agency's controversial program, first unveiled by former government contractor Edward Snowden earlier this year, appears to violate the Constitution's Fourth Amendment, which states that the "right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated."

“I cannot imagine a more ‘indiscriminate’ and ‘arbitrary invasion’ than this systematic and high-tech collection and retention of personal data on virtually every single citizen for purposes of querying it and analyzing it without judicial approval,” Leon wrote in the ruling.

The federal ruling came down after activist Larry Klayman filed a lawsuit in June over the program. The suit claimed that the NSA's surveillance “violates the U.S. Constitution and also federal laws, including, but not limited to, the outrageous breach of privacy, freedom of speech, freedom of association, and the due process rights of American citizens."

Comment Re:Encountered this kind of thing ... (Score 1) 204

It could, but the question is should it.

Your complaint about C++ is like a CGA not knowing how to use Simply Accounting. You got a degree in computing science, not C++ programming. Yes, C++ programming is useful, and yes, you might have wanted to take a course in it, but it's not related to the degree.

And who says a CS degree has to be defined this way? My point is that if we re-define it to include more on-the-job elements, the overall value of the education will go up. Any time you have an organization (not just in higher education) be the primary evaluator of itself, over time you will get policies and groupthink that are more and more out of touch with reality.

True, but industry wants the other extreme, and nothing will satisfy them. Bottom line: if they want an employee who can do X, they can train that employee. That's how it works in other industries.

I think there's an easy middle ground to shoot for here. And note that industry is the one providing the jobs, not the school, so it should intrinsically be in the driver's seat more than the school is.

In the upper-level compsci courses I took, the languages were treated as a tool, not a subject. We were given a one-day primer and the manuals, but it was a course on an advanced computing topic, not on a language. You could have taken it on yourself to learn C++; if you can get a compsci degree you can learn C++. Take a correspondence course, write and release a freeware program. Done.

I'm surprised it took you until the interviews to catch this. It should have come up earlier -- surely the career counselors would have caught it, or the work experience programs and internship placement offices, or even just talking with other students and graduates and profs.

My university had a couple of C++ electives I could take, including one by correspondence, usually worth 1 credit. Or you could take C++ separately from university. I took a C++ Windows programming course that looked at MFC and COM and the Windows event model from an academic point of view. You didn't need them to graduate, but everyone was encouraged to take them as they were "relevant" in the job market. The profs were all well aware of C++ in the world, and just thought it was a terrible teaching language for most subjects.

I'll be the first to admit I wish I had known more about what was going to be coming my way after graduation, but I don't think that should let them off the hook. I paid them to educate me so I could get a job. Suppose I buy a car from a dealership, but they don't put any oil in the engine (after all, they are selling you a car, not oil). You would call the buyer naive about cars if they ignored the engine light and tried to drive it as is, but that doesn't change the fact that the dealership is dropping the ball. And having them "suggest" that you take additional/supplemental courses to prepare you for the job market is just an admission that their curriculum is sub-par.

This again comes down to expectations. They don't better prepare you for post-graduation in their curriculum because they don't have to, and that detracts from the very point of higher education existing in the first place. My prediction is that if/when someone is able to change this system so that schools are held to a better standard of preparation, the overall quality of higher education will suddenly be much more valued in society, and that person will go down as the genius who revolutionized the outdated Western higher educational system.

Comment Re:Encountered this kind of thing ... (Score 1) 204

Work pays you because you understand calculus. You pay school to develop that understanding. School and Work are not the same.

You miss my point - school could do a better job preparing you for work.

Because my boss expects me to already understand what I'm doing. The entire reason you go to school is because you don't know, and you want to learn.

There are often times when I don't understand what I'm doing when I receive an assignment, and I have to go teach myself something new to get it done.

You are expected to make mistakes and learn from them.

At my job? No - I'm expected to learn whatever I need to get it right the first time. At school? I had several courses where the professor told us straight up front that he/she thought tests were stupid and out of touch with the real world, and that the majority of our grade would be homework and projects. I have to say I think I learned the most in those courses, and being expected to get it right on the homework was not a hard thing at all - it made you take it more seriously than assuming you could just bump up your grade on the final.

University is not grade school, my friend; "homework" is your "in-class quiz" and "practice problems", it's just not done in class, where it would be a complete waste of the limited time you have with the prof. I don't know about you, but I got 3 hours of class time a week. Depending on the course, I got another few hours of TA or supervised lab time. That class time was for the prof to explain the material and answer questions. It would have been idiotic to use that time to do practice problems on our own.

Well, I had both in-class quizzes and ones you submitted electronically before class. That made sure students read the material beforehand, so the lecture (which focused on the tougher parts of the material) wasn't wasted on the easy stuff. Seemed to work pretty well from my standpoint.

Why? In the real world your boss expects you to do Y, and assumes you already know the prerequisite A,B,C. School is where you learn your A,B,Cs.

Yes, and they can do a better job with teaching you about Y as well.

A university isn't a trade school.

And this is precisely the problem. It should be, in part. And if the professors don't want to or can't do it, then find better ones who will.

I didn't get a degree in computer science so that I could learn how to deploy Active Directory, how to properly configure Apache for security, how to program against a given library/API that's popular "in industry", or how to write R. You graduate with a degree in comp sci and you should know how computers work, how compilers work, how networks work, how programming languages work, what procedural, functional, and object-oriented mean; you'll know about recursion, concurrency and resource locking, semaphores and critical sections, atomic transactions; you'll know about AI and SQL; you'll know how algorithms work, how stacks work, how fundamental data structures work, how to compute performance characteristics, etc. I use a lot of this knowledge all the time, and even the stuff I don't use I'm aware of and can recognize its applicability when it comes up. So when I get asked, about an R program that runs in 5 minutes to an hour on inputs of n=3 or 4, how big can the inputs go... I have the tools to analyze it and say it's O(21^n), and you can get up to about n=6 on a PC, maybe 9 or 10 on a cloud platform for some $$$ (a real example, by the way, from a couple weeks ago). The algorithm was about as good as it could be, so we analyzed the problem itself and came up with a new way of defining the solution space that could be evaluated and searched in O(n^3) and give us the results we needed. And now we can go to n=20 and beyond on a laptop. That's what I went to university for.
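As an aside, the back-of-the-envelope extrapolation behind that kind of answer is easy to script. Here's a minimal sketch (my own made-up baseline figure and function name, not the actual analysis from that project), assuming the cost really does grow as O(21^n) and that n=4 takes roughly an hour:

    # Rough sketch: project runtimes under an assumed O(21^n) growth,
    # extrapolating from a single measured data point.
    def projected_seconds(base_n, base_seconds, target_n, branching=21):
        """Extrapolate runtime assuming cost scales as branching**n."""
        return base_seconds * branching ** (target_n - base_n)

    # Assumed baseline: n=4 takes about an hour (3600 s).
    for n in range(4, 11):
        hours = projected_seconds(4, 3600, n) / 3600
        print("n=%2d: ~%.3g hours" % (n, hours))

Under those assumptions each increment of n multiplies the runtime by 21, so the feasible range on any given machine is tightly bounded - which is exactly why reformulating the problem into something polynomial (the O(n^3) search mentioned above) was the real win.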

The stuff industry wants - best practices for coding Java interfaces, deep knowledge of whatever API they happen to use that week (.NET, Java, Qt, whatever), how to use Team Foundation Server or git or whatever they happen to use that week, and so on (to use programming as a base example) - is what you get from doing the job.

Perhaps there should be programming apprenticeships and IT apprenticeships in much the same way there are for plumbers and electricians, but plumbers and carpenters get PAID ON THE JOB to learn that stuff, and so should IT staff and software developers.

Industry wants everything for free. Frankly, if industry had its way, university graduates would have studied their employee orientation & policy, ISO procedures, and would already know where to park too.

Well, I got a degree in CS to get a job related to CS, period. I really didn't have any precise expectation as to what I was going to learn - I trusted that the school knew what I needed to know. I found out after I got out of school that there were a lot of things related to the computer industry that they didn't even mention once, let alone teach. And there were other things that they mentioned but gave you the distinct impression wouldn't matter if they glossed over them to get on with "more interesting" topics. Well, guess what - interview after interview, I kept running into the basic qualification questions. An example is "have you programmed in C++ before?" Sad to say, but somehow in all my coursework (undergrad and grad school), no. Other imperative programming languages, yes, and mostly functional languages like Scheme that no one in industry uses. I knew lots of other really cool, advanced CS stuff that their company could use, but unfortunately that's not a substitute for C++ experience (coursework would have been acceptable).

Could I have taught myself? Of course, but at that point it's too late - I need to get a job, not learn more. I'm not even talking about "flavor of the month" technology, I'm talking about basic industry knowledge that a) many companies want and b) doesn't change much or too fast. It's not beyond the school to include/emphasize it more; it's just that they don't want to, and they won't change because currently they aren't held accountable for the type of education they provide. Sure, it's great by their standards, but their standards aren't the ones people care about at the end of the day.

Comment Re:Encountered this kind of thing ... (Score 1) 204

I disagree.

Thanks to Wolfram Alpha, Stack Overflow, and the internet in general, homework is as meaningful or as meaningless as the student chooses to make it. Most professors assigned 5-10% of the grade to homework, and that was usually just for completion, not for correctness.

Homework is an opportunity to practice and develop an understanding of the material being taught.

These statements go against what I have experienced in the real world. The internet is the primary source of information for me, even with my textbooks sitting next to me on my desk. The ability to look things up and self-teach on the fly is much more useful than remembering every last thing that was taught to me in school. That's not to say there's no value - being able to do calculus and such by hand is necessary because you need to understand what's going on, but no one is going to pay me to just sit there and do calculus problems by hand all day. I use a calculator, even though I know how to do long division - it's faster and less error-prone. I'm not saying do away with testing altogether, but rather use the real world as the arbiter of where the emphasis on knowledge and competency should really be put, rather than some professor's pie-in-the-sky notions.

Also, you usually don't get "redos" on an assignment from your boss - it's your responsibility to identify when you don't understand something and to ask for help right away, not after he's "graded your work". Once it's on paper (or checked into a repository), it's expected that you've done what you can to get it right the first time. The professor can always hand out practice problems or in-class quizzes as a competency check, but the major evaluation should mirror what is desired in the real world.

"Projects" are also rife with problems; with much of the grade reducing to grading the student skills at project management itself. That's an important skill too, but not the one that should be getting tested. (Although to be fair, taking tests is also a skill unto itself, but "test taking" is easier to learn to do well than "project management")

And group projects have all kinds of other issues. In my experience, the only people who like group projects are the ones who treat them as free marks that some of their classmates will earn for them.

They don't have to be group projects, they can be individual ones.

I guess my overall comment here is that because professors have a Ph.D. in their discipline, they think they know what's best about how to teach. That's not true. My opinion is that everyone who wants to teach at a college or university should have some sort of education training (you know, from the Education Department) to dispel a lot of the silly, off-in-the-clouds notions that many professors consider good teaching practice - notions that people in the Education Department would be quick to squash.

Professors also think that because they are the "higher education pinnacle of society", they automatically know what the real world needs, and that the real world conforms to their ideas of what a person in that discipline should know to be useful and functional. Also not true. It's shocking how not true this is. There needs to be more of a vocational emphasis in higher education, and professors need to teach more of the things they currently turn their noses up at - the very things that industry constantly complains about with fresh graduates and ends up having to teach on the job.

Comment Re:Encountered this kind of thing ... (Score 1) 204

I understand the difficulty, but in the case of advanced math courses, many of them don't have a lot of students, which deflates the argument that those who would receive an F on a curve probably would have received an F anyway. IMO, the more advanced the course (especially in grad school), the less tests make sense as an evaluation at all. It's better to have projects and homework be the primary evaluator, since this is more in line with handling "real world" scenarios - which in theory is the point of giving out grades to begin with.

Comment Re:Encountered this kind of thing ... (Score 1) 204

I see your reasoning on how it factors out the variability among professors (which is a good thing), but an even better solution is to change the professors themselves - either by forcing a review of their teaching/evaluation methods, or by removing the professor altogether. The job market for academia has been hyper-saturated for quite some time now, so it should be rather easy to cycle through them until you find good ones.

Comment Re:Encountered this kind of thing ... (Score 1) 204

Bell curves "work" in academic settings because there's hardly any accountability imposed on tenured professors for how they evaluate students. It has been shown repeatedly that grades (as they stand now) are a poor predictor of success in the outside world, yet in practice this continues to be ignored in academia.
