Submission + - NSA Phone Program Likely Unconstitutional, Federal Judge Rules (huffingtonpost.com) 3

schwit1 writes: A federal judge ruled Monday that the National Security Agency's phone surveillance program is likely unconstitutional, Politico reports.

U.S. District Court Judge Richard Leon said that the agency's controversial program, first unveiled by former government contractor Edward Snowden earlier this year, appears to violate the Constitution's Fourth Amendment, which states that the "right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated."

“I cannot imagine a more ‘indiscriminate’ and ‘arbitrary invasion’ than this systematic and high-tech collection and retention of personal data on virtually every single citizen for purposes of querying it and analyzing it without judicial approval,” Leon wrote in the ruling.

The federal ruling came down after activist Larry Klayman filed a lawsuit in June over the program. The suit claimed that the NSA's surveillance “violates the U.S. Constitution and also federal laws, including, but not limited to, the outrageous breach of privacy, freedom of speech, freedom of association, and the due process rights of American citizens."

Comment Re:Encountered this kind of thing ... (Score 1) 204

It could, but the question is should it.

Your complaint about C++ is like a CGA not knowing how to use Simply Accounting. You got a degree in computing science, not C++ programming. Yes, C++ programming is useful, and yes, you might have wanted to take a course in that, but it's not related to the degree.

And who says a CS degree has to be defined this way? My point is that if we re-define it to include more on-the-job elements, the overall value of the education will go up. Any time you have an organization (not just in higher education) be the primary evaluator of itself, over time you will get policies and groupthink that are more and more out of touch with reality.

True, but industry wants the other extreme, and nothing will satisfy them. Bottom line: if they want an employee who can do X, they can train that employee. That's how it works in other industries.

I think there's an easy middle ground to shoot for here. And note that industry is the one providing the jobs, not the school, so they should intrinsically be in the driver's seat more than the school is.

In the upper level compsci courses I took, the languages were treated as a tool, not a subject. We were given a one-day primer and the manuals, but it was a course on "advanced computing topic", not "language". You could have taken it on yourself to learn C++; if you can get a compsci degree you can learn C++. Take a correspondence course, write and release a freeware program. Done.

I'm surprised it took you until the interviews to catch this out. This should have come up earlier -- surely the career counselors would have caught it, or the work experience programs and internship placement stuff, or even just talking with other students and graduates and profs.

My university had a couple of C++ electives I could take, including one by correspondence, usually worth 1 credit. Or you could take C++ separately from university. I took a C++ Windows programming course that looked at MFC and COM and the Windows event model from an academic point of view. You didn't need them to graduate, but everyone was encouraged to take them as they were "relevant" in the job market. The profs were all well aware of C++ in the world, and just thought it was a terrible teaching language for most subjects.

I'll be the first to admit I wish I had known more about what was going to be coming my way after graduation, but I don't think that should let them off the hook. I paid them to educate me so I could get a job. Suppose I go buy a car from a dealership, but they don't put any oil in the engine (after all, they are selling you a car, not oil). You would call the person naive about cars if they ignored the engine light and tried to drive it as is, but that doesn't change the fact that the dealership is dropping the ball. And having them "suggest" that you take additional/supplemental courses to prepare you for the job market is just an admission that their curriculum is sub-par. This again comes down to expectations. They don't better prepare you for post-graduation in their curriculum because they don't have to, and that detracts from the very point of higher education existing in the first place. My prediction is that if/when someone is able to change this system so that schools are held to a better standard of preparation, higher education will suddenly be much more valued in society, and that person will go down as the genius who revolutionized the outdated western higher educational system.

Comment Re:Encountered this kind of thing ... (Score 1) 204

Work pays you because you understand calculus. You pay school to develop that understanding. School and Work are not the same.

You miss my point - school could do a better job preparing you for work.

Because my boss expects me to already understand what I'm doing. The entire reason you go to school is because you don't know, and you want to learn.

There are often times when I don't understand what I'm doing when I receive an assignment, and I have to go teach myself something new to get it done.

You are expected to make mistakes and learn from them.

At my job? No - I'm expected to learn whatever I need to get it right the first time. At school? I had several courses where the professor told us straight up front that he/she thought tests were stupid and out of touch with the real world, and that the majority of our grade would be homework and projects. I have to say I think I learned the most in those courses, and being expected to get the homework right was not a hard thing at all - it made you take it more seriously than just assuming you would bump up your grade at the final.

University is not grade school, my friend; "homework" is your "in-class quiz" and "practice problems", it's just not done in class, where it would be a complete waste of the limited time you have with the prof. I don't know about you, but I got 3 hours of class time a week. Depending on the course, I got another few hours of TA or supervised lab time. That class time was for the prof to explain the material and answer questions. It would have been idiotic to use that time to do practice problems on our own.

Well, I had both in-class ones and ones you submitted electronically before class. It made sure the students read the material before class, and the lecture (which focused on the tougher parts of the material) wasn't wasted on the easy stuff. Seemed to work pretty well from my standpoint.

Why? In the real world your boss expects you to do Y, and assumes you already know the prerequisite A,B,C. School is where you learn your A,B,Cs.

Yes, and they can do a better job with teaching you about Y as well.

University isn't a trade school.

And this is precisely the problem. It should be, in part. And if the professors don't want to or can't do it, then find better ones who will.

I didn't get a degree in computer science so that I could learn how to deploy Active Directory, how to properly configure Apache for security, how to program against a given library/API that's popular "in industry", or how to write R. You graduate with a degree in comp sci and you should know how computers work, how compilers work, how networks work, how programming languages work, and what procedural, functional, and object oriented mean; you'll know about recursion, concurrency and resource locking, semaphores and critical sections, atomic transactions; you'll know about AI and SQL; you'll know how algorithms work, how stacks work, how fundamental data structures work, how to compute performance characteristics, etc. I use a lot of this knowledge all the time, and even the stuff I don't use I'm aware of and can recognize its applicability when it comes up. So when I get asked, "this R program takes 5 minutes to an hour on inputs of n=3 or 4; how big can we go?", I have the tools to analyze it, say it's O(21^n), and tell them they can get to about n=6 on a PC, maybe 9 or 10 on a cloud platform for some $$$ (a real example, by the way, from a couple weeks ago). The algorithm was about as good as it could be, so we analyzed the problem itself and came up with a new way of defining the solution space that could be evaluated and searched in O(n^3) and still give us the results we needed. And now we can go to n=20 and beyond on a laptop. That's what I went to university for.
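For what it's worth, here's a minimal sketch of the kind of back-of-envelope estimate I mean, in Python. The 5-minute baseline at n=4, the one-week budget, and the 1000x cloud multiplier are numbers I'm making up for illustration, not the actual figures from that project:

import math

# Hypothetical baseline: the O(21^n) job takes about 5 minutes at n = 4.
# Question: how large can n get within a given compute budget?

def max_feasible_n(t_base_min, n_base, budget_min, branching=21):
    """Largest n with t_base_min * branching**(n - n_base) <= budget_min."""
    return n_base + int(math.log(budget_min / t_base_min, branching))

if __name__ == "__main__":
    t_base, n_base = 5.0, 4            # assumed: 5 minutes at n = 4
    one_week = 7 * 24 * 60             # one week of minutes on a single PC
    cloud = one_week * 1000            # hypothetical 1000x cloud budget
    print("PC, one week: n <=", max_feasible_n(t_base, n_base, one_week))
    print("Cloud, 1000x: n <=", max_feasible_n(t_base, n_base, cloud))
    # Contrast with the reworked O(n^3) formulation: going from n = 4 to
    # n = 20 only multiplies the work by (20/4)**3 = 125 (constant factors
    # aside), which is why a laptop can suddenly handle n = 20 and beyond.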

The stuff industry wants, best practices for coding Java interfaces, deep knowledge of whatever API they happen to use that week (.NET, Java, Qt, whatever), how to use Team Foundation Server or git or whatever they happen to use that week, or whatever else (to use programming as a base example), is what you get from doing the job.

Perhaps there should be programming apprenticeships and IT apprenticeships in much the same way there are for plumbers and electricians, but plumbers and carpenters get PAID ON THE JOB to learn that stuff, and so should IT and software developers.

Industry wants everything for free. Frankly, if industry had its way, university graduates would already have studied their employee orientation, policies, and ISO procedures, and would already know where to park too.

Well, I got a degree in CS to get a job related to CS, period. I really didn't have any precise expectation of what I was going to learn; I trusted the school to know what I needed to know. I found out after I got out of school that there were a lot of things related to the computer industry that they didn't even mention once, let alone teach. And there were other things that they mentioned but gave you the distinct impression it wouldn't matter if they glossed over them to get on with "more interesting" topics. Well, guess what: interview after interview I kept running into the basic qualification questions. An example is "have you programmed in C++ before?" Sad to say, but somehow in all my coursework (undergrad and grad school), no. Other imperative programming languages, yes, and mostly functional languages like Scheme that no one in industry uses. I knew lots of other really cool, advanced CS stuff that their company could use, but unfortunately that's not a substitute for C++ experience (coursework would have been acceptable). Could I have taught myself? Of course, but at that point it's too late. I need to get a job, not learn more. I'm not even talking about "flavor of the month" technology, I'm talking about basic industry knowledge that a) many companies want and b) doesn't change much or too fast. It's not beyond the school to include or emphasize it more; it's just that they don't want to, and they won't change because currently they aren't held accountable for the type of education they provide. Sure, it's great by their standards, but their standards aren't the ones people care about at the end of the day.

Comment Re:Encountered this kind of thing ... (Score 1) 204

I disagree.

Thanks to Wolfram Alpha, Stack Overflow, and the internet in general, homework is as meaningful or as meaningless as the student chooses to make it. Most professors assigned 5-10% of the grade to homework, and that was usually just for completion, not for correctness.

Homework is an opportunity to practice and develop an understanding of the material being taught.

These statements go against what I have experienced in the real world. The internet and such is the primary source of information for me, even with my textbooks sitting next to me on my desk. The ability to look things up and self-teach on the fly is much more useful than remembering every last thing that was taught to me in school. That's not to say there's no value - being able to do calculus and such by hand is necessary because you need to understand what's going on, but no one is going to pay me to just sit there and do calculus problems by hand all day. I use a calculator, even though I know how to do long division - it's faster and less error prone. I'm not saying do away with testing altogether, but rather use the real world to decide where the emphasis on knowledge and competency should really be put, rather than some professor's pie-in-the-sky notions.

Also, you usually don't get "redos" on an assignment from your boss - it's your responsibility to identify if you don't understand something and to ask for help right away, not after he's "graded your work". Once it's on paper (or checked into a repository), it's expected that you've done what you can to get it right the first time. And the professor can always hand out practice problems or in-class quizzes for the competency check, but the major evaluation should mirror what is desired in the real world.

"Projects" are also rife with problems; with much of the grade reducing to grading the student skills at project management itself. That's an important skill too, but not the one that should be getting tested. (Although to be fair, taking tests is also a skill unto itself, but "test taking" is easier to learn to do well than "project management")

And group projects have all kinds of other issues. In my experience, the only people who like group projects are the people who treat them as free marks that some of their classmates will earn for them.

They don't have to be group projects, they can be individual ones.

I guess my overall comment here is that because professors have a Ph.D. in their discipline, they think that means they know what's best when it comes to teaching. That's not true. My opinion is that everyone who wants to teach at a college or university should have some sort of education training (you know, from the Education Department) to dispel a lot of the silly, off-in-the-clouds notions that many professors think are good teaching practices, which people in the Education Department would be quick to squash. They also think that because they are the "higher education pinnacle of society" they automatically know what the real world needs, and that the real world conforms to their ideas of what a person in that discipline should know to be useful and functional. Also not true. It's shocking how not true this is. There needs to be more of a vocational emphasis in higher education, and professors need to teach more of the things that they currently turn their noses up at, which are the very things that industry constantly complains about with fresh graduates and ends up having to teach itself on the job.

Comment Re:Encountered this kind of thing ... (Score 1) 204

I understand the difficulty, but in the case of advanced math courses, many of them don't have a lot of students, which deflates the argument that those who would receive an F on a curve probably would have received an F anyway. IMO, the more advanced the course (especially in grad school), the less tests make sense for evaluation at all. It's better to have projects and homework be the primary evaluators, since that's more in line with handling "real world" scenarios - which in theory is the point of giving out grades to begin with.

Comment Re:Encountered this kind of thing ... (Score 1) 204

I see your reasoning on how it factors out the variability among professors (which is a good thing), but an even better solution is to change the professors themselves - either by forcing a review of their teaching and evaluation methods, or by removing the professor altogether. The job market for academia has been hyper-saturated for quite some time now, so it should be rather easy to cycle through them until you find good ones.

Comment Re:Encountered this kind of thing ... (Score 1) 204

Bell curves "work" in academic settings because there's hardly any accountability imposed upon tenured professors for how they evaluate students. It's continually shown how grades (as of right now) are a poor predictor of success in the outside world, yet this continues to be ignored in the practical sense in academia.

Comment Re:I'm confused (Score 2) 172

One problem is that poaching encourages a dichotomous working class system. Poaching is good for employees who have experience, since their wage will go up because companies fight over them, but it's bad for potential employees fresh out of school. No employer wants to be the one to front the capital to train them. They would just rather poach someone that will be effective on day 1.

Comment Re:Young Earth Creationism Considered Harmful (Score 1) 1293

I would say the fact that your morality has been honed over millions of years by natural selection is a pretty good reason to listen to it. Just like how using your lungs to breathe is a pretty good idea. And just like the conscious decision to use your biologically evolved lungs to breathe, using your morality to make decisions is not even really much of a choice. The guilt of acting against your own moral code is a very compelling force.

Yes but the fact is that many people act according to opposing belief sets. Who's to say what's really right then, if it's based purely on subjective belief? Very compelling, yes, but that does not mean that it couldn't just be done away with, as some people do. And then who's to say that they are wrong for doing so?

I don't think morality needs to be immutable to be morality. Furthermore, what could be less immutable than a morality controlled by an omnipotent dictator who could change his mind about what is moral any time he wants? It seems like new revelations change morality pretty frequently.

In fact I think the naturalistic evolutionary explanation of morality is far more objective than the "it comes from God" view. For one thing, it actually has reasons for being the way it is, rather than the non-reason of "it's the way God wanted it".

How is that a non-reason? That either assumes that God doesn't really exist or that God really isn't God (i.e., something else takes higher priority). I'm sorry, but a changing morality is inherently self-contradictory. Otherwise, you have no business calling the worst crimes in humanity wrong as long as the perpetrators say "well, it's right according to me".

It does not follow the scientific method. It is the starting point of the scientific method. Denying the axiom is unscientific because it denies an axiom of science. There is no point in believing in scientific method if you don't believe what it is based on.

The fruit gained from the scientific method does not require atheism. It still provides insight and the same results within the context of natural phenomena with that assumption in place. I see no compelling reason to think otherwise barring just accepting an authoritarian stance that atheism must come along for the ride.

My gripe here is that the labeling is an authoritative attempt to avoid scrutiny. It's the flip side of what atheists often say against Christians when they inappropriately take for granted that everyone agrees upon the authority of the Bible in a debate (which obviously is not the case).

This is no different than any axiom. If you could scrutinize an axiom and prove it to be true or false, it would not be an axiom. Believing in the Bible is an axiom like believing in a universe governed by physical laws is an axiom. They are no different qualitatively, so I agree with you there. My point is not to show that science is right and religion is wrong. My point is to show that these are incompatible viewpoints.

Disagree. You don't buy into an axiom without scrutinizing it very thoroughly. You just don't scrutinize it deductively. Otherwise it really is FSM, or whatever you like.

Why is my definition of a spirit any different? What difference does it make if the spirit is made out of atoms if the results are functionally the same? Even if you believed in God, why is it so hard to imagine that the physical universe God created is incapable of housing a soul made out of atoms?

My point is the difference between using the word "spirit" as a euphemism and it being a real entity apart from the physical body.

There are many different definitions and conceptions of what free will is. Some are completely incoherent in any reality. Some are perfectly consistent with determinism (i.e. compatibilism).

You are right to say that if I assume a purely physical universe, I don't need an experiment to confirm this assumption. But these experiments were still very shocking to people. I suspect many of those people were trying to hold a dualist view of mind and body, and this experiment challenged that. Not being a dualist myself, I was not all that shocked by the result.

Well, I would say to them to examine the assumptions baked into the conclusions, and to separate those from the evidence. I see no reason to be shocked, personally - it would seem obvious that the decision-making process itself involves, well, the brain.

Where do assumptions ever come from? If they are based on evidence, then they are not assumptions at all; they are tentative logical conclusions.

I don't think we get to decide what assumptions we believe in. I think we naturally find some assumptions compelling enough to believe on faith and others not, and it's as simple as that. Sometimes these assumptions can change over time like when people lose their faith or have a religious experience that gives them faith.

I also think your assumptions depend a lot on how you were raised. For example, people in the United States are more likely to believe in the Bible than in the Quran or Hindu scriptures. This is clearly not a coincidence.

Indeed not, but that's getting into the dynamics of human decision-making and influence, which is a whole separate discussion. I guess I overall agree with what you said, except that I do believe there is a free-will element of "choice" to it - different people decide all sorts of things regarding religion, including that it is nothing worth considering.
