Comment Re:Pao = Sexist (Score 1) 892

I completely agree. I don't think it's so much that women lack the "core skill" of negotiation; women are generally perceived as good in social situations.
I think it is much more a choice of attitude. It is generally agreed that, overall, women are not as risk-taking as men. Negotiating pay involves risk - the risk of being turned down or of giving a bad impression of yourself, especially if the negotiation is done badly. Of course it also involves a benefit, namely higher pay. At any rate it's an extra effort compared to just blindly accepting the deal you get.
For this reason, I don't see anything wrong with people potentially getting higher pay by negotiating - be it men or women. It's a personal choice whether you want to run that risk or not. Given the nature of markets, there's nothing strange about individuals being able to earn more by taking on more risk or making an extra effort. In fact it seems unavoidable.
Given my experience with HR departments etc., I think it is pretty obvious that the "offer" Reddit is going to give is highly unlikely to match the individual's skills. IT is an area where higher skills can give extreme multiples in added productivity, even though this never shows up on the pay slip. So it is notoriously difficult to value someone based on average tables of years of education, experience etc. Often the true value of an employee can only be assessed some time after hiring. This is where negotiation comes in (and here I'm also talking about post-hiring salary adjustments), because it allows the higher-skilled workers to try to get some compensation for the greatly added productivity they bring, beyond what the "averages" indicate. If Reddit eliminates this aspect, they will either have to give up hiring from this pool (because their initial offers are only around average and everyone knows negotiation isn't possible) or they will have to give very high initial offers, meaning they will overpay a lot of people.
Of the two alternatives, the latter is the safer, since you can always get rid of those who turn out to be overpaid, whereas if the initial offer is too low, you are never going to see the top talent at all.
But aside from that - isn't Reddit just a trivial news website? It's hardly a technological marvel; any semi-skilled person could quickly code up something like that. I suspect it was done by only a few people initially. Then a few truly skilled people are required to scale it up to handle the high load, but that's it. 5-6 good developers ought to be enough. I don't know how many employees they have, but I wouldn't be surprised if it's in the thousands, given they have poor enough owners/management to hire Ellen Pao as CEO.

Comment Same old story (Score 3, Informative) 266

This story comes up every few years. They are all written off the same template: this new generation of tools will allow everyone to write their own apps, you don't need professional programmers, and here's an example of an 8-year-old who made an app in two minutes, etc. These stories are written by people who don't have a clue what the working day of a programmer looks like, and who imagine someone sitting isolated at his desk typing in code all day.
The programmer's job is to translate the requirements from other humans into requirements that a machine can understand. For very well-defined domains of applications, it is indeed possible for non-programmers to use tools that achieve the same thing. That was the case with VB 1.0 or its predecessors. I don't think the scope of applications that can be written in this way has broadened much in the decades since. Applications that are just a dialog with very simple logic can be written by drawing the dialog and by copy-pasting and adapting small statements, etc.
Beyond that there's not been much progress in "auto-writing" applications. The reason is that current programming languages are actually fairly efficient and precise ways of explaining to a computer what it should do. The job of the programmer is to translate the "business" requirements into that specification.
Even for a fairly trivial stand-alone program, computers can't do this today. And even if that were to happen, consider that much of the programmer's time is not spent writing code within some confined, well-defined environment, just producing line after line. Rather, it is spent understanding the customer-specific requirements, realizing when they are impossible or inconsistent and working out a solution together with the customer, integrating multiple systems (and many different levels within those systems) using completely different interfaces and paradigms, and figuring out how it is all going to work together.
My experience is that most automatic code-writing and analysis tools stop working when you add this layer of complexity. They may work well for a completely stand-alone Java application with a main() method, but how do they work when you have client code, native code, server code, 10 different frameworks, 3rd-party libraries (some written in other languages), external web services, legacy systems, end-users who think about things differently from the internal data models, implicit contracts/conventions, leaky abstractions, unspecified time-out conditions, and workarounds required due to bugs in other systems and differences in end-user setups? The complexity and the bugs here are always resolved by human programmers, and I suspect it will stay that way for a long, long time, even if computers manage to write stand-alone programs themselves (which IMHO will be the point where we have something approaching general AI). While you can take individual aspects of these problems and formalize them, the number of combinations and the diversity of scenarios is so big that it becomes impractical.
If you have competent programmers, most of the bugs are going to be in these areas, since it is also easy for a competent human programmer to write a stand-alone program that works well on one machine.

Comment Re:Where are those recent AI progress deployed? (Score 1) 688

I was talking about AI, not IT in general. But even then - in 1994 I wouldn't have crapped out over seeing a 2014 smartphone. Yes, it would be much nicer than what was available then, but by no means surprising. There were already color-screen PDAs where you could install apps, and I believe some also had phone capability. They were expensive, clunky, slow, and due to lack of adoption there weren't that many apps. Clearly it was immature at that point, but it would by no means have been surprising that it would become much better over time (Moore's law and all).

Comment Re:They're a resource, not a "problem". (Score 2) 307

Honestly, if you are a "hotshot" and you are sitting next to someone who can barely type a word per minute, and you have to wait 5-10 minutes for just a few lines of code to be written, that is pretty frustrating and not of benefit to anyone. Sometimes when a colleague asks for my help and is slow at typing etc., I just politely ask if I can type, and they always agree, and I can do it 5-10x faster than they can, saving time for both of us. Then they have more time to study the solution at their own pace. That has nothing to do with arrogance, just making the best use of the time. Of course some politeness and social skill is required!

Comment Is there enough student material? (Score 3, Interesting) 307

Having a CS degree and 10+ years of professional experience in industry, it is clear to me that a significant number of those taking CS (or related IT/programming-oriented programs) don't really have the qualifications for a CS career; even after years of employment they are still struggling with rather basic programming tasks and have problems handling even a few levels of abstraction, which is routinely required in any serious programming. Some of the skills required seem to be an "either you have it or you don't" thing, at least a few years into a career. The saving grace for them is the good job market (for employees) and the ability to move into more management- or PM-oriented roles, or at least very soft CS roles. That, and the fact that many employers are not able (or make no effort) to truly compare the productivity of different employees, so the weaker ones are somewhat shielded by the performance of the stronger ones.
With this in mind, the big ramp-up in the number of CS-trained individuals is concerning. I feel we have been scraping the bottom of the barrel for some years already. Given that it has been well known for many years that IT is one of the easiest areas to find employment in and that the salary is comparatively good, and given the constant media focus on smartphones, apps and whatnot, it seems reasonable to assume that most people with even a faint interest and ability in IT have already pursued that path. With this ramp-up, there's a high risk that the market will be flooded with sub-par candidates, far beyond what the market is already absorbing. The result will be massive unemployment among newly trained CS people who were never suited to study CS to begin with.

Comment Men are more creative (Score 1) 579

While a lot of people would like to think the reverse is true, I'm 100% convinced, based on 32 years of experience, that men are plainly more creative, in practically any area. That doesn't mean that all men are creative, or that all women are not. Some women are definitely more creative than some men. But by and large my observation holds. Men are much more likely than women to be geeks about things and to spend hours improving small details or learning everything about a subject. Women are only likely to do that if there's a distinct benefit to themselves or their children. Women generally do things to obtain some benefit (put something on their CV, brag to their boss etc.) and rarely for the thing itself, which is what I consider the 'geeky' aspect of the way men get interested in things. Since creativity tends to be fostered by spending a lot of time on a subject, men are far more likely to actually produce something and be creative.
So in short, women don't want to contribute to Wikipedia because the personal gains to them are small. Men do it for the fun of it, for the creative process.

Comment Missing info (Score 1) 177

It would be useful to know how many of the court's decisions are affirmations vs reversals. If 70% are affirmations, it's not impressive to correctly predict 70% of decisions, since you can just always guess "affirm". But if the classes are equally distributed (50% affirm, 50% reverse), getting 70% correct shows the algorithm has some predictive power (assuming it is trained on a different dataset than the one used to evaluate it). It is impossible to determine which is the case from the information in the article.
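To make the point concrete, here is a minimal Python sketch (with made-up numbers, since the article doesn't give the actual affirm/reverse split) of how the majority-class baseline determines whether 70% accuracy is impressive:

    # Hypothetical illustration only - the real affirm/reverse distribution is not in the article.
    from collections import Counter

    def majority_baseline(labels):
        """Accuracy obtained by always guessing the most common outcome."""
        top_count = Counter(labels).most_common(1)[0][1]
        return top_count / len(labels)

    # If 70% of decisions are "affirm", a constant guess already scores 70%:
    skewed = ["affirm"] * 70 + ["reverse"] * 30
    print(majority_baseline(skewed))    # 0.70 -> a model at 70% adds nothing

    # With a 50/50 split, 70% correct would actually demonstrate predictive power:
    balanced = ["affirm"] * 50 + ["reverse"] * 50
    print(majority_baseline(balanced))  # 0.50 -> 70% correct clearly beats the baseline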

Comment Re:He just described how MPEG works (sort of) (Score 1) 90

I don't think any of what you wrote goes against what I wrote. I make the exact same distinction: some components of MPEG movie compression (and my post applies to MPEG-2 too, btw) are lossy in a way that is compensated for by other steps, but yet other steps again make it lossy.

Comment He just described how MPEG works (sort of) (Score 1) 90

He came up with the idea of using lossy compression techniques to compress the original file, then calculating the difference between the approximate reconstruction and the original file and compressing that data; by combining the two pieces, you have a lossless compressor.
This type of approach can work, Misra said, but, in most cases, would not be more efficient than a standard lossless compression algorithm, because coding the error usually isn’t any easier than coding the data.

Well, this is almost how MPEG movie compression works - and it really does work! MPEG works by partly describing the next picture from the previous one using motion vectors. These vectors describe how the next picture will look based on movements of small-ish macroblocks of the original picture. Now, if that were the only element of the algorithm, movies would look kind of strange (like paper-doll characters being moved about)! So the secret to making it work is to send extra information allowing the client to calculate the final picture. This is done by subtracting the predicted picture from the actual next frame. This difference picture will (if the difference between the two frames was indeed mostly due to movement) be mostly black, but it will contain some information due to noise, things that have become visible due to the movement, rotation etc. So MPEG also contains an algorithm that can very efficiently encode this "difference picture" - basically an algorithm that is very efficient at encoding an almost-black picture.

So there you have it - MPEG works by applying a very crude, lossy compression (only describing the picture difference in terms of motion vectors) and in addition transmitting the information required to correct the errors and allow reconstruction of the actual frame.

The only part where the comparison breaks down is that MPEG is not lossless. Even when transmitting the difference picture, further compression and reduction takes place (depending on bandwidth), so the actual frame cannot be reconstructed 100%. Still, MPEG is evidence that the basic principle - a crude lossy compressor combined with a transmitted correction - works.
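As a rough sketch of the principle (not real MPEG - the block size, the clipping and the plain NumPy types here are my own simplifications), the encoder builds a motion-compensated prediction, subtracts it from the real frame to get the mostly-black difference picture, and the decoder adds it back:

    import numpy as np

    BLOCK = 16  # macroblock size used in this toy example

    def predict_from_motion(prev_frame, motion_vectors):
        """Crude motion compensation: fill each 16x16 block of the prediction
        with the block of the previous frame pointed to by its motion vector."""
        h, w = prev_frame.shape
        pred = np.zeros_like(prev_frame)
        for by in range(0, h, BLOCK):
            for bx in range(0, w, BLOCK):
                dy, dx = motion_vectors[by // BLOCK][bx // BLOCK]
                sy = int(np.clip(by + dy, 0, h - BLOCK))
                sx = int(np.clip(bx + dx, 0, w - BLOCK))
                pred[by:by + BLOCK, bx:bx + BLOCK] = prev_frame[sy:sy + BLOCK, sx:sx + BLOCK]
        return pred

    def encode(prev_frame, next_frame, motion_vectors):
        pred = predict_from_motion(prev_frame, motion_vectors)
        # The "difference picture": mostly zeros if the motion vectors were good.
        residual = next_frame.astype(np.int16) - pred.astype(np.int16)
        return residual  # real MPEG transforms and quantizes this, which is where the loss comes in

    def decode(prev_frame, motion_vectors, residual):
        pred = predict_from_motion(prev_frame, motion_vectors)
        return np.clip(pred.astype(np.int16) + residual, 0, 255).astype(np.uint8)

If the residual is sent exactly as computed, decode() reproduces the frame perfectly - the lossless combination described in the article; MPEG instead quantizes the residual, which is the lossy step mentioned above.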

Comment Potentially carcinogenic (Score 1) 190

Paracetamol/acetaminophen is potentially carcinogenic. It has been shown to inhibit DNA repair (see e.g. http://www.ncbi.nlm.nih.gov/pubmed/7715617 or Google yourself) and has been associated with an increase in various cancers, for instance blood cancers which are often associated with factors that cause DNA damage (see e.g. http://www.ncbi.nlm.nih.gov/pubmed/21555699).
It has also been associated with other cancers. In contrast, the alternative painkiller aspirin has been associated with a decreased risk of many different types of cancer, notably colon cancer but also blood cancers and many others (too lazy to find all the references). In animal studies, it can protect animals exposed to other carcinogens from developing cancer.
Ibuprofen seems to be mixed: in some studies it is associated with an increased cancer risk, in others with a decreased risk.
By the way, note that aspirin is not a magic bullet against cancer; it is also a blood thinner. People who take it chronically are prone to getting bruises from the slightest hits. This also means it increases the risk of potentially fatal bleeds, as well as certain types of strokes (intracranial haemorrhage). On the other hand it decreases the risk of blood clots. The net effect is difficult to assess.

Comment What about his microwave? (Score 2) 278

Where has the microwave gone? I'm not aware of any technological developments since 2005 that have "converged" the microwave into any other device. Also, none of the devices he displays in 2013 seems able to displace a microwave (unless there's some new app I'm not aware of). Hence, we must conclude that this article merely represents the lifestyle choices made by this particular person, with no relevance to the rest of the population.

Comment The Danish perspective (Score 1) 370

I can add a Danish perspective. In the last 10-15 years, the number of people accepted into the PhD program has ramped up by a factor of 2-3x from an already high level. In some programs, up to one third of graduates are accepted into the PhD program. The PhD program is free to the candidate and in fact pays a decent, though by no means high, salary. But to someone who used to live on a third of that, it's a good salary.

The problem is that it used to be that PhD students were the best students of their year, at least in the natural sciences. I remember that when I studied, only one PhD student was accepted every other year, and he or she would always be someone who was very clearly (after a 2-minute conversation) smarter than the rest of the students. Of course that person would also have top grades - and that was when top grades were more scarce than is the case today.

Today it's not like that. Everyone from the top students down to "Joe Average" (i.e. someone with half a brain) among the graduate students usually gets offered a PhD. In fact, many programs have more PhD positions than they can fill from the pool of candidate students. There's no prestige in getting a PhD anymore. And since the financial crisis, it's in some ways the opposite, in that the PhD is an obvious path to take for someone who couldn't get a better-paying job in the private sector.

This state of affairs is extremely expensive to society. It's expensive to provide an extra three years of education and salary to people of just average intelligence. Also, the science that comes out of this is mostly bogus - i.e. not really innovative, just buying a lot of fancy equipment, applying what others have already found out, and publishing it with a twist. There have already been some articles on how little the science produced in academia contributes to the economy, and there have also been a few news stories about how "too many PhDs" has lowered quality. But it still seems most politicians are in the uncritical "more education" mode... so I suspect it will be a few years before anything substantial happens in this area.
