
Comment Good stuff (Score 1) 248

This is brilliant. I hope my company adopts this as quickly as possible. I don't have time to read time-wasting work-related mail at my job. In case you missed it, it's the season, and I have my hands full doing online shopping and hunting down coupon codes. I already hardly have any time left to read the frickin' news sites. And I guess if you think your mail is so important, just put a request at the bottom to consider forwarding it to the next member of the department or project team, so each person who receives it can make a balanced decision about whether to bother the next person with your mail, which interferes with other priorities.

Comment Global warming, terrorism (Score 1) 242

threats of GW are 'similar to and in many cases greater than those posed by terrorist attacks.'

I get it. In the name of fighting terrorism, and now GW, the people are stripped of their civil rights, their liberties reduced and their taxes increased.

Just as with religion, GW and terrorism are presented as a great threat, and oppressive measures follow soon after.

This is a time in history to start paying attention.

Comment Re:Another Apple blunder (Score 2) 288

That you get 275,000 apps that work on the iPad mini.

Which ones: old, pre-Retina iPad apps, or newer Retina iPad apps that will look like shit if they work at all?

This device should have had a Retina resolution if it was to exist at all. Now it is just an enlarged iPhone. Only a few developers will target this device specifically. It will be a niche device. I can see it being used by waiters taking orders on a busy terrace. But as a media-consumption or internet front-end device it will fail. The screen is a huge step back compared to every other iOS device except the iPad 2. If Apple really believed in a device of this size, it would have had a Retina resolution.

Comment bad apple (Score 1) 288

The iPad mini is a device with a flawed screen. Everything will just seem bigger because of the lower ppi. A very, very bad design decision. Sure, it can run pre-Retina iPad apps. But what about Retina iPad apps? They will look like shit if they run at all.

Apple should have kept the Retina resolution. Sure, the display would be new in terms of pixel count, but at least the ppi would be more consistent and everything would appear comparable in size. Now, on the iPad mini, everything will seem larger while at the same time the display is smaller than an iPad's.

This will just not fly. Developers will be reluctant to adapt their apps for this device. Lack of apps will result in low sales. This device will FAIL.

Comment Intelligence and racism (Score 0) 213

We people are not created equal. There are obvious, undeniable visible differences between races, such as skin color and facial features. There are also undeniable physical differences between races. Dark-skinned people generally develop muscle tissue more easily and are stronger. The world record holder in the 100-meter sprint is, and likely always will be, a black person. Lighter-skinned people have a greater ability to abstract, invent and plan ahead, skills that contribute to a person's intelligence.

In this day and age of political correctness we do not wish to label an entire race as being "less intelligent" than another. It feels wrong to label dark-skinned people as "less intelligent" than whites. This desire to treat all men as equal with respect to intelligence is exactly the root of the problem. Intelligence is so highly regarded in our society that it has become the most important attribute by which we value a person. We see intelligence as a highly desirable property of a human being, and the lack of it is looked down upon. Saying that darker-skinned people are on average less intelligent than lighter-skinned people is synonymous with saying that dark-skinned people are inferior to light-skinned people. Obviously this is very wrong, and exactly this narrow view is what makes any research on the relation between race and intelligence very uncomfortable and controversial. Scientifically speaking, intelligence is just another heritable property, like traits such as height, hair color, eye color and of course skin color.

Is there an explanation for light-skinned people being on average more intelligent than dark-skinned people? Perhaps there is. To put it really simplistically: in the jungle, whenever you get hungry, you hunt down and kill an animal and you eat. To survive it is crucial to be fast, strong and agile, and as long as that is enough to survive, the ability to plan or invent is only a small advantage towards survival and creating more offspring. If, however, the environment becomes more challenging, for instance away from the tropics, there are seasons to deal with. Food is not as abundant. Planning ahead for food (for instance by farming) becomes a crucial advantage, as does the ability to design and create tools. Building proper shelter is more challenging, but when done well it again greatly increases the chances of survival. In general, further north where the environment is more challenging, the people who planned and invented will have created by far the most offspring. And for some reason skin color changed from dark to light on the path north, tying the difference in intelligence to a difference in traits, i.e., appearance.

Racism is not the problem, because there are races. Generalizing, however, is. Regarding any "white" person to be more intelligent, and hence superior, to any "black" person is a generalization and rightfully offensive and upsetting. There are plenty of stupid white people and intelligent black people around to disprove it.

Also, I haven't discussed Asians, who are in some respects more intelligent than Caucasians. I haven't discussed Jews; some of the greatest scientists ever to have lived were Jews or of Jewish descent. Perhaps Jews are capable of reaching the highest levels of abstract thinking.

But if we would turn back the clock 100,000 years and be back in the jungle, the ability to run away from a tiger and climb a tree might be a better asset than having an IQ of 105.

Comment clueless? (Score 1, Informative) 418

"I'm mainly a VB.NET person with skills from the .NET 2.0 era."

.NET 2.0 skills are implied. Taken literally, however, the statement does not confirm them.

Why this unclear statement? I will conveniently jump to conclusions and say: this person is a mediocre developer who has only done some VB.NET work and can't make the jump to .NET. It has nothing to do with age.

Comment Not a good programmer (Score 3, Informative) 767

When is someone a programmer? I wrote my first programs on a calculator. They were more like macros, actually. Was I a programmer? Of course not. Then I wrote my first BASIC program on a friend's Apple ][ in high school. Was I a programmer? Not really. Then I saved up all my money, got myself a C64, and wrote programs in BASIC, then in 6510 assembly. Was I a programmer? Well, perhaps, but I was only 15, so what did I know? A couple of years later I bought myself an Amiga 500 and wrote some stuff in 68000 assembly. While studying computer science, I learned a lot of useless programming languages, but also C. I wrote lots of programs in C. Then I started a small company, rented an office space where 10Mb ethernet sockets in the wall connected directly to the net for a low fee, and built and hosted web sites on an Intel 80486 running Linux. This was 1995. When I got my first job at an internationally operating start-up, I was busy configuring servers running NT, load balancers and firewalls, but I also did some SQL and coded some Cold Fusion for the company web site. My old trusty 486 served as DNS server. Was I a programmer? Nah, I did not really consider myself one.

The start-up went nowhere and I moved on. I did, and still do, enjoy programming tremendously; I sometimes still do it in my free time as a hobby. So I got a new job, and in this job I could program all day. I made long hours that did not feel like long days at all, as I was doing some very nice things, or at least that's what I thought. I was making enhancements to core parts of the software, and even got multithreading working for them, something they had not been able to do because of compiler bugs, which I also helped find. I was refactoring their code at high speed, because there was a lot of room for improvement, to put it politely. I often stared with disbelief and some amusement at the nonsensical functional designs handed to me. But worse, I started to clash with their main programmer, who had been there for a long time and did not like what he saw. Our manager did not extend my contract after a year. He did not like it either. I was using object-oriented techniques they were not used to; it was a "different paradigm" for them, as the manager put it.

This was disillusioning. Programmers at the time were hard to find, and I could not believe this was happening to me. Was this manager clueless? Probably. Was their main programmer pulling my leg? Perhaps. But I was sure I had done some very valuable things for them, and as a reward I was thrown out. Apparently, I had been unable to demonstrate my abilities sufficiently. That might have been either my shortcoming or theirs, but for me it did not matter. I decided to abandon programming, or rather, developing. I felt developing did not receive the respect it deserved. It was often looked down upon by management and was being outsourced to India. I decided to become a business analyst.

Life as a business analyst was a walk in the park compared to programming. I could now make designs at a higher level, but with my technical background, also talk to the guys who were going to implement them. I would never hand over a design that the developers would be unable to build. Also, the deadlines were less pressing. In the design-develop-test-release cycle, the time pressure lives mainly in develop and test. The testers would be the ones making extra hours when a release deadline had to be met.

I was a business analyst for a couple of years at several banks. They have large systems and a high rate of IT staff turnover. Generally at banks, knowledge is sparse, documentation often non-existent, and management not competent on a technical level. They do have enough money, though, so they just bring in loads of consultants. As a consultant, I benefited handsomely financially as well. My days as a programmer who got no love were soon forgotten just by looking at my bank account every now and then. I worked happily with the Indian vendor (Infosys), who created just horrible code but ultimately made sure it worked somehow. And if they didn't, it was not my problem. Bliss!

Currently I am moving back to a more technical role, but one that is very specialist and not so easily outsourced, and always in demand. This means I can keep on contracting, earn a lot of money, have some independence, and for the first time in years I get to do some technical stuff again that I liked. The best of both worlds.

Back on-topic. Can anyone become a programmer? Yes, or at least, anyone can call himself a programmer. I have seen people calling themselves "HTML programmers," which is embarrassing enough. I was writing non-trivial programs in assembly at the age of 15 but did not consider myself a programmer, or rather a developer, because it entails so much more. Developing efficiently requires a very good understanding of the bigger picture. To see the bigger picture, you have to know about data structures, design patterns, and database design. You have to know what is efficient and what is not. You always have to think ahead, because you don't want to paint yourself into a corner with your code. You have to know the libraries at your disposal so you don't reinvent the wheel. You must know and understand what has already been coded, which can be very difficult. You must know which techniques to use when, and know about middleware and how and when to use it. Then you have to keep up with new developments: new programming environments, new languages (Java, C#, updated versions of C++), new libraries (LINQ for C#), new domains (smartphone apps). Above all, you have to be able to think at a very high level of abstraction if you want to be really successful. That takes brains, a lot of brains, more brains than most people have.

Therefore I am of the opinion that although everyone can call himself a programmer, not everyone can really be one. But those who are should ask themselves what the point is of being one. The job is lowly regarded; the pay is often decent but nothing spectacular; and your job is always at risk of being outsourced to the lowest bidder by clueless management looking for cheaper ways. You will have to keep up with new developments to not lose your market value, and that takes time and energy. If you do a good job, do not expect any compliments, as it will often go completely unrecognized.

It is a sad state of affairs in developing. Good luck to you if you are a programmer. You will so need it.

Comment wrongly formulated (Score 4, Insightful) 147

This seems obvious to me, but bills like this should be formulated in terms of what they actually do, regardless of the technology used.

In this case, the bill should simply state that a warrant is required when someone's location is actively monitored within a certain precision for a certain time period.

Same with laws around cookies, a topic among lawmakers in some countries. Instead of targeting cookies, these laws should address the fact that a user is uniquely identified across sessions and/or websites. Cookies are just one way to achieve this, but there are others that do not even require cookies, such as an IP address combined with all sorts of data (user agent, OS, screen resolution, etc.) that make almost any user unique even without cookies.
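The point above can be made concrete with a minimal sketch of such cookieless tracking: combine a handful of passively observable browser attributes into a single stable identifier. The attribute names and hashing scheme here are illustrative assumptions, not how any particular tracker works; real fingerprinting uses many more signals (fonts, canvas rendering, timezone, plugins).

```python
import hashlib

def fingerprint(user_agent: str, os_name: str, resolution: str, ip: str) -> str:
    """Hash a few passively observable attributes into a stable identifier.

    No cookie is stored; the same browser on the same network yields
    the same identifier on every visit.
    """
    raw = "|".join([user_agent, os_name, resolution, ip])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# Two visits with identical attributes produce the same identifier:
visit1 = fingerprint("Mozilla/5.0 (X11; Linux)", "Linux", "1920x1080", "203.0.113.7")
visit2 = fingerprint("Mozilla/5.0 (X11; Linux)", "Linux", "1920x1080", "203.0.113.7")
assert visit1 == visit2
```

This is why a law worded around "cookies" misses the behavior it wants to regulate: the identification happens server-side from data every browser sends anyway.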

Comment Devolution (Score 1) 374

We're not evolving, we're devolving. Before contraception and abortion existed, successful males would reproduce at a much higher rate than now. Morality aside, it is not hard to imagine the successful alpha male impregnating lots of pretty (read: having good genes) girls and creating far more offspring than the less wanted males.

This has been a thing of the past for a couple of generations now. We are living in the genetically unhealthy situation where highly successful males produce only marginally more offspring than regular dudes. It is to be feared that even maintaining the quality of our genes requires alpha males to reproduce at a significantly higher rate. Now that this is no longer happening, the quality of our genes will only degrade, and quickly too: within a few hundred years we'll see the effects, whatever they will be. Most likely it will start with us getting dumber and more reliant on medical care.

Comment GW (Score 4, Insightful) 1181

Without taking a position whether or not global warming is caused by human activities:

- There is now a complete industry that exists by the grace of the belief that GW is man-made and that we can do something about it. That means business has an interest in governments and the public believing we should reduce CO2 emissions.
- Being a GW denier is silly. But try taking the position that GW is not entirely man-made, or that GW will not be damaging enough to justify billions in investments. You will be attacked almost the way blasphemers were attacked in the Middle Ages. You are a non-believer, and you should go along with the "common belief" and "consensus," what we all think. How dare you disagree? But science is not consensus-based. One experiment is all it takes to create new insights, models, theories.

I feel frustrated by governments taking GW as an excuse to raise taxes and increase their influence on everyone's personal life whenever they can. For instance, banning the light bulb: just how stupid is that?
