
Comment Re:Open.... (Score 1) 286

open system


(Computer Science) an operating system that is not specific to a particular supplier, but conforms to more widely compatible standards

Bingo! - The key word in that definition is 'compatible' - which is not the case when you're talking about Microsoft Windows.

More on 'Open Systems' can be found here:

The definition of "open system" can be said to have become more formalized in the 1990s with the emergence of independently administered software standards such as The Open Group's Single UNIX Specification.

Although computer users today are used to a high degree of both hardware and software interoperability, in the 20th century the open systems concept could be promoted by Unix vendors as a significant differentiator. IBM and other companies resisted the trend for decades, exemplified by a now-famous warning in 1991 by an IBM account executive that one should be "careful about getting locked into open systems".

However, in the first part of the 21st century many of these same legacy system vendors, particularly IBM and Hewlett-Packard, began to adopt Linux as part of their overall sales strategy, with "open source" marketed as trumping "open system". Consequently, an IBM mainframe with Linux on z Systems is marketed as being more of an open system than commodity computers using closed-source Microsoft Windows—or even those using Unix, despite its open systems heritage. In response, more companies are opening the source code to their products, with a notable example being Sun Microsystems and their creation of the OpenOffice.org and OpenSolaris projects, based on their formerly closed-source StarOffice and Solaris software products.

Comment There is little reason to sink (Score 1) 904

The comments here fall into two primary all-or-nothing buckets that seem to be on opposite ends of the political spectrum. Yet when you look closely, it is plain to see that both sides are really talking about the same thing: fear of the unknown resulting from change.

This fear arises because we don't take the time to actually use our minds to think critically from all points of view. Fear paralyzes us - and we take the easy way out - resorting to regurgitating dogma from sources that we identify with our own world-view. We do ourselves and the people around us a disservice when we substitute dogma for thought.

Here is a simple rule to live by - one that can help you determine whether your dogma is in the best interests of everyone. The Golden Rule, or law of reciprocity, is the principle of treating others as one would wish to be treated oneself. It is a maxim of altruism seen in many human religions and cultures the world over. Now put yourself in the shoes of the people you are considering in the discussion and, assuming it is you who has to live with the outcome, apply the dogma/position that you align with.

Now, after doing that thought exercise, if you can honestly say that your position/dogma will not adversely impact others, then it is worthy of consideration. If you cannot, then you need to think about a new dogma.

Comment Re:Patriarchal Society gets a 'Come-up-ins'... (Score 1) 566

The OP - which is me - didn't say that competence was tied to gender, only that gender is not fairly represented.

Given how screwed up the system is today under the last 800,000 years of male rule, my implication is that having women in charge could certainly not be any worse.

Comment Repetitive/Common vs. Unique/In Demand (Score 1) 813

If you expect to go into the office every day, do the same thing over and over, and get paid a good wage for it in IT - those days are over. Repetitive, common tasks are being outsourced because - rightly or wrongly - they are perceived as not requiring significant skills, and they are targets for automation.

On the other hand - if you are smart, you will look for unique opportunities and skills that are in demand. This is your opportunity to define yourself in the job market, rather than letting an employer define you.

Security, Data Analytics, and related automation are easy low-hanging fruit. Additional areas that you might focus on include data-driven AI, robotics, and healthcare. Reference:

Comment Hallelujah! Some Sanity (Score 1) 294

The big push to patent everything that started around the turn of the century led not only to software patents but also to process patents. Both are evil because they suppress innovation by the larger population, effectively blocking small businesses and individuals - who can't afford patents or the cost of litigating patent disputes - from pursuing their ideas. Before this ruling, pursuing those ideas ran the risk of overlapping any number of patents in a web that was impossible, in practical terms, to identify fully.

Some here have argued that not having patents allows others to take your ideas and benefit from them. But there is nothing that says you have to open source your code. You can keep your code private, in which case others would need to develop their own solutions. To the uninformed, that is called 'competition', and it is a good thing for the market and your customers.

Overall, the good of being free of patent litigation for software outweighs any good achieved through patents by patent trolls and the litigation we've seen. The efforts in litigation add nothing in terms of competition or creation of new and better products for customers - and are a net drain on everyone except the pocketbooks of the lawyers involved. I am happy to see some sanity starting to prevail on this subject.

Comment Patriarchal Society gets a 'Come-up-ins'... (Score -1, Flamebait) 566

Given that 50 percent of the population is female, yet most jobs and management positions are held by males, a correction is in order. Male-dominated leadership and the associated misogynistic world view have led to centuries of suffering by humanity. It's about time female leadership drives change.

I for one welcome our female overlords.

Comment Re:This again? (Score 1) 401

In various languages, like C or Perl or Python, you can execute a print command and direct which file handle your output will go to. Some standard ones are standard output (aka STDOUT), which usually goes to the terminal screen, and standard error (STDERR), which also goes to the terminal screen. A common technique when using file handles is to redirect their output somewhere other than the screen (probably a good thing to do if you are running a GUI) - such as a device like a printer, a text file, or my favorite, /dev/null. This reflects the Unix heritage, where everything in the operating system is a file - including devices. If you are running a Unix or Linux OS from a command-line shell, you can redirect the standard file handles from the command line using shell redirection operators.

So technically, if you are using the print commands from any one of these languages, you can indeed use your video card to print your output to the screen. And yes, I do save a lot on paper in my 'paperless' office as a result (though I usually print to a file, so I can peruse it later on a portable device).
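A minimal Python sketch of the idea (the file name is illustrative):

```python
import os
import sys

# Print to the standard file handles; both usually end up on the terminal.
print("normal output", file=sys.stdout)
print("diagnostic output", file=sys.stderr)

# Redirect output to a text file instead of the screen.
with open("output.txt", "w") as fh:
    print("saved for later perusal", file=fh)

# Or discard it entirely -- os.devnull is the portable name for /dev/null.
with open(os.devnull, "w") as null:
    print("this goes nowhere", file=null)
```

From a Unix shell, the same redirection can be applied externally without touching the program, e.g. `python script.py > output.txt 2>/dev/null`.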

Comment Curation is the key (Score 1) 348

If you are a person concerned about your information being useful/relevant to future generations, then you must curate the data (basically, filter the raw data and add context to create information). Raw data of every moment of your lifetime is too much data to be relevant to human beings - although computers may find it useful, assuming the algorithms and AI used to process it are perfect, which isn't likely. Of course, online services don't do this well at all - and there are no guarantees your data will survive the next merger or retirement of the companies behind the services your information is tied up in.

If your information is a program, or the output of a program, then you should build programs that take into account the need to preserve their runtime environment, and provide conversion of data to open standards (e.g., XML) that can be read easily without the need for a specific program. Virtual machines are an excellent means of doing this over time - and have had success in keeping old console games alive.
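As a sketch of the open-standards point, here is how a program might dump its state to XML using only Python's standard library (the record and its fields are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical program state worth preserving for future readers.
record = {"title": "vacation photos", "year": "2015", "format": "JPEG"}

# Serialize to XML, an open standard that any future tool can parse
# without needing the original program that produced it.
root = ET.Element("record")
for key, value in record.items():
    ET.SubElement(root, key).text = value
xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)

# A round trip confirms the data survives independently of the producer.
parsed = ET.fromstring(xml_text)
assert parsed.find("year").text == "2015"
```

The same idea applies to any open format (JSON, CSV, plain text): the data outlives the program because the format's specification, not the program, defines how to read it.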

Finally - storage technology itself will evolve over time - and now that most things are in a digital form, migrating the data to the new technologies is relatively painless.

Comment Re:This again? (Score 1) 401

"Assembly" is not a programming language...

I think you need to rethink that statement.

The earliest computers were often programmed without the help of a programming language, by writing programs in absolute machine language. The programs, in decimal or binary form, were read in from punched cards or magnetic tape, or toggled in on switches on the front panel of the computer. Absolute machine languages were later termed first-generation programming languages (1GL).

The next step was development of so-called second-generation programming languages (2GL) or assembly languages, which were still closely tied to the instruction set architecture of the specific computer. These served to make the program much more human-readable, and relieved the programmer of tedious and error-prone address calculations.

The first high-level programming languages, or third-generation programming languages (3GL), were written in the 1950s. An early high-level programming language to be designed for a computer was Plankalkül, developed for the German Z3 by Konrad Zuse between 1943 and 1945. However, it was not implemented until 1998 and 2000. - Wikipedia

Comment Re:It Made Me Think the Future is Bright (Score 2) 204

I concur - my own dark sense of optimism was formed between the ages of 2 and 4, during the initial run of the show. After that, I refilled periodically with reruns...

I think this is what differentiates this 'border' generation (tweeners) - they were at the right age to absorb and appreciate Star Trek more deeply than they consciously knew at the time. These are the people holding together the technological world today, as the boomers go off and retire without really understanding it, while the generations that followed have never known a world without the technology they depend upon - and take for granted - every day.

That being said, there are many people doing amazing things to help solve problems, and accomplish the piece parts that can make up a better world when put together. In fits and starts progress is being made - so I can't complain really. I continue to stand by my dark optimism.

Comment Re:Bad Idea #1 (Score 1) 674

Call it whatever you want. Apprentice/Master --- but there needs to be a way to differentiate - and thereby focus the work efforts. Master developers/designers need to be building a cohesive set of tools and a design that the Apprentice uses to get the job done.

An apprentice programmer should never be allowed to lead the design or implementation of a project - I don't care how many years they have with the company. Years of service does not equal quality of skill set. I've seen too many projects destroyed because the wrong people were in key positions on the team. The idea that every programmer is an interchangeable widget is a lie. If you are peddling that 'happy joy and rainbow land' view of the world, then you are part of the problem that I am talking about.

The truth is in the deliverables. Most of those deliverables are nowhere close to being right.
