(Computer Science) computing an operating system that is not specific to a particular supplier, but conforms to more widely compatible standards
Bingo! - The key word in that definition is 'compatible' - which is not the case when you're talking about Microsoft Windows.
More on 'open systems' can be found at Wikipedia.org:
The definition of "open system" can be said to have become more formalized in the 1990s with the emergence of independently administered software standards such as The Open Group's Single UNIX Specification.
Although computer users today are used to a high degree of both hardware and software interoperability, in the 20th century the open systems concept could be promoted by Unix vendors as a significant differentiator. IBM and other companies resisted the trend for decades, exemplified by a now-famous warning in 1991 by an IBM account executive that one should be "careful about getting locked into open systems".
However, in the first part of the 21st century many of these same legacy system vendors, particularly IBM and Hewlett-Packard, began to adopt Linux as part of their overall sales strategy, with "open source" marketed as trumping "open system". Consequently, an IBM mainframe with Linux on z Systems is marketed as being more of an open system than commodity computers using closed-source Microsoft Windows—or even those using Unix, despite its open systems heritage. In response, more companies are opening the source code to their products, with a notable example being Sun Microsystems and their creation of the OpenOffice.org and OpenSolaris projects, based on their formerly closed-source StarOffice and Solaris software products.
The comments here fall into two primary all-or-nothing buckets that seem to be on opposite ends of the political spectrum. Yet when you look closely, it is plain to see that both sides are really talking about the same thing: fear of the unknown resulting from change.
This fear arises because we don't take the time to actually use our minds to think critically from all points of view. Fear paralyzes us - and we take the easy way out - resorting to regurgitating dogma from sources that we identify with our own world-view. We do ourselves and the people around us a disservice when we substitute dogma for thought.
Here is a simple rule to live by that will help you determine whether your dogma is in the best interests of everyone. The Golden Rule, or law of reciprocity, is the principle of treating others as one would wish to be treated oneself. It is a maxim of altruism seen in many human religions and cultures the world over. Now, put yourself in the shoes of the people you are considering in the discussion and, assuming it is you who has to live with the outcome, apply the dogma/position that you align with.
Now, after doing that thought exercise, if you can honestly say that your position/dogma will not adversely impact others, then it is worthy of consideration. If you cannot, then you need to think about a new dogma.
The OP - which is me - didn't say that competence was tied to gender, only that gender is not fairly represented.
Given how screwed up the system is today after the last 800,000 years of male rule, my implication is that having women in charge could certainly not be any worse.
If you expect to go into the office every day, do the same thing over and over, and get paid a good wage for it in IT - those days are over. Repetitive common tasks are being outsourced because, rightly or wrongly, they are perceived as not requiring significant skills, and they are targets for automation.
On the other hand - if you are smart, you will look for unique opportunities and skills that are in demand. This is your opportunity to define yourself in the job market, rather than letting an employer define you.
Security, data analytics, and related automation are easy, low-hanging fruit. Additional areas you might focus on include data-driven AI, robotics, and healthcare. Reference: http://www.modis.com/it-insights/infographics/top-it-jobs-of-2017/
The big push to patent everything that started around the turn of the century led not only to software patents but also to process patents. Both are evil because they suppress innovation by the larger population, effectively blocking small businesses and individuals (who can't afford patents, or to litigate patent disputes) from pursuing their ideas - ideas which, since this ruling, run the risk of overlapping any number of patents in a web that is impossible, in practical terms, to identify fully.
Some here have argued that not having patents allows others to take your ideas and benefit from them. But there is nothing that says you have to open source your code. You can keep your code private - in which case others would need to develop their own solutions. To the uninformed, that is called 'competition', and it is a good thing for the market and your customers.
Overall, the good of being free of patent litigation for software outweighs any good achieved through patents. Patent trolls and the litigation we've seen add nothing in terms of competition or the creation of new and better products for customers - they are a net drain on everyone except the pocketbooks of the lawyers involved. I am happy to see some sanity starting to prevail on this subject.
I for one welcome our female overlords.
In various languages, like C or Perl or Python, you can execute a print command and direct which file handle your output goes to. Standard handles include standard output (aka STDOUT), which usually goes to the terminal screen, and standard error (STDERR), which also goes to the terminal screen. A common technique is to redirect a file handle's output somewhere other than the screen (probably a good thing to do if you are running a GUI) - to a device such as a printer, or to a text file - or my favorite
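The file-handle behavior described above can be sketched in Python. This is a minimal illustration; the filename `output.txt` is just a placeholder:

```python
import sys

# Every print call writes to a file handle; by default that is
# standard output (sys.stdout), which normally goes to the terminal.
print("normal output")                      # written to STDOUT
print("an error message", file=sys.stderr)  # written to STDERR

# Redirecting a handle somewhere other than the screen,
# e.g. a text file, works the same way:
with open("output.txt", "w") as fh:
    print("saved for later reading", file=fh)
```

Shells offer the same redirection externally (e.g. `program > out.txt 2> err.txt`), which is often simpler than changing the program itself.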
So technically, if you are using the print commands from any one of these languages, you can indeed use your video card to print your output to the screen. And yes, I do save a lot on paper in my 'paperless' office as a result (though I usually print to a file, so I can peruse it later on a portable device).
If you are a person concerned about your information being useful and relevant to future generations, then you must curate the data (basically, filter the raw data and add context to create information). Raw data of every moment of your lifetime is too much data to be relevant to human beings - although computers may find it useful, assuming the algorithms and AI used to process it are perfect, which isn't likely. Of course, online services don't do this well at all - and there are no guarantees your data will survive the next merger or the retirement of the companies behind the services your information is tied up in.
If your information is a program, or the output of a program, then you should build programs that take into account the need to preserve their runtime environment, and that provide conversion of data to open standards (e.g., XML) which can be read back easily without the need for a specific program. Virtual machines are an excellent means of doing this over time - and have had success in keeping old console games alive.
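As a sketch of the "convert to open standards" idea, here is a minimal Python example that round-trips a record through XML using only the standard library. The record fields and the filename `record.xml` are hypothetical:

```python
import xml.etree.ElementTree as ET

# A hypothetical record we want to preserve in an open,
# program-independent format.
record = {"title": "Lab notes", "year": "2017", "body": "Results of run 42."}

# Build a simple XML document: one child element per field.
root = ET.Element("document")
for key, value in record.items():
    child = ET.SubElement(root, key)
    child.text = value

# Serialize to a file that any XML-aware tool can read later.
ET.ElementTree(root).write("record.xml", encoding="utf-8", xml_declaration=True)

# Reading it back requires no knowledge of the original program.
restored = {el.tag: el.text for el in ET.parse("record.xml").getroot()}
```

The point is not XML specifically - any well-documented open format (JSON, CSV, plain text) gives future readers a fighting chance without the original software.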
Finally - storage technology itself will evolve over time - and now that most things are in a digital form, migrating the data to the new technologies is relatively painless.
"Assembly" is not a programming language...
I think you need to rethink that statement.
The earliest computers were often programmed without the help of a programming language, by writing programs in absolute machine language. The programs, in decimal or binary form, were read in from punched cards or magnetic tape, or toggled in on switches on the front panel of the computer. Absolute machine languages were later termed first-generation programming languages (1GL).
The next step was development of so-called second-generation programming languages (2GL) or assembly languages, which were still closely tied to the instruction set architecture of the specific computer. These served to make the program much more human-readable, and relieved the programmer of tedious and error-prone address calculations.
The first high-level programming languages, or third-generation programming languages (3GL), were written in the 1950s. An early high-level programming language to be designed for a computer was Plankalkül, developed for the German Z3 by Konrad Zuse between 1943 and 1945. However, it was not implemented until 1998 and 2000. - Wikipedia
Your two statements are contradictory.
They're not. Holding a copyright on a work does not confer one with complete authority as to how that work may be used. The rights which comprise copyright are relatively few; further, they are themselves limited in a number of respects.
For example, copyright on a book does not include a right to prohibit other people from reading the book. The list of exclusive rights that together form a copyright can mostly be found at 17 USC 106. (Again, only for the purposes of US copyright law; I have no idea about foreign copyright law, and I don't care to.)
And posting a picture on your website doesn't tell or demonstrate anything.
The conduct of doing so, assuming a website open to the public, is an implicit license to anyone to access and view it (and to make incidental copies in the process of doing so).
If I happen to know that the Mona Lisa hangs in the Louvre, there's nothing wrong with my telling people to go there to see it. And if I happen to know the URL of your picture, there's nothing wrong with my telling people to go there to see your picture; this is so whether I provide people with a link to be manually followed, or an embedded link to be automatically followed such that the picture appears in the web page. I'm not copying it onto my website or anything.
First sale is not profiting in a commercial sense.
It is absolutely that. A used book store will sell copies of works for a profit, because it is a commercial enterprise. It is totally reliant on the first sale doctrine. Ditto however many independent video stores still exist (since it's perfectly legal to rent lawfully made copies of movies that you own).
Commercial use is not fair use.
Well, where the hell were you when the Supreme Court needed your input in 1994 in Campbell v. Acuff-Rose Music?
There the Court not only found that a commercial use certainly could be a fair use, they even said that it is wrong to treat a commercial use as being presumptively unfair. Commerciality is just an element to be considered, and that's all:
If, indeed, commerciality carried presumptive force against a finding of fairness, the presumption would swallow nearly all of the illustrative uses listed in the preamble paragraph of § 107, including news reporting, comment, criticism, teaching, scholarship, and research, since these activities "are generally conducted for profit in this country." Harper & Row, supra, at 592 (Brennan, J., dissenting). Congress could not have intended such a rule, which certainly is not inferable from the common-law cases, arising as they did from the world of letters in which Samuel Johnson could pronounce that "[n]o man but a blockhead ever wrote, except for money." 3 Boswell's Life of Johnson 19 (G. Hill ed. 1934).
But then I guess you already knew everything you wrong was wrong since you fell the need to try and make your point using an insult.
'Everything you wrong was wrong?' What the hell is that?
Anyway, I called you an idiot because you're clearly an idiot. It had nothing to do with my actual argument. But my advice to you is that you have no idea what the hell you're talking about, at least within the context of US copyright law, and you would do yourself, and everyone else a great service if you'd shut the fuck up and learn something from a legitimate, neutral source before you next presume to talk about it.
I concur - my own dark sense of optimism was formed between the ages of 2 and 4, during the initial run of the show. After that, I refilled periodically with reruns...
I think this is what differentiates this 'border' generation (tweeners): they were at the right age to absorb and appreciate Star Trek more deeply than they consciously knew at the time. These are the people holding the technological world together today, as the boomers go off and retire without really understanding it, while the generations that followed have never known a world without the technology they depend upon - and take for granted - every day.
That being said, there are many people doing amazing things to help solve problems and to build the pieces that, put together, can make up a better world. In fits and starts, progress is being made - so I can't really complain. I continue to stand by my dark optimism.
Do molecular biologists wear designer genes?