
Comment Re: Got news for you (Score 0, Troll) 209

So everyone who disagrees with your politics is dumb and the best way to support democracy is to have everyone fall in line and vote for the same party. That is some hot savoury troll food you are serving up there.

Maybe you should talk to some people fortunate enough to have been able to leave homelands that subscribe to such philosophies. See how places like Venezuela, Cuba, China or Libya serve their citizens under that kind of democracy you advocate.

Comment You are wrong... (Score 1) 163

...about the "year of the Linux desktop" never happening, because it has already happened if you look at things from different perspectives.

There is a case to be made that this past year, 2012, is the "year of Linux on the desktop" in a sense. When you factor in mobile devices, Android computing devices shipped in higher numbers this past quarter than Windows computing devices (NT-kernel based and otherwise), INCLUDING THE TRADITIONAL PC. So, when you set your phone or tablet on a desk at least, Linux has finally triumphed. One thing is for sure--if it isn't the year of the Linux DESKTOP, it is certainly the year of Linux PERSONAL COMPUTING.

Some might say the desktop was conquered by Linux much earlier. When a user sits at a desktop, how long is it before that user opens a browser and uses Google to search or Facebook to, well, waste time I guess? Sure, most of those desktops booted into Windows by a large measure, but Windows was a mere shell--a runtime container for the web app that is Google or Facebook, and neither would be possible without Linux. So at a different level, Linux is already master of the desktop.

Still unconvinced by the above arguments, and insist that a Linux desktop is nothing less than a full-sized PC on a desk that boots into a Linux OS and handles the full software stack? Well then maybe that is a cause not worth fighting for. Something is pretty evident here: it would be wrong for Linux advocates to dwell excessively on "the desktop". Though "the desktop" will never disappear completely, it is clearly a mature, stagnant segment of the global computing ecosystem that is only going to become less dominant over time. As such, conquering the desktop is not the path to conquering the computing world. It might be important to you, but you are not a normal computer user. Indeed, very few slashdotters are. I am certainly not a normal computer user, that is for sure. Your comments very much reveal that:

I've been using Linux full time for 5 years (since the Windows Vista calamity) and it wasn't until Ubuntu ruined their distro with Unity that I had to hop to another one

To me it looks like you are a "power user"--comfortable enough with computers but not coping well with change. My guess is that you were satisfied with WinXP but circumstances forced you onto Vista (your old PC was too outdated to run the more contemporary bloatware, or broke down, etc., and you needed a new machine, and they all came with Vista by then). There was definitely a time for many where it was actually easier to obtain and install Linux than to get a legal downgrade to XP, so you were motivated to go to Ubuntu. Then Ubuntu changed (or got "ruined" for you), and so you picked the most conservative one out there--Debian--in an effort to resist change. But you don't have the stomach to use any "unstable" packages to support more recent hardware, so you went to the next most conservative community distro. I do see the pattern here.

There are distros and desktops out there for you. You could go back to Debian--I would suggest "testing" though (don't be scared of the name--by Debian's standards "testing" is more stable than an LTS release of Ubuntu). I can already tell you would NOT like GNOME 3--even if it were brilliant, it is too different from that Win95-era design pattern for you--so be sure to use Xfce and you will be right at home. Apart from that, Linux Mint is another good OS--and Cinnamon or MATE are old-school enough in their design to work for the "traditionalists" among us. I like them anyways...

The puzzling thing is that you spout off all these old problems--can't get networking going, can't get sound going, can't get video going, blah blah. These are not the challenges they were 5 years ago, and even 5 years ago they were not such huge problems aside from wireless and bleeding-edge video chipsets. These days it isn't a challenge to find Linux-friendly hardware--if you have such challenges it indicates you did little to nothing in terms of validating the availability of Linux drivers for your hardware before purchase. The solution for that can be easy--buy an actual Linux PC (System76, ZaReason, etc.) where the vendor has made sure all the important bits work and will even install the OS for you--kind of like Apple and all Microsoft vendors do, right?

A couple of other comments I have:

my Mac Mini should be delivered on Monday

I fear you may eventually become disenchanted. The Mac will "just work" because, of course, the hardware and software are under the full control of a single, rather domineering vendor. Don't be too quick to buy OS upgrades--Apple has lately been moving towards a more iPad-like experience. Apple has much more control over your experience than you might be used to, so you will only be satisfied as long as they do what you like. This is not the case with Linux--the Free software ecosystem is diverse enough that there is someone out there who caters to your tastes more closely.

installing and configuring Oracle Java is a nightmare

That is a characteristic of Java, not the OS on which it is installed. My spouse has a Mac. Oracle's Java is not nice on it either. Java is unpleasant, unstable, insecure and unsuitable for desktop applications now. The only future it has is its legacy on the server side and its non-Oracle descendant Dalvik, which forms the basis of the Android userland. The solution isn't ditching Linux--indeed Java will frustrate you on a Mac too--the solution is to find alternatives to Java applications wherever practical.

And if you think that people are going to accept a totally stripped-bare 100% pure distro the likes of which Richard Stallman would use, then it's game over (though it's probably been game over for years, now).

Stallman is a very special case that goes way beyond normal and extends to the hardware he uses (he does not use the WWW the way we do, and he goes to extreme lengths to make sure schematics are available for his hardware and so on). He is an idealist and refreshingly honest, if rather disagreeable to many. I am comforted that he goes to the effort he does to practice what he preaches. RMS has made it his purpose in life to defend Free software's ideals, not to commercialise it or maximise market share. That is not to say RMS wouldn't like to see GNU/Linux take over the world, but he has not made that his task. Indeed, if a Linux OS became dominant by abandoning Free software ideals, then RMS would be front and centre in the effort to bring it down.

So it is up to the pragmatists to popularise Linux by striking the right compromise--RMS and Shuttleworth act as checks and balances against each other for example. And yes, it is "game over" or nearly so, but as I stated earlier I think Linux has won.

Comment Fixed that for you (Score 1) 447

Either Apple will start fabricating the chips themselves or someone else will.

Apple already designs their application processors themselves. They are more than a mere ARM licensee as well--they were a founding partner of the original joint venture and I believe they still own a significant stake in that company--they have "privileged access" to the core IP, I'd think.

However, Apple is considered a "fabless design house". They rely on Samsung to actually build their designs, and sourcing other fabs is not trivial--there are significant startup and logistics issues in establishing such an agreement, and as Samsung is the sole source for Apple AP chips, Apple is over a barrel. Perhaps Apple needs to build or buy a fab to ensure a stable, secure supply of components. They've pissed off Samsung, who have done the logical thing and raised the price of their goods to Apple because of increased costs--one of them being the need to pay $1 billion in litigation expenses incurred by their mobility division. Seems like poetic justice/a logical business decision to recoup the cost from the customer responsible for that expense, doesn't it?

Also, Apple had already sent a clear message to Samsung as various supply contracts expired that they were looking elsewhere, even before Samsung adjusted the price upwards, and for more commodity-type parts Samsung has lost that business. Just like with your insurance or your cable company, you get deals when you "bundle". Since Apple is not "bundling" anymore, Samsung probably feels justified in increasing the prices on the remaining business. This is a typical business decision, even without considering the lawsuit/rivalry.

Perhaps Apple can talk to another fab--but I bet the IDMs out there will see how Apple treated Samsung and won't want to go there, so they need to work with a foundry-only company like TSMC or GlobalFoundries instead. Perhaps that will isolate them from the risk of having to sue a supplier directly. Nonetheless, if Apple were to get TSMC to build their AP chips and then sued TSMC's biggest customers, that might not go too well either.

Comment Re:Why would someone buy that? (Score 1) 478

Right... because that worked so well with Apple's control over everyone's tablets and phones. People just "refused to purchase it"

Buyers embraced the iPhone not because of its DRM/closed technology. They bought it because the whole industry SUCKED REALLY BAD. Compared to everything else in those dark ages, the iPhone WAS open. Prior to that, phones had crappy UIs and, if you were lucky, a handful of stupid, useless mobile-Java apps that your carrier decided you were just barely worthy of using. There was no such thing as a really USEFUL smartphone when Apple unveiled the iPhone. The closest thing was the BlackBerry--indispensable for its email and BBM capabilities, but its ecosystem was (and still is) even more closed than Apple's--and how have they been doing lately? Hardware-wise, Apple largely copied and improved upon the designs of Palm and other PDAs and touchscreen "smart" phones (using PalmOS or Windows CE), but those devices had crappy OSes and user experiences--plus they were locked down so hard by carriers that they were even more useless (side-loading apps was possible but a task too cumbersome for non-technical folks).

When the first iPhone came out, the only "apps" available were glorified web bookmarks--remember? Plus it couldn't do 3G, and in general it was a mediocre device technically, but the touch screen was beautiful and it was easier to use than the competition, so it was a moderate success. Once it had a proper app store (i.e. it actually "opened up" a little) it became a true hit. Then Android came out. It is a platform OPEN to any manufacturer, with OPEN app markets. It came out after Apple was already a huge success and had a huge hill to climb. Guess what? 3 out of 4 smartphones sold is now an Android device. The same will happen to the iPad. It already is--Apple's market share, once at Microsoft Windows proportions, is sliding down quickly towards 50% and will continue falling. Why? Not because Android or the hardware vendors using it copied Apple, but because it is MORE OPEN. Apple will always do well enough, but they will ALWAYS eventually end up a significant/influential but minority player in the market--ALWAYS. Their long-term survival will be due to their quality designs and stellar customer service that counter the lack of openness. If they forget that, they will die, and the best they can ever expect with a closed, vertical model in long-term market share will be 10 to 25 percent at best. Jobs made Mac computers insanely great again, and they have not managed to achieve 20% market share. iPhones are now below 20%. iPads are heading that way in the next few years too. It is because they are CLOSED.

Now look at MSFT DISKinect. It is intended to act as a DRM apparatus for gaming, premium television, and so on. But that industry is already "more open" (it isn't truly open, but consumers do have more freedom due to the lack of ability to enforce draconian licensing terms to this point). Not like smartphones, where the market was truly and fully closed to start with when the iPhone came out. Nobody EVER buys stuff that deliberately breaks itself more than what they already have. DIVX failed for that reason, and notwithstanding the granting of patents and whatnot, DISKinect is DOA as a viable product idea--at least for the broad consumer market (if it has any use at all it would be in security applications--to lock and unlock doors when a room hits maximum occupancy for fire regulations, or to sound alarms if the wrong people are in a secure area and whatnot).

Comment Wow, even more sensible than masking tape! (Score 1) 478

This "prilliant idea" of a patent is not a big deal. If anything I think it is good news that it is now an idea legally encumbered by litigious American patent law--it provides some degree of protection to us all from the temptation to widely adopt the technology!

As a whole, the vast majority of patented inventions are never implemented. Sometimes it is because they are defensive patents (more and more these days, companies patent any wing-nut idea they have if there is the remotest chance the invention could be implemented in a hit product and be targeted by trolls). More commonly, however, it is because the invention is impractical or unworkable in reality, or because it is just a stupid idea.

And "Microsoft DISKinect" is a colossally stupid idea, so I would expect any attempt to implement it, at least in consumer-targeter products, to be a bigger fail than BOB. Nobody will buy this--it is repulsive and there is no way to describe it that wouldn't offend someone. Can you imagine going to Best Buy and the sales guy tries to describe this? "It has this cool feature where the TV watches you when you are watching it, so if you all leave it turns off by itself, or if sees more people than are allowed to watch it pauses the show! Isn't that cool? Now you don't have to worry about violating copyright law or forgetting to turn off the TV!"

People would hear "the TV watches you when you are watching it.....wa wa wah wah wahhh....." and think "WTF...my TV is watching me? that is too creepy! Who sees me? I don't want people spying on me! This is exactly big brother! Is this demo unit watching me now? I gotta get out of here!" and maybe even "well that is crappy forget about enjoying my porn with this TV".

Besides the creepiness factor of being watched by unseen people or machines, the whole point of this invention is to make something stop working. To no less than 100% of consumers this is offensive and unwanted. That is why there has literally never, with exactly no exceptions, been a copy protection/activation enforcement scheme in history that has not been compromised partially or fully (name any one, and I or someone here can point you to a way to break it--the "analogue loophole" being the most common method for multimedia). It must be awfully depressing to be a developer responsible for creating dongles and activation apps and such, because their purpose in life is to make something that deliberately makes things stop working, and everybody hates your creations and many even go to great efforts to remove or disable your code and/or devices.

Microsoft DISKinect is worthy of scorn and ridicule because it is an instant epic failure like DIVX media. It is "DIVX for the cloud". Remember DIVX, invented by the folks who watched one too many episodes of Mission Impossible? "This movie disc will self-destruct in one week." Wow, just what I always wanted: to buy a little plastic disc with a movie on it that, once I watch it once, starts to physically degrade into a coaster or frisbee fit for the landfill within days. What a waste... and so is DISKinect. And, coming from Microsoft, you can bet that until V3.0 it would be cumbersome and unreliable, counting pictures or your kid's doll as people and locking the movie, or letting viewers of African extraction watch "for free" (or refusing to turn on in the first place) because their complexion is too dark, or software bugs or faulty camera hardware making things stop.

This is a joke, people. You should laugh at how stupid MSFT is to think that making Big Brother a literal reality would be a good, well-received idea.

Comment Car analogies are passe so here is a sex analogy. (Score 2) 194

Preaching that automation systems be kept off the internet is like preaching abstinence until marriage to teens. It sounds like the logical solution to all the problems, but it is unreasonable to ever expect it to happen, so the best course of action is to educate on how to do it safely and responsibly.

There are many valid reasons that automation systems are connected to the internet in some fashion (though they never need direct internet access). Some of those reasons relate to not breaking the law.

In industries like oil and gas, regulators require data to be collected 24/7/365 on all critical aspects of an operation. If an environmental or safety incident were to happen and such data was not available for scrutiny, it could lead to the permanent closure of that operation in extreme cases. Lack of due diligence in such matters can mean huge monetary fines and even jail time for wilful violations.

As such, in those operations a "process historian" server is standard equipment. These are central data logging servers that have essentially full read-only access to the industrial control system, and even some limited write access too (say, to assert a bit in a PLC to confirm it has received data, or to reset a totaliser or set a new batch number). Because of how vital the data is, there has to be some way to get the data off-site for archival and reporting purposes, and because of the volume of data and the immediacy that is demanded, removable media is not an option. Thus these systems end up with some means of corporate network access. This does NOT mean they need "direct internet" access, but very commonly it means tunnelling through public/internet infrastructure via VPN (the "condom", if you will). Though technological measures can be taken to make this route into the plant impenetrable, it is complex enough to set up that people make mistakes and thus you end up with "holes in the condom".

The other use for outside connectivity relates to support from off-site engineers, vendors and operators. A control system can be set up to report critical alarm conditions to smartphones, email inboxes and the like automatically, far more rapidly than a human operator at the board can. The more rapid the response to a critical incident, the less likelihood of loss of revenue, damage to equipment, and injury or death of workers (again, in the case of "sour sites"--those that deal with natural gas containing deadly H2S--rapid response is vital to evacuate the facility and surrounding area, and some of these notifications are required by law).
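
To make that concrete, here is a minimal Python sketch of such an automated notifier. It is illustrative only: read_tag() is a hypothetical stand-in for whatever historian/OPC read call a real system would use, and the relay address, recipients and alarm limit are all made-up placeholders.

    import smtplib
    from email.message import EmailMessage

    SMTP_RELAY = "mail.example.com"            # assumed outbound-only relay in the DMZ
    ONCALL = ["oncall.engineer@example.com"]   # hypothetical recipients

    def read_tag(tag):
        # Hypothetical stand-in for a historian/OPC read call;
        # a real system would query the process historian or PLC here.
        return 12.3

    def notify(tag, value, limit):
        # Build and send a plain-text alarm email via the site's SMTP relay.
        msg = EmailMessage()
        msg["Subject"] = "ALARM: %s = %.1f (high limit %.1f)" % (tag, value, limit)
        msg["From"] = "scada-alarms@example.com"
        msg["To"] = ", ".join(ONCALL)
        msg.set_content("Tag %s exceeded its high alarm limit: %.1f > %.1f" % (tag, value, limit))
        with smtplib.SMTP(SMTP_RELAY) as relay:
            relay.send_message(msg)

    if __name__ == "__main__":
        H2S_LIMIT_PPM = 10.0                   # hypothetical alarm limit
        value = read_tag("WELLPAD_01.H2S_PPM")
        if value > H2S_LIMIT_PPM:
            notify("WELLPAD_01.H2S_PPM", value, H2S_LIMIT_PPM)

The design point worth noting is that everything here is outbound-only traffic--nothing on the control network has to accept inbound connections from the internet for this kind of notification to work.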

So "preaching abstinence" in the complete absence of "sex education" is a bad idea. It is ineffective to say "disconenct from the internet" and not say how you can manage network security safely and responsibly, because at some point these people will be pressured into doing it and need to be able to "say no" if they aren't ready, and to know when and why it is "the right time", becasue if you DO use that internet connection responsibly it can actually be a great experience ;-)

Comment It's more about lack of knowledge (Score 5, Informative) 194

The CAT5 cable I'd traced about 180 meters across the plant going back into the office internet connection was setup to allow this process to complete, since they had apparently failed to do it earlier when the system was OOTB but not yet hooked up.

Assuming it was all Rockwell/Allen+Bradley gear then it was undoubtedly the FactoryTalk Activation system they were struggling with, and they were undoubtedly unqualified to be doing the work they were assigned to do (disclosure: I am a former Rockwell Automation employee so I have familiarity with the subject, but apart from that I do not speak on behalf of any employer past or present here).

First and foremost, Allen+Bradley (AB) PLCs don't need activations, so the licensing really isn't relevant to this story. AB makes a crap-pile of profit on that hardware the moment they've sold you the box--activation makes no sense. What DOES need to be activated (and is what creates profit for the Rockwell Software division) is the RSLogix programming software, without which the PLC is as useful as a doorstop. So unless they were completely clueless, they'd have just taken their laptop into the office, activated their software and come back, rather than break all sorts of IT, security and safety rules stringing out 180m of CAT5 and a spare switch to get internet. The same goes for their drives--the drive units don't need activating, but the DriveTools software on the programming laptop may have.

That said, there may have been an industrial PC like a VersaView or a third-party unit running the Rockwell HMI software, bolted into the cabinet with un-activated software for some reason, but Rockwell/AB have thought of that...

The legacy licensing system used utility software called "EVMove" and relied on "master disks" (towards the end you could set up a USB flash drive), and in the field this was a royal pain in the ass--floppies and their drives are far too sensitive for such an environment, and USB memory sticks are terrible to manage and secure. Thus the development of the FactoryTalk Activation internet service-based scheme. Though it requires the internet, the end system does not need to be connected to activate. The easy "wizard" way sends a "host ID" (the Ethernet MAC address or some such number) from the end device to Rockwell via the internet. However, you can actually write down the MAC address, or generate the host ID file on the target machine, then go to an internet-connected computer and type the host ID into a secure web form or upload the host ID file. The website then generates a license file that you can save to removable media or a laptop/portable machine to take over to the target machine physically, thus preserving the air gap (and making the method more similar to the old EVMove floppy method).
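
For illustration, here is a tiny Python sketch of the sort of helper you could run on the air-gapped machine to capture its MAC-derived host identifier onto removable media. This is emphatically NOT the actual Rockwell tooling--just a hypothetical picture of the offline half of that workflow:

    import uuid
    from pathlib import Path

    def mac_as_hostid():
        # uuid.getnode() returns the primary NIC's MAC address as a
        # 48-bit integer; format it as 12 hex digits.
        return "%012x" % uuid.getnode()

    if __name__ == "__main__":
        # Write the identifier to a file you can carry (on a USB stick or
        # paper) to an internet-connected PC for license generation.
        out = Path("hostid.txt")
        out.write_text(mac_as_hostid() + "\n")
        print("Wrote host ID %s to %s" % (mac_as_hostid(), out.resolve()))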

I do agree that licensing/DRM/activation is a big problem that costs end users millions of dollars globally (above and beyond the actual purchase cost of the products). It adds complication and downtime and confusion and contributes exactly zero value for its users. One might argue about its value to the vendor as well--FactoryTalk Activation and many other similar schemes are just as trivial for hackers to circumvent as CoDeSys' ladder logic runtime, and they add the burden of extra support costs for the honest users they keep honest. But the problem in industrial automation is bigger than that. The problem is that the world in general moves faster than industrial control systems can keep up, and the people who have "experience" honed their skills in the mid 1990s or earlier and haven't kept up. In the meantime, the PHBs of the world in management and government demand of them far more than they are capable of delivering.

It used to be that refineries/factories/etc. were content with paper chart recorders that operators and plant managers could peruse if something came up to troubleshoot. Then came data recorders where you could plug in a serial cable or transfer via floppy to a computer for more detailed analysis. It was enticing to have that ability, but that is where control systems' suitability ended, whilst requirements kept getting more demanding. As hard-wired operator boards gave way to computer-driven HMIs, these industrial controllers became more and more networked and intertwined with PCs. Standalone data loggers and paper chart recorders were supplanted by "process historians"--centralised data logging systems. With that came the demand for data retention, off-site backup and "real time information", which meant that these historian servers, with full read access (and sometimes write access) to control system networks, had to have some path out to corporate networks, which usually have internet accessibility in some fashion. Thus the air gap closes.

It is easy to say "just unplug it from the internet", but the real situation is much more complex. Nowadays, government regulations mandate the collection, retention and submission of data from many industrial processes so it can be disclosed on demand in the event of an incident. Simply "unplugging" an industrial network in many if not most cases can be considered breaking the law! In order to provide sufficient perimeter security and still allow for the collection and off-site retention of this data, these facilities have to spend considerable time and money thoroughly developing their IT infrastructure as well as their policies and procedures, and NOBODY from the vendors or regulators has provided anything CLOSE to sufficient guidance to those techs and engineers working with 20-year-old knowledge.

The requirement for an "air gap" is impractical at best and impossible at worst--the ultimate solution is to drag the automation industry into the same decade as the rest of IT, and to mandate coverage of industrial IT/networking in the curriculum for engineers and technologists in the field. The scenario described in the parent post sounds absurd, but I've seen both "senior" guys who haven't been able to keep pace and newbies alike do the same kind of silly things.

Comment My needs are simple enough (Score 1) 818

I use to care, but then I spent more time finding the perfect GUI then I did actually doing work.

From my perspective I just want to load up my apps and go. The desktop environment doesn't actually do anything--it is the container for the applications, and the applications are what make the computer useful to me. By and large, any of the desktops are adequate in that regard. If a desktop gets in the way of me using my favourite apps then it is a bad desktop. GNOME 3 has not yet gotten in the way, so I am not motivated to switch (in fact, it has done quite well in staying out of my way).

I've generally not had issues with KDE in the past, but then it is much less often the default desktop of distros that I use, so I haven't stayed up to date with it. When KDE went up to v4, though, it was an utter failure an order of magnitude larger than anything I had to deal with on GNOME 3. That is because KDE 4 really REALLY got in the way of doing what I wanted to, which is to launch my apps and navigate amongst them with ease. The lack of speed and stability was absolutely appalling on my gear, and so off to GNOME I went. Since then GNOME and Xfce have been quite adequate for my needs and I've become familiar with that architecture. The only thing I've really added was the "advanced settings" app and extension (aka GNOME Tweak Tool). That has filled in whatever holes in configurability I've had--and looking at the slowly growing number of extensions available, it looks like the potential is there to do much more with the environment that way. I like that approach better than the "put everything PLUS a kitchen sink or two right into the desktop" approach that packages like KDE seem more apt to take.

Besides just being happy with what I got on my Debian Testing install by default, I just have this unquantifiable impression of KDE that makes me think it isn't really the future. It seems to represent the "old ways" to me, and I like to try different things. KDE, on the desktop at least, is a bit too much of a "Windows derivative"--perhaps that will work well for them when Metro becomes the only offering from Microsoft though.

Comment Re:One year of Harper (Score 1) 129

Saskatchewan is solid blue because the federal ridings are gerrymandered to hell and gone.

The Conservative party had ZERO to do with this--blame Chrétien-era Liberals, since they had the most clout in the last review of electoral divisions. That is when Saskatoon and Regina were carved up and lumped in with vast swaths of surrounding prairie. It appears to have been a strategy to ensure Liberal cabinet ministers like Ralph Goodale had more opportunity to be competitive, mostly at the expense of the NDP. For example, Regina as a city is pretty left-leaning, so it was carved up and Mr. Goodale was in one of those ridings (Wascana, which takes in a corner of Regina but a big rural area too).

The whole province is set up a bit weird now though--it is almost ANTI-gerrymandered... most of the ridings are three-way races nowadays and the whole province could look different from one election to the next. The Conservatives got the most votes and the most seats though--one time in BC the Liberals "won" with fewer votes than the NDP but more seats. Now, THAT smells like a problem.

Comment Re:One year of Harper (Score 1) 129

This is the damage done by one year of Harper.

So why is this "interesting" and not "offtopic" or "troll"? How is the release of a policy document from a Hollywood-industry lobby group the result of a conservative government policy? The parent comment to yours mentioned that, though not perfect, the current copyright bill was significantly toned down through both public pressure and (believe it or not, I'm sure you won't due to your blind partisanship) genuine concern from those within the Conservative party. "Damage done"? If anything this is evidence of "disaster averted"... for now. Obviously the Hollywood lobby is not happy with the resulting copyright bill and has published this policy document to demonstrate what they will be pushing for (far) beyond what the government has accepted.

Liberals and their supporters have no ground to stand on in this issue either--it was under a Liberal regime that copyright bills were twice introduced that were considerably more onerous than what is now before the House.

We've three more years of hell before we'll be rid of him, even though his government is illegitimate and does not have a real majority because of the robocall scandal.

So now we head from troll to off-topic. There is no connection at all between how the last election was conducted and what an industry lobby group reports as its copyright policy. The government is no less legitimate than Chrétien's was in 1997. In fact, Chrétien's was less legitimate because he got a lower percentage of the popular vote but his party won MORE seats, and Chrétien's government was tainted by major scandal and was notoriously dishonest during its campaigning as well. Scandal knows no ideology, and it is quite obvious that your dissatisfaction is ideological in nature and that you are mostly upset because you'd very much rather everyone agreed with you and voted NDP (what is the point of the Liberals anyway--the party is dead and should be killed off or subsumed by the NDP since they are basically the same now--just call it the Liberal-Democrats).

The NDP have never run a federal government, so it is easy for them to be critical and have immunity to scandal, but look at their track record at other levels of government--just as often as they are quietly competent and responsible (Manitoba, Saskatchewan) they can be disastrous, inept and/or corrupt (BC, Ontario). There is NO WAY WHATSOEVER that one could make a convincing argument that an NDP government would be able to get away with rebuffing ACTA completely, ignoring the need to update/reform copyright, or implementing copyright policy completely counter to the rest of the world, without some consequences for trade relations--much less that if Mulcair became PM the lobbyists would just magically go "oh no, Tom is PM! I guess we should just give up!". Could they do better? Maybe, maybe not. The US tried to vote for big change and got Obama, and for all the "hope" he campaigned on, big business is still well cared for, big copyright lobbyists still have bipartisan influence, the economy still struggles, the debt still looms large, and on and on.

Some things are bigger than "ooh those nasty Tories". Copyright policy doesn't neatly follow partisan lines, and though it is federal jurisdiction it is quite obviously influenced heavily on an international scale by industries whose business models are heavily dependent on legislatively protected monopolies. Replacing the "fascist" Tories with the "communist" NDP won't stop these kinds of policy documents, or the pressure from their lobbyist authors pestering governments. Like cockroaches, the lobbyists will always be there, and the only thing that will keep them at bay is the public/electorate at large becoming more civic-minded and politically active to counter them--regardless of ideology/political leanings. (I know, I know, the NDP aren't really communist, and some of their more strident members might argue they're barely socialist anymore, but calling the Tories fascist is just as offensive and inaccurate, and an insult to those who have had to endure REAL fascist governments. In fact some might say the Tories are barely even "conservative".)

Comment Re:User key management (Score 4, Insightful) 437

How? Most reasonable mechanisms that could be envisioned would likely be considered an 'attack vector' in certain scenarios. I'm genuinely curious as to the mechanisms allowed for end-user key management in this sort of system.

The Secure Boot specification describes three "modes" of operation:

1) standard: Accept software signed only by keys included in the factory BIOS (ie. Microsoft-issued keys)
2) custom: Accept software as in 1) but also allow keys signed by another authority/the user. This allows the user to flash in their own key and spin their own Linux/BSD/alternative OS and sign it so it will work with secure boot. NOTE you would also need custom mode in Windows 8 if you are employing custom or in-house drivers or other software that talks too closely to hardware.
3) setup(?): Seems to be a special mode--I think it is a one-time setting that changes back after reboot? The setup mode is so that your software installer--an alternative OS, a driver in Windows, or otherwise--would be able to push its key into the system's firmware during the install process so you don't have to do that step in the UEFI setup manually. Once a key is installed from a software setup process, the system would revert to custom mode for subsequent boots.

Besides that, UEFI secure boot can be disabled entirely so you can run unsigned system software, and none of the above would matter.
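
If you want to see what your own firmware reports, here is a minimal Python sketch (assuming a Linux box booted via UEFI with efivarfs mounted at the usual location) that reads the SecureBoot and SetupMode variables; on a legacy BIOS boot the variables simply won't exist:

    from pathlib import Path

    # GUID of the EFI global variable namespace used by SecureBoot/SetupMode.
    EFI_GLOBAL_GUID = "8be4df61-93ca-11d2-aa0d-00e098032b8c"
    EFIVARS = Path("/sys/firmware/efi/efivars")

    def read_efi_flag(name):
        # Each efivarfs file is a 4-byte attribute header followed by the
        # variable data; for these two variables the data is a single 0/1 byte.
        var = EFIVARS / ("%s-%s" % (name, EFI_GLOBAL_GUID))
        if not var.exists():
            return None        # no UEFI variables: legacy BIOS boot
        raw = var.read_bytes()
        return raw[-1] if len(raw) > 4 else None

    if __name__ == "__main__":
        print("SecureBoot:", read_efi_flag("SecureBoot"))
        print("SetupMode: ", read_efi_flag("SetupMode"))

A SecureBoot value of 1 means signature enforcement is on; a SetupMode value of 1 means the key databases are open for enrolment.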

The deal with Red Hat and the Devil (um, the evil Microsoft one, not the cute FreeBSD one) commits Microsoft to distributing keys signed by them to anyone who ponies up $99 and fills out the requisite forms. In return you get a key to sign your own OS or other privileged software (drivers/kernel modules...) issued through a Microsoft CA that will work in mode 1) above. That is, you can create a distro or driver setup disk that will work with a "factory default" UEFI setting.

I personally have no problems with this scheme except for two critical points:

1) Microsoft alone is the caretaker (cert. authority) for ALL standard keys. This constitutes a monopoly. Monopolies are not illegal, but using them to suppress potential competitors IS illegal, and this arrangement sets Microsoft up with the ability to get into anti-competitive shenanigans (again). The $99 fee is not a problem--there is no expiry on your key and you can sign all your stuff with it--I may get one for my own business should I run into issues with custom mode or disabled secure boot. A BIG problem is that nothing commits them to being honest with the CAs. There isn't going to be just one root cert from Microsoft, and nothing stops them from using a "special" certificate class for the $99 certs. That would let them revoke all of them "killswitch" style for whatever reason (the root gets compromised, or they just don't like what the keys are being used for), so anyone who does a BIOS update or gets a new machine would be SOL if MSFT doesn't re-issue you a new key and won't take another $99 from you.

2) Microsoft is not being platform agnostic. There is ARM and "everything else". MSFT has decreed that ONLY standard mode is permitted on ARM devices that have Windows installed--NO custom or setup modes and NO disabling of secure boot. Furthermore I am not sure if the $99 keys will work to build software for ARM devices (anyone know that one? MSFT could issue certs that only work on x86 architecture if they wanted to). You cannot get a shiny "built for Windows 8" sticker (who cares really) and it is against the license agreement to even install on "insecure" ARM hardware (THAT is something to care about). MSFT is (currently) an inconsequential player in mobile/ARM space so there isn't a big risk yet. However, they could leverage their desktop monopoly to push Windows 8 slates and smartphones in the enterprise and even elsewhere (smart glass in the home for example) and if they are successful it would entice vendors to lock out custom OSes.

Regulatory authorities are going to have to keep a close watch on how MSFT conducts itself as sole steward of secure boot keys. The system should be more federated in my opinion, modeled after how root cert lists are distributed with browsers from a number of security vendors even in Free browsers. The DOJ in the US for example should be prepared to issue a consent decree involving MSFT divesting itself of issuing secure boot certs because there is a clear potential conflict with a dominant software vendor being in charge of keys that control what software runs on typical hardware.

Problem 2) should be addressed ASAP. ARM and/or its licensees must not tolerate being treated like this, and Microsoft should be forced to remove all architecture-specific stipulations from its secure boot requirements. The difficulty is that the culture of the mobile industry is toxic--all ARM-powered consumer mobile devices are locked down to some extent. Apple is very heavy-handed, and even most Android vendors, in cahoots with wireless carriers, will lock up their phones tight, forcing users to use adulterated, crapware-laden and often out-of-date Android derivatives of their own design. For them firmware lockdown is business as usual, so very little inducement is required on MSFT's part to get vendors to lock down ARM hardware.

To me, secure boot is too heavy-handed, though the concept is sound. There should be no "standard mode" really--only "runtime" and "setup" modes. You could have a POST screen that says "press DEL to enter setup mode", then throw in your install disk, which puts in whatever keys you want as part of the OS install through an industry-standard API. Setup mode could be used to update CRLs or add new keys for new driver vendors or whatever. The whole "standard mode" with Microsoft-controlled keys betrays the real reason for secure boot--not the elimination of viruses, but closed software vendors' control over hardware and end users.

Comment The article is about a *KDE release*... (Score 2) 134

...yet the article poster's commentary, the first post and most of the discussion threads seem to focus on GNOME 3 and other competitors.

I gave up using KDE pre-4.x and haven't really given it serious consideration since the transition to 4. I know I've missed a lot of changes, improvements and so on, so I am profoundly disappointed with the tone of discussion here. Not only is the first readable post an anti-GNOME troll/joke, but at the time I read this it is rated "informative". What in God's name are the moderators here smoking? It is a JOKE--yes, I do find it very funny, but it is 100% information-free. Let's hear about what people LIKE about what is in KDE and what is coming. The signal-to-noise ratio is low even by /. standards of late.

If KDE designers rely so much on "user feedback" then where is the user feedback here? Is the only place to go for a lucid discussion of this desktop environment somewhere in the bowels of the kde.org website? If that is the case then they aren't really getting effective user feedback--they are getting "fanboi feedback", which over time might prove just as effective as "rodent pets [whispering] great design ideas to the developers". What does it have to offer to people to make it worth migrating from GNOME, or Unity or Xfce or LXDE or Enlightenment? In the case of ALL those alternatives, if you talk to advocates they will lead off with the virtues of their chosen platforms (GNOME extensions are a really cool concept, Xfce and LXDE are lightweight and present a familiar interface, Enlightenment brings eye candy to lower-powered devices, and so on).

Well, sorry, flaming GNOME is about the least effective way to get converts. When GNOME 3 came out and a good chunk of the user base went looking, KDE was NOT where they went. By and large, they chose to try Xfce over KDE, and even to fork GNOME 2 or re-spin GNOME 3 with extensions. That users looking for an alternative would "re-invent the wheel" rather than flock to KDE speaks volumes about either the lack of awareness of KDE or the lack of appeal of KDE itself, doesn't it?

Do KDE fanbois really not remember how dreadful KDE 4 was when it came out? It is essentially the SAME discussion that GNOME 3 detractors are having right now, right down to Linus Torvalds himself totally slamming it and publicly abandoning it for another desktop! And guess when so many people stopped really considering KDE? So if KDE 4 really IS so much better now that it is up to 4.9, please elaborate, KDE fans out there. Stop telling me how much GNOME 3 sucks. I KNOW how much it "sucks" because I USE it (consequently I know how much it *rocks* in other ways). Tell me why I should stop using Xfce on my older desktop... is KDE really an option there? Xfce seems like a nice transition for "GNOME 2 refugees" like me. Is KDE 4.9 going to be disruptive? If KDE is "too different" from GNOME 2 then I might as well keep using GNOME 3, or try MATE or Cinnamon, or use Xfce more often. What are KDE's merits these days?

The only real impression I get is that KDE is a "tweaker's desktop"...that it is very featureful and can be tailored to do many things. However this doesn't convince me because:

* GNOME 3's extensions capability seems to have all the potential to enable the tweaking a user needs (POTENTIAL... if/when the library of extensions grows it could be really good)
* I am not a desktop tweaker. A desktop is to me an app launcher; consequently I don't miss many features. I kinda-sorta miss minimise buttons in GNOME 3, but I am actually surprised how rarely I actually used them (and there are extensions/settings to enable them anyway if you really need them). About the only things GNOME 3 still seriously needs are some tuning of the default user experience and an "expert mode" control panel for those who DO like to tweak a bit more, or for people to initially set up their environment in a new install. Overall I do not want to be aware of the desktop--I want it to stay out of my way.

So, is my opinion of KDE skewed? Is it (still?) geared towards tweakers? How is the "just out of the box" experience--how much personalisation does a typical KDE user feel they have to do? Has it gotten more efficient with resources since the one time I tried KDE 4 in the dark days of the earlier releases, when it consumed everything it could find?

In any case, in the spirit of this discussion on a new KDE release, I shall now provide my opinion of GNOME 3:

It is a stinky old piece of Limburger cheese that I despise... sorry, just got caught up in the moment I guess... I don't actually feel that way...

I actually think GNOME 3 is simply going through the same painful re-invention process KDE 4 went through. Moreover, I think they are actually moving faster to address shortcomings than the KDE 4 devs did. I do NOT see it as being excessively biased against desktop users, and I find myself falling into some convenient habits like using the hot corners (and being frustrated when I go on another computer and the "activities" hot corner doesn't work). I also like some of the little extensions you can get to "fix things" (the best of which will hopefully make their way into the default setup... one can hope). On the "needs improvement" side there is a lack of "manual override" stuff. For example, if a printer cannot be autodetected, the add-printer facilities are broken in Debian (you need to call up the "classic" dialogue from the command line or else you get a cryptic error--but I've only once had to add a printer manually; they have always been autodetected except in that one case). System settings are not extensive enough without extensions either, and multi-monitor operation needs to be refined a bit.

But it really isn't THAT bad! I think I won't give up as easily as I did on KDE... firstly because Linux OSes are about choice and we can't all just flock to the "least bad" choice until it is the only choice, and secondly because it is stable and fast enough to still be usable, whereas my first day with KDE 4 was my last because it was so slow and unstable on my hardware it wouldn't even function (the desktop getting in the way in a BIG way).

Comment Re:Wonderful Support... (Score 1) 627

You have ANY idea how many millions of dollars is made in sales each year in part by some VB+Access DB? Hell I've even built a few of 'em myself and last I heard they are all still running, doing what they are supposed to do. And that's just the home grown apps, do you have ANY idea how many small, say 5-10 man, software houses there are out there writing for Windows?

I have been involved in that market as well, with some of my old work out there having served for a dozen years, and it is indeed a part of the industry that escapes the attention of most observers. VB+Access is to a nematode worm as the Excel "database" is to an amoeba (in other words, it is just one step up from the lowest form of IT on the evolutionary scale). It is unsophisticated and lacks robustness and scalability, but it serves its purpose just well enough that its small business and departmental users are not motivated to change. Sad as it is, such applications are often "mission critical" to these smaller niche operations. People would be really surprised at how much certain segments of daily business rely on some of these crufty old systems.

Incidentally, right there alongside the VB+Access apps are the FoxPro apps, which are even older and crustier in some cases...

All she had to do was typethe first two letters of what drug they were on and a drop down popped up that she could just tap and fill in the blank [...] Nope because i doubt seriously you find any software in Linux that is as highly specialized as nurses charting programs and even if you could you'd have to pay someone to transfer all that damned data and for what? What would they gain?

The gain, if Free or open source solutions are adopted, is the assurance that you own and control your data. This is not a sophisticated program, even if it is targeted to a very specific niche. It is probably a VB+Access variant too. Re-engineering this or transferring the data out would be trivial. I have done such things with the software for those prox-card building access systems, though not to port it but rather to integrate with it. The barriers are not the specialised nature of the software (because the software is actually quite, um, "basic"); it is the deliberate barriers that the vendors put in to lock in their customers. I've seen this many times--they password-lock their Access databases, obfuscate their database schemas and data, rename the .mdb file to a different extension to divert attention from what it really is, and so on. But even so, it is rarely insurmountable to unlock your data in some form or another, so long as there haven't been contractual/legal lock-ins imposed by the integrator or vendor (the perils of adopting closed software aren't merely technical lock-in--they can be legal too).
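
As a purely illustrative sketch of how thin those barriers are: assuming the open-source mdbtools utilities (mdb-tables and mdb-export) are installed, and the file isn't password-locked, dumping every table out of a renamed .mdb can be as simple as the following (the filename here is hypothetical):

    import subprocess
    from pathlib import Path

    DB = Path("vendor_data.dat")   # hypothetical vendor file: really an .mdb in disguise

    def list_tables(db):
        # mdb-tables -1 prints one table name per line
        out = subprocess.run(["mdb-tables", "-1", str(db)],
                             capture_output=True, text=True, check=True)
        return [t for t in out.stdout.splitlines() if t]

    def export_csv(db, table, dest):
        # mdb-export dumps a table as CSV on stdout
        out = subprocess.run(["mdb-export", str(db), table],
                             capture_output=True, text=True, check=True)
        dest.write_text(out.stdout)

    if __name__ == "__main__":
        for table in list_tables(DB):
            export_csv(DB, table, Path(table + ".csv"))
            print("exported", table)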

It amazes me that so many in the Linux world complain of the "Windows tax" and act like 'free as in beer' is a selling point when honestly? For most the price of Windows isn't even in the top 5 of their expense report. If you look at Windows having a 10 year support cycle (which is now standard on ALL versions of Windows) that is $8 a year for Windows home (unless you buy the family pack, then its just $4) and $14 a year for Windows pro....THAT is supposed to be high? hell most of my customers, most of my family even, spend more on stupid crap in a week than Windows costs per year.

That has to be the most misleading, ignorant and crude TCO analysis that I've ever seen! "Retail license cost divided by number of years of vendor support" is NOT THE COST OF WINDOWS. It isn't even the MONETARY cost of Windows! What about the ongoing support costs? You need applications, you need anti-virus and security licensing and support (there are support costs to MSE even if it is free, even if it is just time), and there is the cost of hardware upgrades, because new versions of Windows have NEVER done well on 10-year-old hardware--even WinXP after 3 service packs and running modern applications is far more resource-intensive than the initial release of XP and the software that ran on it at the time of its release. The contention that Windows costs less than $15/year is just as ludicrous as saying Linux costs nothing!
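
A back-of-the-envelope sketch shows why that arithmetic is misleading--every figure below is a hypothetical placeholder rather than real pricing data, but it illustrates how quickly "license price divided by support years" stops being the whole story:

    YEARS = 10
    LICENSE_PRICE = 80.0                        # hypothetical retail licence price
    naive_per_year = LICENSE_PRICE / YEARS      # the "$8/year" style figure

    extras_per_year = {                         # all made-up placeholder numbers
        "anti-virus/security subscription": 30.0,
        "support and admin time": 60.0,
        "earlier-than-needed hardware refresh (amortised)": 50.0,
    }

    fuller_per_year = naive_per_year + sum(extras_per_year.values())
    print("licence only:       $%.2f/yr" % naive_per_year)
    print("with ongoing costs: $%.2f/yr" % fuller_per_year)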

Linux is compelling in the server room because MSFT MAKES it compelling, by having insane EULAs and crazy license requirements like per user CALs. If MSFT wanted to wipe Linux out in the server room they could simply offer WinServer at $300 and no user CALs but they make so damned much money off of server its not worth picking up the low end sales to them.

If you really have worked in this industry space for a while then you'd KNOW that smaller and more specialised businesses with in-house-developed or "boutique VB+Access" apps face a lot of the same kind of total crap! I've mentioned the pseudo-barriers put in to lock out competition. I've seen the insane license and support costs of those specialised applications and know that they are just asking for competition from more open alternatives. The thing is that the lock-in tactics are even more effective in this space than they are on the servers. That nurse with her tablet, or the clerk at the mall, or the self-checkout you use at the supermarket, or the ATM at your bank, or the shift supervisor at the receiving dock... they all suffer from the lock-in they unwittingly chose to accept in the last 15 or 20 years. In NONE of those applications does Windows have anything to offer--these are often systems with NO tightly-coupled integration with enterprise systems, that go through great pains to completely hide the Windows desktop, and whose users have no need to run Photoshop or the latest 3-D first-person shooter or QuickBooks, or to tie into an Active Directory group policy system. These are "appliance apps", and nobody would know or care whether it was Windows or Linux or OS/2 or even DOS underneath.

Well, at least nobody but those who supply or support those systems. And don't think it is the customer that has demanded an MSFT-based system--that is not the concern it once was, because a) there is more awareness of Linux and other platforms, and b) the customer does less and less in-house support, relying on the vendor/integrator/developer for that. Niche business apps for departmental or SMB use are just that far behind the enterprise on the IT curve if you really look at it--behind, and still moving slower, and so losing ground. Linux started in 1991 and I worked on my first commercial/business use of it on servers in 1997; it was considered "radical" but it was quite a success, and it took a couple more years even after that for Linux to make serious inroads beyond web serving--so that is six years from the beginnings to "early" adoption by large business. Change can take upwards of a decade.

In small business and in specialised applications, things are even further behind. These are the kind of outfits that have a LAN but maybe not even a proper file/print server, where WinXP is still on a good proportion of their machines--two versions behind and going on three. It was a big step to go from pen and paper and clipboards to spreadsheet software, and it was a big milestone to invest their hard-earned money in that special VB+Access app 10 years ago. It could take up to five years more before they make another big move. There IS opportunity for Linux and other Free software solutions out there. There is plenty of that 10-year-old-plus stale VB and Access stuff that is ripe for replacement, and that market is growing. There does have to be considerable effort to unseat the incumbent MSFT-partner vendors out there who have those established customer relationships (and have probably graduated to using the whole .NET Visual Studio toolchain). But in some cases the small vendors/integrators are now defunct, or they are not all that tight with MSFT and may even be willing to offer more open solutions on different platforms.

the desktop is the exact opposite, they have economies of scale so large that they can sell their product cheap as hell and still make billions. While i actually like Linux in the web server and embedded roles there is simply no real selling point for Linux on the desktop.

There are HUGE selling points for Linux on the desktop, but you have to be looking at the right desktops! How about "counter tops"--as in point-of-sale systems? There is a small liquor store chain where I live that already uses a Linux-based solution--the store managers appreciate the lack of prevalence of viruses on the systems, and that they are pretty low-maintenance and DON'T run Windows software/games. What about desktops in factories, refineries and warehouses, where they run some kind of Java- or web-based front end to ERP applications or specialised data-gathering tools? Windows holds no advantage of familiarity, because such desktops almost NEVER sit with the normal desktop environment exposed. These are the places where Windows and its ubiquitousness are actually more of a liability, because of all the precautions against viruses and the extra effort needed to lock machines down (preventing use of removable media, installation of apps, and so forth).

There are selling points for Linux, but they must be matched with the right end users.

Comment Re:Have You Accounted for User Preference? (Score 1) 204

If your entire office is on LibreOffice, I can see it working well within the office, but once you start sharing documents with external partners, I'm really surprised you've had zero problems.

It seems that the requirement to share complex documents with external recipients in Microsoft Office format is rapidly declining, in my experience. In my work I am more and more discouraged from sharing Word documents, for example--it is preferred that those be kept internal and that they be put into PDF format to be sent out. The same goes for spreadsheets in many cases too. When such documents ARE widely distributed they tend to be fairly crude or simplistic--Excel spreadsheets that are basically checklists or contain very simple macros and formulae.
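
Converting outgoing documents to PDF doesn't even have to be a manual chore. Here is a minimal sketch assuming LibreOffice is installed and on the PATH--its headless converter (soffice --headless --convert-to pdf) does the work; the folder names are just placeholders:

    import subprocess
    from pathlib import Path

    def to_pdf(doc, outdir=Path("outgoing_pdf")):
        # Invoke LibreOffice's headless converter; the original file is untouched.
        outdir.mkdir(exist_ok=True)
        subprocess.run(["soffice", "--headless", "--convert-to", "pdf",
                        "--outdir", str(outdir), str(doc)], check=True)

    if __name__ == "__main__":
        for doc in Path(".").glob("*.odt"):
            to_pdf(doc)
            print("converted", doc)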

I suppose it depends on the industry in which you work. If you have huge macros from hell, strange embedded and linked objects, and blinking, beeping, rotating animated logos festooning your office documents, then you will have real problems. The problems have not been insurmountable for me personally. I suppose it is the 80-20 or 90-10 rule. The vast bulk of users won't have huge issues. Those that do tend to be "special cases". And in those special cases, when they distribute such documents they are quite often NOT well received by external recipients--even if those are all-MSFT shops. If you have huge macros or special Excel add-ins, or really uncommon embedded or linked objects, or are using other arcane features, documents can break even on other MSFT systems--whether it be some dependency not being installed, or IT policies restricting what macros can do, or whatever.

It is not too much to ask people sending you documents to send them as PDF, or to warn them that for security and interoperability reasons such nasty complex documents cannot be accepted or may not be fully accessible--at least for the vast majority of users. Because really, if an Office file is so complex that it is unusable in LibreOffice, you are inviting trouble even for users in other MSFT setups, in terms of differing versions, patch levels, installed software and so forth. Complex and/or extremely large MS Office document files are a sign that the wrong tool is being used for the job and it is time to invest in a real solution (Excel is not a database, or a project management system, or an enterprise reporting engine--using it as such is akin to using a butter knife as a screwdriver or a stiletto heel as a hammer--it functions, but not well at all, and it will eventually break from the abuse).

Comment Re:Whatever happened to Perl 6? (Score 1) 192

This scares me. The fact that there are two compilers (which sows confusion) and none is feature complete

Why is that so scary or "confusing"? There are a lot of compilers for C and C++ out there, and they aren't all "Feature complete" (MSFT's C++ compiler for example is not fully C99 compliant and they don't seem in much of a hurry to make it so). Doesn't seem to me to indicate that a language or platform should be discounted.

... and it looks like Parrot was dropped.

Can you give a citation showing this? A new release of Parrot came out just last week. It looks far from dropped--in fact it looks like a VERY active development community--they seem to put out a new release every month or so! Also, the latest Rakudo compiler for Perl 6 is only about a month old itself.

I won't argue the point that the dev timeline is somewhere between "duke nukem forever" and "GNU HURD" in terms of release to market timelines. The devs do seem to be pursuing development of Parrot and Perl6 more in terms of an academic or research exercise than in putting out production code. However it is a complete misstatement to say Parrot has been dropped.
