
Comment Wow, even more sensible than masking tape! (Score 1) 478

This "prilliant idea" of a patent is not a big deal. If anything I think it is good news that it is now an idea legally encumbered by litigious American patent law--it provides some degree of protection to us all from the temptation to widely adopt the technology!

On the whole, the vast majority of patented inventions are never implemented. Sometimes it is because they are defensive patents (more and more these days, companies patent any wing-nut idea they have if there is the remotest chance the invention could end up in a hit product and be targeted by trolls). Even more commonly, it is because the invention is impractical or unworkable in reality, or because it is just a stupid idea.

And "Microsoft DISKinect" is a colossally stupid idea, so I would expect any attempt to implement it, at least in consumer-targeter products, to be a bigger fail than BOB. Nobody will buy this--it is repulsive and there is no way to describe it that wouldn't offend someone. Can you imagine going to Best Buy and the sales guy tries to describe this? "It has this cool feature where the TV watches you when you are watching it, so if you all leave it turns off by itself, or if sees more people than are allowed to watch it pauses the show! Isn't that cool? Now you don't have to worry about violating copyright law or forgetting to turn off the TV!"

People would hear "the TV watches you when you are watching it.....wa wa wah wah wahhh....." and think "WTF...my TV is watching me? That is too creepy! Who sees me? I don't want people spying on me! This is exactly Big Brother! Is this demo unit watching me now? I gotta get out of here!" and maybe even "well that is crappy, forget about enjoying my porn with this TV".

Besides the creepiness factor of being watched by unseen people or machines, the whole point of this invention is to make something stop working. To no less than 100% of consumers this is offensive and unwanted. That is why there has literally never, with exactly no exceptions, been a copy protection/activation enforcement scheme in history that has not been compromised partially or fully (name any one, and I or someone here can point you to a way to break it--the "analogue loophole" being the most common method for multimedia). It must be awfully depressing to be a developer responsible for creating dongles and activation apps and such, because your purpose in life is to make things that deliberately stop working, everybody hates your creations, and many even go to great efforts to remove or disable your code and/or devices.

Microsoft DISKinect is worthy of scorn and ridicule because it is an instant epic failure like DIVX media. It is "DIVX for the cloud". Remember DIVX, invented by the folks who watched one too many episodes of Mission Impossible? "This movie disc will self-destruct in one week". Wow, just what I always wanted: to buy a little plastic disc with a movie on it that, once I watch it once, starts to physically degrade into a coaster or frisbee fit for the landfill within days. What a waste...and so is DISKinect. And, coming from Microsoft, you can bet that until V3.0 it would be cumbersome and unreliable, counting pictures or your kid's doll as people and locking the movie, or letting viewers of African extraction watch "for free" (or refusing to turn on in the first place) because their complexion is too dark, or software bugs or faulty camera hardware making things stop.

This is a joke, people; you should laugh at how stupid MSFT is to think making Big Brother a literal reality would be a good, well-received idea.

Comment Car analogies are passe so here is a sex analogy. (Score 2) 194

Preaching that automation systems be kept off the internet is like preaching abstinence until marriage to teens. It sounds like the logical solution to all the problems, but it is unreasonable to ever expect it to happen, so the best course of action is to educate on how to do it safely and responsibly.

There are many valid reasons that automation systems are connected to the internet in some fashion (though they never need direct internet access). Some of those reasons relate to not breaking the law.

In industries like oil and gas, regulators require data to be collected 24/7/365 on all critical aspects of an operation. If an environmental or safety incident were to happen and such data was not available for scrutiny, it could lead to the permanent closure of that operation in extreme cases. Lack of due diligence in such matters can mean huge monetary fines and even jail time for wilful violations.

As such, in those operations a "process historian" server is standard equipment. These are central data logging servers that have essentially full read-only access to the industrial control system, and even some limited write access too (say, to assert a bit in a PLC to confirm it has received data, or to reset a totaliser or set a new batch number). Because of how vital the data is, there has to be some way to get the data off-site for archival and reporting purposes, and because of the volume of data and the immediacy that is demanded, removable media is not an option. Thus these systems end up with some means of corporate network access. This does NOT mean they need "direct internet" access, but very commonly it means tunnelling through public/internet infrastructure via VPN (the "condom" if you will). Though technological measures can be taken to make this route into the plant impenetrable, it is complex enough to set up that people make mistakes and thus you end up with "holes in the condom".

The other use for outside connectivity relates to support from off-site engineers, vendors and operators. A control system can be set up to report critical alarm conditions to smartphones, email inboxes and the like automatically, far more rapidly than a human operator at the board can do. The more rapid the response to a critical incident, the less the likelihood of loss of revenue, damage to equipment, and injury or death of workers (again, in the case of "sour sites"--those that deal with natural gas containing deadly H2S--rapid response is vital to evacuate the facility and surrounding area, and some of these measures are required by law).
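
As a rough illustration of that kind of automatic notification (this is not any particular vendor's alarm system--the server, addresses and credentials below are made-up placeholders), a minimal Python sketch might look like:

# Minimal sketch of pushing a critical alarm out to an email inbox or an
# email-to-SMS gateway. All addresses and credentials are placeholders.
import smtplib
from email.message import EmailMessage

def send_alarm(tag, value, limit):
    msg = EmailMessage()
    msg["Subject"] = f"CRITICAL ALARM: {tag} = {value} (limit {limit})"
    msg["From"] = "historian@plant.example"
    msg["To"] = "oncall-engineer@company.example"   # or an SMS gateway address
    msg.set_content(f"{tag} exceeded its configured limit; follow the response procedure.")
    with smtplib.SMTP("mail.company.example", 587) as smtp:
        smtp.starttls()
        smtp.login("historian", "app-password")      # placeholder credentials
        smtp.send_message(msg)

send_alarm("H2S_PPM_WELLPAD_7", 112, 100)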

So "preaching abstinence" in the complete absence of "sex education" is a bad idea. It is ineffective to say "disconenct from the internet" and not say how you can manage network security safely and responsibly, because at some point these people will be pressured into doing it and need to be able to "say no" if they aren't ready, and to know when and why it is "the right time", becasue if you DO use that internet connection responsibly it can actually be a great experience ;-)

Comment It's more about lack of knowledge (Score 5, Informative) 194

The CAT5 cable I'd traced about 180 meters across the plant going back into the office internet connection was setup to allow this process to complete, since they had apparently failed to do it earlier when the system was OOTB but not yet hooked up.

Assuming it was all Rockwell/Allen+Bradley gear then it was undoubtedly the FactoryTalk Activation system they were struggling with, and they were undoubtedly unqualified to be doing the work they were assigned to do (disclosure: I am a former Rockwell Automation employee so I have familiarity with the subject, but apart from that I do not speak on behalf of any employer past or present here).

First and foremost, Allen-Bradley (AB) PLCs don't need activations, so the licensing really isn't relevant to this story. AB makes a crap-pile of profit on that hardware the moment they've sold you the box--activation makes no sense. What DOES need to be activated (and is what creates profit for the Rockwell Software division) is the RSLogix programming software, without which the PLC is as useful as a doorstop. So unless they were completely clueless they'd have just taken their laptop into the office, activated their software, and come back, rather than break all sorts of IT, security and safety rules stringing out 180m of CAT5 and a spare switch to get internet. The same goes for their drives--the drive units don't need activating, but the DriveTools software on the programming laptop may have.

That said, there may have been an industrial PC, like a VersaView or a third-party unit, running the Rockwell HMI software bolted into the cabinet with un-activated software for some reason, but Rockwell/AB have thought of that...

The legacy licensing system used utility software called "EVMove" and relied on "master disks" (towards the end you could set up a USB flash drive), and in the field this was a royal pain in the ass--floppies and their drives are far too sensitive for such an environment, and USB memory sticks are terrible to manage and secure. Thus the development of the FactoryTalk Activation internet service-based scheme. Though it requires the internet, the end system does not need to be connected in order to activate. The easy "wizard" way sends a "host ID" (the ethernet MAC address or some such number) from the end device to Rockwell via the internet. However, you can actually write down the MAC address, or generate the host ID file on the target machine, then go to an internet-connected computer and type the host ID into a secure web form or upload the host ID file. The website then generates a license file that you can save to removable media or a laptop/portable machine to take over to the target machine physically, thus preserving the air gap (and making the method more similar to the old EVMove floppy method).
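
To make that offline flow concrete, here is a rough, hypothetical sketch of the general pattern--this is NOT Rockwell's actual tooling or file format, the names and fields are invented purely for illustration:

# Hypothetical illustration of the generic "offline activation" pattern described
# above -- not the real FactoryTalk utilities; file name and fields are invented.
import json
import uuid

def write_host_id_file(path="hostid.json"):
    mac = uuid.getnode()                 # the machine's MAC address as a 48-bit int
    host_id = f"{mac:012X}"
    with open(path, "w") as f:
        json.dump({"host_id": host_id}, f)
    return host_id

# Workflow: carry hostid.json (or the written-down MAC) to an internet-connected PC,
# submit it through the vendor's activation web form, save the returned license file
# to removable media, and copy it onto the air-gapped target machine.
print("Host ID to submit on the activation website:", write_host_id_file())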

I do agree that licensing/DRM/activation is a big problem that costs end users millions of dollars globally (above and beyond the actual purchase cost of the products). It adds complication and downtime and confusion and contributes exactly zero value to its users. One might argue about its value to the vendor as well--FactoryTalk Activation and many other similar schemes are just as trivial for hackers to circumvent as CoDeSys' ladder logic runtime, and they add the burden of extra support costs from the honest users they keep honest. But the problem in industrial automation is bigger than that. The problem is that the world in general moves faster than industrial control systems can keep up, and the people who have "experience" honed their skills in the mid 1990s or earlier and haven't kept up. In the meantime, the PHBs of the world in management and government demand of them far more than they are capable of delivering.

It used to be that refineries/factories/etc. were content with paper chart recorders where operators and plant managers could peruse them if something came up to troubleshoot. Then came data recorders where you could plug in a serial cable or transfer via floppy to a computer for more detailed analysis. It was enticing to have that ability, but that is where control systems' suitability ended, whilst requirements kept getting more demanding. As hard-wired operator boards gave way to computer-driven HMIs, these industrial controllers became more and more networked and intertwined with PCs. Standalone data loggers and paper chart recorders were supplanted by "process historians"--centralised data logging systems. With that came the demand for data retention, off-site backup and "real time information", which meant that these historian servers, with full read access (and sometimes write access) to control system networks, had to have some path out to corporate networks, which usually have internet accessibility in some fashion. Thus the air gap closes.

It is easy to say "just unplug it from the internet" but the real situation is much more complex. Nowadays, government regulations mandate the collection, retention and submission of data from many industrial processes so it can be disclosed on demand in the event of an incident. Simply "unplugging" an industrial network can in many if not most cases be considered breaking the law! In order to provide sufficient perimeter security and still allow for the collection and off-site retention of this data, these facilities have to spend considerable time and money thoroughly developing their IT infrastructure as well as their policies and procedures, and NOBODY among vendors or regulators has provided anything CLOSE to sufficient guidance to those techs and engineers working with 20-year-old knowledge.

The requirement for an "air gap" is impractical at best and impossible at worst--the ultimate solution is to drag the automation industry into the same decade as the rest of IT, and to mandate coverage of industrial IT/networking in the curriculum for engineers and technologists in the field. The scenario described in the parent post sounds absurd, but I've seen both "senior" guys who haven't been able to keep pace and newbies alike do the same kind of silly things.

Comment My needs are simple enough (Score 1) 818

I used to care, but then I spent more time finding the perfect GUI than I did actually doing work.

From my perspective I just want to load up my apps and go. The desktop environment doesn't actually do anything--it is the container for the applications, and the applications are what make the computer useful to me. By and large any of the desktops are adequate in that regard. If a desktop gets in the way of me using my favourite apps then it is a bad desktop. GNOME 3 has not yet gotten in the way so I am not motivated to switch (in fact, it has done quite well in staying out of my way).

I've generally not had issues with KDE in the past, but then it is much less often the default desktop of distros that I use, so I haven't stayed up to date with it. When KDE went up to v4, though, it was an utter failure an order of magnitude larger than anything I had to deal with on GNOME 3. That is because KDE 4 really REALLY got in the way of what I wanted to do, which is launch my apps and navigate amongst them with ease. The lack of speed and stability was absolutely appalling on my gear, and so off to GNOME I went. Since then GNOME and Xfce have been quite adequate for my needs and I've become familiar with that architecture. The only thing I've really added was the "advanced settings" app and extension (aka gnome tweak tool). That has filled in whatever holes in configurability I've had--and looking at the slowly growing number of extensions available, it looks like the potential is there to do much more with the environment that way. I like that approach better than the "put everything PLUS a kitchen sink or two right into the desktop" approach that packages like KDE seem more apt to take.

Besides just being happy with what I got on my Debian Testing install by default, I just have this unquantifiable impression of KDE that makes me think it isn't really the future. It seems to represent the "old ways" to me and I like to try different things. KDE on the desktop at least is a bit too much of a "Windows derivative"--perhaps that will work well for them when Metro becomes the only offering from Microsoft though.

Comment Re:One year of Harper (Score 1) 129

Saskatchewan is solid blue because the federal ridings are gerrymandered to hell and gone.

The Conservative party had ZERO to do with this--blame Chrétien-era Liberals, since they had the most clout in the last review of electoral divisions. That is when Saskatoon and Regina were carved up and lumped in with vast swaths of surrounding prairie. It appears to have been a strategy to ensure Liberal cabinet ministers like Ralph Goodale had more opportunity to be competitive, mostly at the expense of the NDP. For example, Regina as a city is pretty left-leaning, so it was carved up and Mr. Goodale was in one of those ridings (Wascana, which takes in a corner of Regina but a big rural area too).

The whole province is set up a bit weird now though--it is almost ANTI-gerrymandered...most of the ridings are 3-way races nowadays and the whole province could look different from one election to the next. The Conservatives got the most votes and the most seats though--one time in BC the Liberals "won" with fewer votes than the NDP but more seats. Now, THAT smells like a problem.

Comment Re:One year of Harper (Score 1) 129

This is the damage done by one year of Harper.

So why is this "interesting" and not "offtopic" or "troll"? How is the release of a policy document from a Hollywood-industry lobby group the result of a conservative government policy? The parent comment to yours mentioned that though not perfect, the current copyright bill was significantly toned down through both public pressure and (believe it or not, I'm sure you won't due to your blind partisanship) genuine concern from those within the Conservative party. "Damage done"? If anything this is evidence of "disaster averted"...for now. Obviously the Hollywood lobby is not happy with the resulting copyright bill and has published this policy document to demonstrate what they will be pushing for (far) beyond what the government has accepted.

Liberals and their supporters have no ground to stand on in this issue either--it was under a Liberal regime that copyright bills were twice introduced that were considerably more onerous than what is now before the House.

We've three more years of hell before we'll be rid of him, even though his government is illegitimate and does not have a real majority because of the robocall scandal.

So now we head from troll to off-topic. There is no connection at all between how the last election was conducted and what an industry lobbyist reports as its copyright policy. The government is no less legitimate than Chrétien's was in 1997. In fact, Chrétien's was less legitimate because he got a lower percentage of the popular vote but his party won MORE seats, and Chrétien's government was tainted by major scandal and was notoriously dishonest during its campaigning as well. Scandal knows no ideology, and it is quite obvious that your dissatisfaction is ideological in nature and you are mostly upset because you'd very much rather everyone agreed with you and voted NDP (what is the point of the Liberals anyways--the party is dead and should be killed or subsumed by the NDP since they are basically the same now--just call it the Liberal-Democrats).

The NDP have never run a federal government so it is easy for them to be critical and have immunity to scandal, but look at their track record at other levels of government--just as often as they are quietly competent and responsible (Manitoba, Saskatchewan) they can be disastrous, inept and/or corrupt (BC, Ontario). There is NO WAY WHATSOEVER that one could make a convincing argument that an NDP government would be able to get away with rebuffing ACTA completely, ignoring the need to update/reform copyright, or implementing copyright policy completely counter to the rest of the world, without some consequences for trade relations--much less think that if Mulcair became PM the lobbyists would just magically go "oh no, Tom is PM! I guess we should just give up!". Could they do better? Maybe, maybe not. The US tried to vote for big change and got Obama, and for all the "hope" he campaigned on, big business is still well cared for, big copyright lobbyists still have bi-partisan influence, the economy still struggles, the debt still looms large and on and on.

Some things are bigger than "ooh those nasty Tories". Copyright policy doesn't neatly follow partisan lines, and though it is federal jurisdiction it is quite obviously influenced heavily on an international scale by industries whose business models are heavily dependent on legislatively-protected monopolies. Replacing the "fascist" Tories with the "communist" NDP won't stop these kinds of policy documents or the pressure from their lobbyist authors from pestering governments. Like cockroaches the lobbyists will always be there, and the only thing that will keep them at bay is the public/electorate at large becoming more civic-minded and politically active to counter them--regardless of ideology/political leanings. (I know, I know, the NDP aren't really communist and some of their more strident members might argue they're barely socialist anymore, but calling the Tories fascist is just as offensive and inaccurate and an insult to those who have had to endure REAL fascist governments. In fact some might say the Tories are barely even "conservative".)

Comment Re:User key management (Score 4, Insightful) 437

How? Most reasonable mechanisms that could be envisioned would likely be considered an 'attack vector' in certain scenarios. I'm genuinely curious as to the mechanisms allowed for end-user key management in this sort of system.

The secure boot specification describes three "modes" of operation:

1) standard: Accept software signed only by keys included in the factory BIOS (ie. Microsoft-issued keys)
2) custom: Accept software as in 1) but also allow keys signed by another authority/the user. This allows the user to flash in their own key and spin their own Linux/BSD/alternative OS and sign it so it will work with secure boot. NOTE you would also need custom mode in Windows 8 if you are employing custom or in-house drivers or other software that talks too closely to hardware.
3) setup(?): Seems to be a special mode--I think it is a one-time setting that changes back after reboot? The setup mode is so that your software installer--an alternative OS, a driver in Windows or otherwise--would be able to push its key into the system's firmware during the install process so you don't have to do that step in the UEFI setup manually. Once a key is installed from a software setup process the system would revert to custom mode for subsequent boots.

Besides that, UEFI secure boot can be disabled entirely so you can run unsigned system software and none of the above would matter.
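
If you're curious which state a Linux box actually booted in, you can usually read it straight out of efivarfs--a minimal sketch, assuming a UEFI system with efivarfs mounted (the GUID below is the standard EFI global-variable GUID):

# Read the SecureBoot and SetupMode state from efivarfs on Linux. This only
# inspects state; enrolling keys is done through firmware setup or OS tooling.
from pathlib import Path

EFI_GLOBAL_GUID = "8be4df61-93ca-11d2-aa0d-00e098032b8c"

def efi_var(name):
    path = Path(f"/sys/firmware/efi/efivars/{name}-{EFI_GLOBAL_GUID}")
    if not path.exists():
        return None                  # legacy BIOS boot, or efivarfs not mounted
    data = path.read_bytes()
    return data[4]                   # bytes 0-3 are attribute flags, byte 4 is the value

print("SecureBoot:", efi_var("SecureBoot"))   # 1 = enforcing, 0 = disabled
print("SetupMode :", efi_var("SetupMode"))    # 1 = setup mode (keys can be enrolled)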

The deal with Red Hat and the Devil (um, the evil Microsoft one, not the cute FreeBSD one) commits Microsoft to distributing keys signed by them to anyone who ponies up $99 and fills out the requisite forms. In return you get a key to sign your own OS or other privileged software (drivers/kernel modules...) issued through a Microsoft CA that will work in mode 1) above. That is, you can create a distro or driver setup disk that will work with a "factory default" UEFI setting.

I personally have no problems with this scheme except for two critical points:

1) Microsoft alone is the caretaker (cert. authority) for ALL standard keys. This constitutes a monopoly. Monopolies are not illegal, but using them to suppress potential competitors IS illegal, and this arrangement sets up Microsoft with the ability to get into anti-competitive shenanigans (again). The $99 fee is not a problem--there is no expiry on your key and you can sign all your stuff with it--I may get one for my own business should I run into issues with custom mode or disabled secure boot. A BIG problem is that nothing commits them to being honest with the CAs. There isn't going to be just one root cert from Microsoft, and nothing stops them from using a "special" certificate class for the $99 certs. That would let them revoke all of them "killswitch" style for whatever reason (the root gets compromised, or they just don't like what the keys are being used for), so anyone who does a BIOS update or gets a new machine would be SOL if MSFT doesn't re-issue a new key and won't take another $99 from you.

2) Microsoft is not being platform agnostic. There is ARM and "everything else". MSFT has decreed that ONLY standard mode is permitted on ARM devices that have Windows installed--NO custom or setup modes and NO disabling of secure boot. Furthermore I am not sure if the $99 keys will work to build software for ARM devices (anyone know that one? MSFT could issue certs that only work on x86 architecture if they wanted to). You cannot get a shiny "built for Windows 8" sticker (who cares really) and it is against the license agreement to even install on "insecure" ARM hardware (THAT is something to care about). MSFT is (currently) an inconsequential player in mobile/ARM space so there isn't a big risk yet. However, they could leverage their desktop monopoly to push Windows 8 slates and smartphones in the enterprise and even elsewhere (smart glass in the home for example) and if they are successful it would entice vendors to lock out custom OSes.

Regulatory authorities are going to have to keep a close watch on how MSFT conducts itself as sole steward of secure boot keys. The system should be more federated in my opinion, modeled after how root cert lists are distributed with browsers from a number of security vendors even in Free browsers. The DOJ in the US for example should be prepared to issue a consent decree involving MSFT divesting itself of issuing secure boot certs because there is a clear potential conflict with a dominant software vendor being in charge of keys that control what software runs on typical hardware.

Problem 2) should be addressed ASAP. ARM and/or its licensees must not tolerate being treated like this and Microsoft should be forced to remove all architecture-specific stipulations from its secure boot spec. The difficulty is that the culture of the mobile industry is toxic--all ARM-powered consumer mobile devices are locked down to some extent. Apple is very heavy handed and even most Android vendors, in cahoots with wireless carriers, will lock up their phones tight, forcing users to use adulterated, crapware-laden and often out-of-date Android derivatives of their own design. For them firmware lockdown is business as usual, so very little inducement is required on MSFT's part to get vendors to lock down ARM hardware.

To me, secure boot is too heavy handed, though the concept is sound. There should be no "standard mode" really--only "runtime" and "setup" modes. You could have a POST screen that says "press DEL to enter setup mode", then throw in your install disk, which puts in whatever keys you want as part of the OS install through an industry-standard API. Setup mode could be used to update CRLs or add new keys for new driver vendors or whatever. The whole "standard mode" with Microsoft-controlled keys betrays the real reason for secure boot--not elimination of viruses but closed software vendors' control over hardware and end users.

Comment The article is about a *KDE release*... (Score 2) 134

...yet the article poster's commentary, the first post and most of the discussion threads seem to focus on GNOME 3 and other competitors.

I gave up using KDE pre-4.x and haven't really given it serious consideration since the transition to 4. I know I've missed a lot of changes, improvements and so on, so I am profoundly disappointed with the tone of discussion here. Not only is the first readable post an anti-GNOME troll/joke, at the time I read this it is rated "informative". What in God's name are the moderators here smoking? It is a JOKE--yes I do find it very funny but it is 100% information-free. Let's hear about what people LIKE about what is in KDE and what is coming. The signal-to-noise ratio is low even by /. standards of late.

If KDE designers rely so much on "user feedback" then where is the user feedback here? Is the only place to go for a lucid discussion of this desktop environment somewhere in the bowels of the kde.org website? If that is the case then they aren't really getting effective user feedback--they are getting "fanboi feedback", which over time might prove just as effective as "rodent pets [whispering] great design ideas to the developers". What does it have to offer to people to make it worth migrating from GNOME, or Unity or Xfce or LXDE or Enlightenment? In the case of ALL those alternatives, if you talk to advocates they will lead off with the virtues of their chosen platforms (GNOME extensions are a really cool concept, Xfce and LXDE are lightweight and present a familiar interface, Enlightenment brings eye candy to lower powered devices and so on).

Well, sorry, flaming GNOME is about the least effective way to get converts. When GNOME 3 came out and a good chunk of the user base went looking, KDE was NOT where they went. By and large, they chose to try Xfce over KDE and even to fork GNOME 2 or re-spin GNOME 3 with extensions. That users looking for an alternative would "re-invent the wheel" over flocking to KDE speaks volumes about either the lack of awareness of KDE or the lack of appeal of KDE itself, doesn't it?

Do KDE fanbois really not remember when KDE 4 came out how dreadful it was? It is essentially the SAME discussion that GNOME 3 detractors are having right now, right down to Linus Torvalds himself totally slamming it and publicly abandoning it for another desktop! And guess when so many people stopped really considering KDE? So if KDE 4 really IS so much better now that it is up to 4.9, please elaborate, KDE fans. Stop telling me how much GNOME 3 sucks. I KNOW how much it "sucks" because I USE it (consequently I know how much it *rocks* in other ways). Tell me why on my older desktop I should stop using Xfce...is KDE really an option there? Xfce seems like a nice transition for "GNOME 2 refugees" like me. Is KDE 4.9 going to be disruptive? If KDE is "too different" from GNOME 2 then I might as well keep using GNOME 3, or try MATE or Cinnamon or use Xfce more often. What are KDE's merits these days?

The only real impression I get is that KDE is a "tweaker's desktop"...that it is very featureful and can be tailored to do many things. However this doesn't convince me because:

* GNOME 3's extensions capability seems to have all the potential to enable the tweaking a user needs (POTENTIAL...if/when the library of extensions grows it could be really good)
* I am not a desktop-tweaker. A desktop is to me an app launcher, consequently I don't miss many features. I kinda-sorta miss minimise buttons in GNOME 3 but I am actually surprised how rarely I used them (and there are extensions/settings to enable them anyways if you really need them). About the only things GNOME 3 still seriously needs are some tuning of the default user experience and an "expert mode" control panel for those who DO like to tweak a bit more, or for people to initially set up their environment in a new install. Overall I do not want to be aware of the desktop--I want it to stay out of my way.

So, is my opinion of KDE skewed? Is it (still?) geared towards tweakers? How is the "just out of the box" experience--how much personalisation does a typical KDE user feel they have to do? Has it gotten more efficient with resources since the one time I tried KDE 4 in the dark days of the earlier releases, when it consumed everything it could find?

In any case, in the spirit of this discussion on a new KDE release, I shall now provide my opinion of GNOME 3:

It is a stinky old piece of Limburger cheese that I despise...sorry, just got caught up in the moment I guess....I don't actually feel that way...

I actually think GNOME 3 is simply going through the same painful re-invention process KDE 4 went through. Moreover, I think they are actually moving faster to address shortcomings than the KDE 4 devs did. I do NOT see it as being excessively biased against desktop users and I find myself falling into some convenient habits like using the hot corners (and being frustrated when I go on another computer and the "activities" hot corner doesn't work). I also like some of the little extensions you can get to "fix things" (the best of which will hopefully make their way into the default setup...one can hope). On the "needs improvement" side there is a lack of "manual override" stuff. For example, if a printer cannot be autodetected, the add-printer facilities are broken in Debian (you need to call up the "classic" dialogue from the command line or else you get a cryptic error)--but I've only once had to add a printer manually; they have always autodetected except in that one case. System settings are not extensive enough without extensions either, and the multi-monitor operation needs to be refined a bit.

But it really isn't THAT bad! I think I won't give up as easily as I did on KDE...firstly because Linux OSes are about choice and we can't all just flock to the "least bad" choice until it is the only choice, and secondly because it is stable and fast enough to still be usable, whereas my first day with KDE 4 was my last because it was so slow and unstable on my hardware it wouldn't even function (the desktop getting in the way in a BIG way).

Comment Re:Wonderful Support... (Score 1) 627

You have ANY idea how many millions of dollars is made in sales each year in part by some VB+Access DB? Hell I've even built a few of 'em myself and last I heard they are all still running, doing what they are supposed to do. And that's just the home grown apps, do you have ANY idea how many small, say 5-10 man, software houses there are out there writing for Windows?

I have been involved in that market as well, with some of my old work out there having served for a dozen years, and it is indeed a part of the industry that escapes the attention of most observers. VB+Access is to a nematode worm as the Excel "database" is to an amoeba (in other words, it is just one step up from the lowest form of IT on the evolutionary scale). It is unsophisticated and lacks robustness and scalability, but it serves its purpose just well enough that its small business and departmental users are not motivated to change. Sad as it is, such applications are often "mission critical" to these smaller niche operations. People would be really surprised at how much certain segments of daily business rely on some of these crufty old systems.

Incidentally, right there alongside the VB+Access apps are the FoxPro apps, which are even older and crustier in some cases...

All she had to do was type the first two letters of what drug they were on and a drop down popped up that she could just tap and fill in the blank [...] Nope because i doubt seriously you find any software in Linux that is as highly specialized as nurses charting programs and even if you could you'd have to pay someone to transfer all that damned data and for what? What would they gain?

The gain, if Free or open source solutions are adopted, is the assurance that you own and control your data. This is not a sophisticated program, even if it is targeted to a very specific niche. It is probably a VB+Access variant too. Re-engineering this or transferring data out would be trivial. I have done such things with those prox-card building access systems' software, though not to port it but rather to integrate with it. The barriers are not the specialised nature of the software (because the software is actually quite, um, "basic"), it is the deliberate barriers that the vendors put in to lock in their customers. I've seen this many times--they password-lock their Access databases, obfuscate their database schemas and data, rename the .mdb file to a different extension to divert attention from what it really is, and so on. But even so, it is rarely insurmountable to unlock your data in some form or another, so long as there haven't been some contractual/legal lock-ins imposed by the integrator or vendor (the perils of adopting closed software aren't merely technical lock-in--they can be legal too).

It amazes me that so many in the Linux world complain of the "Windows tax" and act like 'free as in beer' is a selling point when honestly? For most the price of Windows isn't even in the top 5 of their expense report. If you look at Windows having a 10 year support cycle (which is now standard on ALL versions of Windows) that is $8 a year for Windows home (unless you buy the family pack, then its just $4) and $14 a year for Windows pro....THAT is supposed to be high? hell most of my customers, most of my family even, spend more on stupid crap in a week than Windows costs per year.

That has to be the most misleading, ignorant and crude TCO analysis that I've ever seen! "Retail license cost divided by number of years of vendor support" is NOT THE COST OF WINDOWS. It isn't even the MONETARY cost of Windows! What about the ongoing support costs? You need applications, you need anti-virus and security licensing and support (there are support costs to MSE even if it is free, even if it is just time), and there is the cost of hardware upgrades, because new versions of Windows have NEVER done well on 10-year-old hardware--even WinXP after 3 service packs and running modern applications is far more resource intensive than the initial release of XP and the software that ran on it at the time of its release. The contention that Windows costs less than $15/year is just as ludicrous as saying Linux costs nothing!
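
To show why that arithmetic is misleading, here is a toy back-of-the-envelope comparison--every number below is a made-up placeholder, not real pricing data:

# Toy illustration: "license price / support years" versus a fuller yearly picture.
# All figures are invented placeholders for the sake of the argument.
license_cost = 140                                   # hypothetical retail OS license
support_years = 10
naive_per_year = license_cost / support_years        # the "$14/year" style figure

yearly_extras = {
    "security suite / AV support": 40,
    "admin and support time": 60,
    "amortised mid-cycle hardware refresh": 50,
    "platform-tied application licenses": 80,
}
fuller_per_year = naive_per_year + sum(yearly_extras.values())
print(f"naive: ${naive_per_year:.0f}/yr    fuller picture: ${fuller_per_year:.0f}/yr")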

Linux is compelling in the server room because MSFT MAKES it compelling, by having insane EULAs and crazy license requirements like per user CALs. If MSFT wanted to wipe Linux out in the server room they could simply offer WinServer at $300 and no user CALs but they make so damned much money off of server its not worth picking up the low end sales to them.

If you really have worked in this industry space for a while then you'd KNOW that smaller and more specialised businesses with in-house developed or "boutique VB+Access" apps face a lot of the same kind of total crap! I've mentioned the pseudo-barriers put in to lock out competition. I've seen the insane license and support costs of those specialised applications and know that they are just asking for competition from more open alternatives. The thing is that the lock-in tactics are even more effective in this space than they are on the servers. That nurse with her tablet, or the clerk at the mall, or the self checkout you use at the supermarket, or the ATM at your bank, or the shift supervisor at the receiving dock...they all suffer from the lock-in they unwittingly chose to accept in the last 15 or 20 years. In NONE of those applications does Windows have anything to offer--these are often systems with NO tightly-coupled integration with enterprise systems, that go through great pains to completely hide the Windows desktop, and whose users have no need to run Photoshop or the latest 3-D first-person shooter or QuickBooks or tie into an Active Directory group policy system. These are "appliance apps", and nobody would know or care if it was Windows or Linux or OS/2 or even DOS underneath.

Well, at least nobody but those who supply or support those systems. And don't think it is the customer that has demanded a MSFT-based system--that is not the concern it once was because a) there is more awareness of Linux and other platforms, and b) the customer does less and less in-house support, relying on the vendor/integrator/developer for that. Niche business apps for departmental or SMB use are just that far behind the enterprise on the IT curve if you really look at it--behind and still moving slower and so losing ground. Linux started in 1991 and I worked on my first commercial/business use of it on servers in 1997; it was considered "radical" but it was quite a success, and it took a couple more years even after that for Linux to make serious inroads beyond web serving--so that is six years from the beginnings to "early" adoption by large business. Change can take upwards of a decade.

In small business and in specialised applications things are even further behind. These are the kind of outfits that have a LAN but maybe not even a proper file/print server, where Win XP is still on a good proportion of their machines--two versions behind and going on three. It was a big step to go from pen and paper and clipboards to spreadsheet software, and it was a big milestone to invest their hard-earned money in that special VB+Access app 10 years ago. It could take up to five years more before they make another big move. There IS opportunity for Linux and other Free software solutions out there. There is plenty of that 10-year-old-plus stale VB and Access stuff that is ripe for replacement, and that market is growing. There does have to be considerable effort to unseat incumbent MSFT-partner vendors out there who have those established customer relationships (and have probably graduated to using the whole .NET Visual Studio toolchain). But in some cases the small vendors/integrators are now defunct, or they are not all that tight with MSFT and may even be willing to offer more open solutions on different platforms.

the desktop is the exact opposite, they have economies of scale so large that they can sell their product cheap as hell and still make billions. While i actually like Linux in the web server and embedded roles there is simply no real selling point for Linux on the desktop.

There are HUGE selling points for Linux on the desktop, but you have to be looking at the right desktops! How about "counter tops"--as in point-of-sale systems? There is a small liquor store chain where I live that already uses a Linux-based solution--the store managers appreciate the lack of prevalence of viruses on the systems and that they are pretty low maintenance and DON'T run Windows software/games. What about desktops in factories, refineries and warehouses, where they run some kind of Java- or web-based front end to some ERP applications or specialised data gathering tools? Windows holds no advantage of familiarity because such desktops almost NEVER sit with the normal desktop environment exposed. These are the places where Windows and its ubiquitousness are actually more of a liability because of all the precautions against viruses and having to put in extra effort to lock machines down (prevent use of removable media, installation of apps and so forth).

There are selling points for Linux, but they must be matched with the right end users.

Comment Re:Have You Accounted for User Preference? (Score 1) 204

If your entire office is on LibreOffice, I can see it working well within the office, but once you start sharing documents with external partners, I'm really surprised you've had zero problems.

It seems that the requirement to share complex documents with external recipients in Microsoft Office format is rapidly declining in my experience. In my work I am more and more discouraged from sharing Word documents, for example--it is preferred that those be kept internal and that they be put into PDF format to be sent out. The same goes for spreadsheets in many cases too. When such documents ARE widely distributed they tend to be fairly crude or simplistic--Excel spreadsheets that are basically checklists or contain very simple macros and formulae.

I suppose it depends on the industry in which you work. If you have huge macros from hell, strange embedded and linked objects, and blinking, beeping, rotating animated logos festooning your office documents then you will have real problems. The problems have not been insurmountable for me personally. I suppose it is the 80-20 or 90-10 rule. The vast bulk of users won't have huge issues. Those that do tend to be "special cases". And in those special cases, if they are distributing such documents they are quite often NOT well received by external recipients--even if they are all-MSFT shops. If you have huge macros or special Excel add-ins, or really uncommon embedded or linked objects, or are using other arcane features, they can break even on other MSFT systems--whether it be some dependency not being installed or IT policies restricting what macros can do or whatever.

It is not too much to ask people sending you documents to send them as PDF, or to warn them that for security and interoperability reasons such nasty complex documents cannot be accepted or may not be fully accessible--at least for the vast majority of users, because really, if an Office file is so complex that it is unusable in LibreOffice you are inviting trouble even for users in other MSFT setups in terms of differing versions, patch levels, installed software and so forth. Complex and/or extremely large MS Office document files are a sign that the wrong tool is being used for the job and it is time to invest in a real solution (Excel is not a database, or a project management system, or an enterprise reporting engine--using it as such is akin to using a butter knife as a screwdriver or a stiletto heel as a hammer--it functions but not well at all and it will eventually break from the abuse).

Comment Re:Whatever happened to Perl 6? (Score 1) 192

This scares me. The fact that there are two compilers (which sows confusion) and none is feature complete

Why is that so scary or "confusing"? There are a lot of compilers for C and C++ out there, and they aren't all "feature complete" (MSFT's C++ compiler for example is not fully C99 compliant and they don't seem in much of a hurry to make it so). That doesn't seem to me to indicate that a language or platform should be discounted.

... and it looks like Parrot was dropped.

Can you give a citation showing this? A new release of Parrot came out just last week. It looks far from dropped--in fact it looks like a VERY active development community--they seem to put out a new release every month or so! Also, the latest Rakudo compiler for Perl 6 is only about a month old itself.

I won't argue the point that the dev timeline is somewhere between "Duke Nukem Forever" and "GNU HURD" in terms of release-to-market. The devs do seem to be pursuing development of Parrot and Perl 6 more as an academic or research exercise than as a push to put out production code. However, it is a complete misstatement to say Parrot has been dropped.

Comment Re:Perl renaissance? (Score 1) 192

are we in the middle of a Perl renaissance?

I hope not. I have to maintain a large body of Perl code at work, and it's a nightmare.

Funny that. I've had quite similar nightmares with Java and PHP, which are supposed to be all that. My history with Java has really REALLY put me off that language (and now with Oracle on the litigation warpath I've resolved to avoid Java wherever reasonably possible--I'd even wipe my Galaxy S2 of the Java-esque Android and put on Boot2Gecko if the latter were really ready for prime time). In the case of PHP I found it to be like a walk in the pasture--an easy hike to start, but then you step in a pile of crap and can't get that smell off of you. Perl, on the other hand, I learned in the mid-to-late nineties when I was tasked with authoring simple CGI scripts for an Apache web server on Linux, and I took to it fairly quickly and didn't get trapped in a nightmare. Perl certainly gives you opportunity to step in something smelly, but the simple requirements in my situation didn't really invite that risk.

Perhaps it isn't the language, it's the authors (don't take offense if the large body of Perl you must maintain was written by yourself). I think my dislike of Java comes from the fact that for most of its existence it hasn't been used to do anything "fun". Java went all "enterprisey", and enterprise applications are just big slow stinky piles of anti-patterns. Maybe if I had learned Java coding through programming games for Android phones instead of delving into the guts of some WebSphere application I would have thought differently, but Android and mobile apps just didn't exist beyond what was hard-coded into your dumb-phone. The nightmare isn't really Java itself, it's that it is most often used on "nightmare projects" driven by committees using stodgy old "waterfall" management methodologies, resulting in thoroughly unpleasant software (for some reason enterprise software has to be wretched to use--if it isn't as slow, cumbersome and confusing to use as SAP then you aren't ready for production yet).

I remember when PHP itself was a Perl library (how ironic is that, with so many people saying Perl is dead and you should move on to PHP or something like that?). I didn't give it a second look at the time because the PHP library didn't provide me with anything *I* needed at that point. Then somewhere along the line that was dropped in favour of a completely re-implemented "Perl-free" version that, well, looked and behaved like Perl's retarded little brother, and that didn't impress me much (but I DO understand that was probably why it caught on--less sophisticated, easier to adopt). Many years went by, PHP gained a lot of traction and got more complex, and now I was confronted with the need to work with frameworks and content-management systems written in PHP. Guess what? PHP is not used to develop anything "fun" either--it is just un-fun in a different way than Java. Instead of the dry, accountant-like programmer-analysts of the enterprise-Java world, you had freewheeling wunderkind "web developers" who churned out big balls of PHP goo floating in HTML tag soup. No anti-patterns here--no discernible software design patterns at all, actually. And it seems that PHP developers were the slowest-learning programmers in the entire world--even after VB6 devs were refactoring their "code" to implement parameterised ADO queries, it seems PHP devs happily continued to cobble together un-escaped strings and invite little Bobby Tables to own their arses. When I eventually delved into the guts of the likes of phpBB and WordPress and Drupal modules I felt a bit like PHP was in over its head--constantly being called upon to do just a bit more than it was designed to do, and continually catching up.
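
To make the Bobby Tables point concrete, here is a minimal sketch of string-concatenated versus parameterised queries, using Python's sqlite3 purely for brevity (the same placeholder idea applies to Perl's DBI or PHP's PDO):

# Un-escaped string concatenation versus a bound parameter.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "s3cret"), ("bob", "hunter2")])

user_input = "alice' OR '1'='1"      # a classic injection attempt

# Vulnerable: the input is pasted straight into the SQL text
rows_bad = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe: a bound parameter is treated as data, never as SQL
rows_good = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()

print(len(rows_bad), len(rows_good))  # 2 vs 0 -- the concatenated query matched everything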

I know that Perl is no better--I'm sure that there are critical enterprise systems out there supported by crufty, cryptic Perl that looks like line noise. I'm sure someone out there could cite SQL-Ledger as an example of "crufty ugly business Perl". Lord knows the early days of the web rested upon loads of crude, insecure, tainted Perl as the dot-com bubble was inflating, but back in the day Perl was "fun" and "cool"--people wrote Perl poetry, people had contests to make it do as much in one line of code as possible, and Perl is the language of choice for creating executable ASCII art! Back in the dark ages of the BLINK tag, web pages were uncomplicated and as such I was writing uncomplicated Perl code. Getting into Perl was "fun", not "work" like Java and PHP were in the contexts where I had to learn those languages. Further to that, Perl remained fun for the most part. Plus, CPAN was the world's first "app store"/software repository. Perl is more like "programmatic Lego" than any other language out there. NO other language has as big a library of modules as Perl does.

So perhaps the love or hate of Perl has more to do with the personal context of the programmer than with the characteristics of the language after all. Your first impression of Perl was probably formed because you drew the short straw at work and were forced to learn Perl in the context of maintaining old legacy ad-hoc systems written by some old grey-beard who has long since retired and/or passed away--and it is uncommented and unstructured and written in Perl 4 style. Of course you'd hate that! That's how I was formally introduced to Java and PHP, so I can understand!

Comment Re:Underestimation? (Score 1) 585

Sorry, but I believe the number should be going down.

You have to look at the BSA's surveys with a skeptic's eye. Their definition of "pirated software" is very fluid. The BSA in essence decides what the results are supposed to be and then manipulates its data to conform. They don't say exactly "we want the survey to show 57%" or "company X is short 15 MS Office licenses", but they DO certainly go into the process with the full intention of reporting a "growing number of pirates" or "company X is out of compliance". Survey results are then manipulated to that outcome. (Disclaimer: I do not have "smoking gun" evidence, but anecdotally, in terms of how they've conducted themselves with those I know who have been audited, there were pretty convincing indications that they had decided the outcome of the audit in advance.)

What the BSA also does not report clearly is that they DO consider this survey a sign of success in a certain sense. It is a survey of people "admitting" they pirated, not a report of audit results. This does not indicate growing piracy--it indicates growing AWARENESS of piracy. For example, with Windows there are retail, OEM and volume licenses of the OS, and you cannot LEGALLY format the hard drive of an OEM XP machine and use a "full OEM Win 7" disc and license (perhaps because it was obtained cheaper than the upgrade version) to upgrade, even if you can technically achieve this. Through "BSA education" more people know this is piracy even if you "legally own and paid for" the OEM version (the OEM version can legally only be used on a new PC).

So, it isn't that more pirating is going on, it is that more people are aware of (or being made aware of) the complexities of closed software licensing and are admitting to "pirating" by way of such practices as incorrectly purchasing and using OEM installations.

Soon enough, people will realize they don't need buggy software when they can get Linux and Open Source software that is better and mostly free

Again, this is a survey measuring what the respondents THINK is piracy (or are led to believe is piracy). There could be a lot of false reports. People are becoming more AWARE of piracy but are still trying to fully understand what it really is. Although, as I stated above, more people know that you more often than not do NOT "own" the software but rather "the right" to use it, and that those rights are limited in varying degrees (i.e. OEM vs volume vs retail vs upgrade Windows grants you different levels of rights), there is still a (mis)conception that "no monetary payment == piracy"--of all misconceptions, that is one the BSA is trying to preserve and perhaps even enhance.

Especially given the way the BSA frames its reports and debates, how likely is it that the "pointy haired bosses" out there responding to BSA surveys know that their IT guys swapped out the creaky old file and print server running Windows 2000 for a new box running CentOS? Since there is no record of payment, no activation certificate and no product key, perhaps they were convinced by the BSA survey that it is a "pirated copy" of RHEL. The thought is that there must be concrete evidence of licensing, like a receipt or activation certificate or whatnot, that can be presented as proof in an audit, so just downloading and burning ISOs or installing from a repository on the internet looks like piracy even when it isn't.

Even if they know their Linux systems are in compliance, it is known that some businesses make sure to have records of a Windows license for them "just in case". Some schools and libraries in the past have had their Linux machines called "out of compliance" merely because the BSA auditor believed they were converted to Linux AFTER the effective audit date (the date on the letter informing the target of the audit--any licenses purchased or software installs altered after that date to conform with licensing are not considered sufficient to pass an audit). In most cases the Linux migration took place before the audit, but the cost to appeal BSA audits is astronomical--out of reach for a school or library or small business. Settling with the BSA or just buying licenses that go unused is just the "IT tax" cost of doing business for these people.

As such I do not see reported piracy going to zero for a long time--not as long as closed software continues to dominate in IT. If you are "all Linux" you won't get much grief from the BSA, because the potential revenue from damages is limited for them. However, in heterogeneous environments with high potential infringement, good recordkeeping is going to be key for those who are apt to be targeted by the BSA. Even if your software is Free, it might be wise in a mixed environment to keep some solid evidence indicating when it was put into use. I don't know how far you would want to go, but perhaps the tin-foil-hatted amongst us could send a DVD in a sealed envelope to themselves via registered mail on a regular basis.

All in all though, almost all closed licensing schemes and their enforcement mechanisms are bullsh!t and the typical small and medium business targeted by the BSA for audits would be best advised to make maximal use of Free software to keep the cost of license management and compliance to a minimum.

Comment The problem with stereotypes... (Score 2) 474

...is that they are even harder to kill than cockroaches. When the big nuke goes off in the sky and wipes out humanity, all that will be left are cockroaches, and they will be using Windows because they think Linux is "only for comp-sci majors".

Linux is a great idea and has many powerful tools, but for everyone who's not a comp-sci major, the OS is just supposed to launch the programs you want, and preferably do it fast.

Using the "powerful tools" in Linux is not a requirement. My parents use a web browser, and email client and Libre Office 90 percent of the time. Ten percent of the time they play solitaire. and copy the pictures off their digital camera because they've filled their SD card with pictures of the grandchildren and great-grandchildren. They can do that on a Linux OS without "powerful tools" just as well as they can on Windows, maybe better.

It was those pictures that made my dad in particular interested in migrating away from Windows. They caught a virus that rendered their old Windows system unbootable, and they had hundreds of pictures on there that had not yet been printed or backed up to CDs or DVDs, and were quite upset that they may have lost them all. I used the Trinity Rescue Kit bootable Linux CD to recover their files, then reformatted their drive and installed Ubuntu. They still keep their Linux machine because "Windows is easy but I don't trust it with important files anymore", and they have also found F-Spot and Shotwell to be faster and easier to use than the crapware supplied with their digital camera for Windows.

Those same people could have avoided all that junk installed on their pc if they'd just bought a computer assembled by an enthusiast company or a local computer shop in the first place. Those low prices at Best Buy or many online retailers are subsidized by all the crap they pre-load the systems with. Complaining about the crapware on an HP is like complaining about the ads on a "Kindle with special offers".

OK, first you suggest that Linux is for "comp-sci majors", then you suggest the solution is to buy a PC from a system builder? I do agree with you, and it is why I bought a "no name" Clevo notebook online from a build-to-order vendor (finding a notebook with interchangeable discrete graphics cards and CPUs that is not pre-loaded with crapware-laden Windows lacking proper re-install media is impossible at a big box store). However, the kind of people who could not adapt to a change of OS from Windows to Mac or Linux wouldn't have the first idea of where else to go. It seems that local system-build shops are trending in the direction of video rental stores--sure, some may be around forever, but they are the domain of the computer enthusiast, and that is a very narrow market. Also, the general public has a certain "comfort level" with the big chain stores--they know what they are getting (even if they don't always like it, at least they have expectations). It is probably just as easy for average users to have someone format and reinstall an OS (Windows again, OR a Linux OS or whatever) as to spend time seeking out a local computer shop and worrying whether they are trustworthy.

In my area we are lucky--there is a regional chain called Memory Express that builds their own line of "Velocity" desktops and servers, both pre-configured and build-to-order. That is the closest you can come in my area to a "local system builder" that can offer you crapware-free computers. However, they do NOT offer build-to-order notebooks--you can choose from Lenovo, Acer, ASUS, MSI and so forth--the one consolation is that their brand selection is very diverse, so you can find one that is relatively crapware-free. However, you are still spending extra time shopping, or extra money booking with their service dept. to give it the MemEx version of the "signature treatment".

Comment Re:Not making money = wasting money (Score 1) 141

I think that the goofing off discussed in the article is not of the browsing-Facebook-and-playing-FreeCell variety. It is the kind of thing that we would otherwise call "research and development" if it were conducted by someone with a PhD.

Perhaps it is a concept better sold as "Integration of R&D into business operations" rather than "goofing off" or "skunkworks".
