
Comment Re:Assumptions are the mother of all ... (Score 1) 172

Unfortunately, I'm in the UK, where the selection is much more limited.

For example, Dell UK's web site lists exactly one laptop with a 17+" screen and SSD, and it is also a touchscreen and comes with Windows 8.1.

HP do at least promote the Windows 7 option (via Win8 downgrade rights) for the high-end ZBook laptops on their site. However, the pricing on those tends to make the closest equivalent Retina MBPs in specification look cheap.

Also, Microsoft UK don't seem to have any high-end devices at all within their Signature Edition range, so it's invasive crapware city all the way with a lot of the big name brands, even on their expensive, high-end models.

Comment Re:Assumptions are the mother of all ... (Score 2) 172

But the screenshots I've seen of Windows 10 still mostly look flat and/or garish, and it seems more a case of trying not to make the visuals much worse than what is already available via Windows 7 than of actually trying to be better. Another example is the icons, which have gone from being widely ridiculed to being... well, slightly less widely ridiculed... in all of the reviews I've come across, and with considerable justification if the examples I've seen myself were representative.

It's not just the visuals that put me off, though. It's also the fact that I use a traditional desktop PC with multiple large monitors, and I want an OS and software that work well in that kind of environment. I saw a review the other day of the new preview release where literally every screenshot that had substantial content in it also included the word "tap" somewhere, with obvious concessions to touchscreens that just don't make sense for a desktop workstation. This was one of the big problems with Windows 8, and it seems like with the Surface tablet hardware and Windows 10, Microsoft are doubling down on touchscreens. #donotwant #haverealworktodo

I'll wait to see what people say when Windows 10 actually ships and we're not just talking about preview releases and educated guesswork, but so far the signs don't seem promising. Windows 7, on the other hand, is tried and tested and works just fine on the numerous computers I use it with today, so as I said, if I could buy an approximate equivalent with newer and more powerful hardware right now, I'd be right in there. Sadly, I'm in the UK, and what you can pick up over here is quite limited compared to what you can get in, say, the US.

Comment Re:Assumptions are the mother of all ... (Score 2, Insightful) 172

If I could find a good high-end laptop that came with vanilla Windows 7 instead of 8 and all the pre-installed extra junk, I would be throwing money at the supplier and begging them to sell me one. That has far more to do with avoiding more recent versions of Windows and their kindergarten, touch-obsessed UIs than it does with wanting a cheap upgrade when 10 ships.

Comment Re:essential to know about jQuery (Score 1) 126

Given the fact that this is a third-party library that you are unlikely to modify, hosting it on your own servers provides no advantage whatsoever.

Of course it does. It has the same advantages in terms of security and your visitors' privacy as any decision to host your own material instead of quietly using a third party service. Whether you consider those significant advantages is a different question, and whether your visitors would is a different question again, but clearly there is a difference.
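As one concrete way to get those self-hosting advantages in practice, you can vendor the library onto your own server and pin a checksum at the time you fetch it, so any later tampering is caught at build or deploy time. The helper below is purely illustrative, a sketch of that workflow rather than any particular project's tooling:

```shell
# Sketch: vendoring a library and pinning its hash, rather than
# hot-linking a third-party host, so visitors only ever talk to your
# own server. The function is hypothetical; the expected hash would be
# whatever you recorded when you vendored the file.
verify_vendored() {
    # Succeed only if the vendored file still matches the recorded hash.
    file=$1; expected=$2
    actual=$(sha256sum "$file" | awk '{print $1}')
    [ "$actual" = "$expected" ]
}
```

Run as part of a deploy script, a failure here means the vendored copy has changed since you audited it.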

Comment Re:essential to know about jQuery (Score 1) 126

It's very likely that people would already have the CDN version in their browser cache since a lot of websites use that link.

This is a popular claim, but what little real data I've seen says quite the opposite. There are so many different minor versions of a library like jQuery that the chance of any given visitor to your site actually having visited another site using the exact same version from the exact same CDN within the cache window turns out to be pretty low.

There are still reasonable performance-related arguments in favour of hosting static content on a CDN, and for splitting resources across domains unless you're in SPDY or HTTP/2 territory, but those aren't quite the same issue and you can avoid them without resorting to loading libraries from third party hosts you don't control.

Comment Re:Backing up user data on Linux (Score 1) 517

For a server it's different because each service has its own location for config and data, but if your job is to set up and manage the server then you should know what it's running and where those services keep their data.

That's a great theory, but in the real world numerous people rely on servers that don't have a dedicated admin, so these things do matter and "You should know everything about everything" isn't a terribly useful philosophy (leaving aside the often incomplete nature of documentation in FOSS world, which can make it hard for even a competent and generally knowledgeable admin to actually know everything they need to here).

In this context, I'd take backing up user data and reinstalling Windows and its applications over backing up user data and reinstalling Linux and its applications any day of the week.
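For the common case where someone without a dedicated admin just needs to get user data and config off a box before a reinstall, a minimal sketch might look like the following. The paths and the Debian-style package list are assumptions, and any per-service data such as databases would still need its own application-specific dumps on top of this:

```shell
#!/bin/sh
# Sketch: minimal "user data + config" backup ahead of a reinstall.
# Assumes a Debian-style system; database/service data still needs
# per-application dumps on top of this.
set -eu

backup_dir() {
    # Archive one directory tree into DEST as NAME.tar.gz
    src=$1; dest=$2; name=$3
    mkdir -p "$dest"
    tar -czf "$dest/$name.tar.gz" -C "$src" .
}

# Typical invocation (as root), with illustrative paths:
#   dpkg --get-selections > /mnt/backup/packages.list  # what to reinstall
#   backup_dir /home /mnt/backup home                  # user data
#   backup_dir /etc  /mnt/backup etc                   # system config
```

The point of the package list is that restoring "which applications were installed" is exactly the part that's easy on a system where everything came through the package manager, and hard otherwise.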

Comment Re:Security team (Score 1) 517

So they should never run scans because every time your computer is on you are using it?

The kind of entire system scan that slows everything down for an extended period? No, probably not. Those scans are mostly worthless from a security point of view, and have a high impact on the overall efficiency of the system.

They should never patch and just let well known vulnerabilities run amok because you don't want to be inconvenienced, either by having to leave your machine on or wait while patching happens?

Of course not. But we aren't talking about rolling out the approved updates across the organisation after Patch Tuesday or whatever we're calling it this month. We're talking about regular scanning that routinely interferes with normal use of the system.

You left them no choice by giving them no time that wasn't work time.

There are plenty of other choices, starting with having sensible security practices that don't routinely undermine systems at all, and closely followed by having a standard procedure for applying security updates in a timely fashion that allows for things like people being out of contact for extended periods and provides for notifying them of any urgent threats while they are away and then getting them fully caught up when they return.

If the process of installing updates and perhaps a reboot on a Windows box is itself taking so long that it can't be done in the background while someone is making a coffee, again you probably have bigger problems to deal with and need to consider whether the spec of your systems is good enough for what you need to do with them. But in the real world, this is almost never a problem in practice if you have a remotely sensible set-up.

Comment Re:Security team (Score 1) 517

That's how you see it, not how IT, nor Management, nor lots of other orgs see it.

Frankly, I think it's how responsible IT and smart Management see it as well, and I don't know what "other orgs" you mean so there's little to say there.

IT is a support function. The purpose of support functions is to support the primary functions of your business. Any time your support functions start undermining the primary functions, that should be robustly justified, or the people who want to do it should be told "no". It's really as simple as that.

As for your example scenario, that's the kind of foolishness that costs real businesses money all over the place. I bought some quite expensive household goods a little while ago, and as it happened we were just finishing up the paperwork at 8pm as the showroom "closed". The sales guy was incredibly apologetic about how he couldn't print the last form we had to sign -- which was the important one that guaranteed us the goods and them the sale -- because their central management system had gone off-line for something-or-other, and despite it being 8:01pm with a high-value customer waiting to complete the sale, he couldn't.

As a direct result of the poor policy imposed on the local store by some genius in central IT, they were at risk of losing one of only a few final sales they would have made that entire day; in fact, if it had been one day later, they would have done, because we would have been on holiday and so not able to return the following day to finish everything off as we actually did. That is what management technically refers to as a "total screw up".

Actually, their IT systems generally were a disaster. On our first visit, they had multiple people looking around at one point. However, it took so long to put a provisional order into their prehistoric computer system to get a proper quote (seriously, like an hour for what should have been maybe 5 minutes) that people were literally walking out after waiting half an hour to see the sales guy, who was tied up with the other customer.

I can easily imagine, based on just those experiences, that dumping seven figures into building a modern IT system that could handle customer orders properly would increase their revenues by 25-50% indefinitely. It obviously wasn't a new or unique problem, as the sales guys on both occasions seemed genuinely apologetic but also had a well-rehearsed patter about how it happens sometimes and no-one ever fixes it.

Comment Re: Security team (Score 1) 517

To be fair, if you're dealing with the level of malware that can cover its tracks against that kind of investigation, and if that malware is already on your system but wasn't picked up on a previous scan, the game is already over anyway and you're well into complete reinstall and restore from back-ups territory. These days, with threats that can hide in other areas of the hardware/firmware to survive the wipe and reinstall process, I'd be wary of trusting even that in any highly security-sensitive environment.

Comment Re:Security team (Score 1) 517

I'm freelance these days, so I'm afraid I can't help. Sorry. :-)

One of my regular clients operates in this field, and seeing things done in a reasonable way reminds me of why I used to get so irritated when I did work as part of a large, bureaucratic institution. It's not magic. It's just being aware of modern tools and practices, and being willing to make the effort (and yes, sometimes, being willing to spend the money) to set up something that provides a useful degree of security but without making things so secure that you forget why you're there in the first place.

Given the potential costs of getting security wrong, I don't really understand why any organisation large enough to be facing these issues regularly wouldn't hire people who know what they're doing and provide a reasonable budget for them to deploy proper tools. I can only assume it's the usual suspects, probably some combination of ignorance and corporate politics.

Full disclosure: Obviously I make money from working for that client and they make money in part from selling some of those tools, so I'm kinda sorta shilling here. But not really, because really, the cost of hiring smart people and giving them proper equipment vs. the cost of say a major regulatory investigation or having your whole sales team at the pub all day because they can't work... not exactly close.

Comment Re:Security team (Score 3) 517

They shouldn't be doing their work at home - which is what the GP said.

Oh, OK then. It's not like full- or even part-time telecommuting is one of the most advantageous perks offered by many modern workplaces in terms of productivity or staff morale, so I don't suppose the business will suffer too much. Should I also recall our entire sales force and tell them they can't work on customer sites any more?

In other news, please be aware that due to a change in company IT policy, next time you get paged at 4am because of a network alert, remote access will not be permitted for security reasons. Instead, you will be required to get up, spend 20 minutes driving to the office, log in from a properly authorised and physically connected terminal, type the same one CLI command you do every time that alert goes off to confirm that it's still just the sensor that is on the blink, type the same second CLI command you do every time to shut off the alarm, spend 20 minutes driving home again, and then go back to bed. Sleep tight.

Comment Re:Backing up user data on Linux (Score 1) 517

The only part I've found complex is finding out where and how various apps actually store their data, particularly when I don't really have much interest in the app.

In a sense, yes, the most important problem is that simple, but as you then demonstrated with things like the database example, "simple" and simple aren't always the same thing.

The other point I wanted to make is that your example presupposes that all of the packages you need are installed using your distro's package manager. In my experience that is rarely the case, and while there are tools like checkinstall that can help, the lack of any enforced installation conventions or protections against unexpected interactions in mainstream Linux distros means you are always vulnerable to certain nasty problems. Anyone's make install can probably nuke the output from anyone else's. Someone running a make uninstall that removes something that some other project assumed would be present can break the other project. Even if you stick to distro-only packages, there is not always a guarantee of backward compatibility when moving to a new version of the distro.

To me, the fundamental problem here is that for the most part I want an OS foundation that is stable and robust, and other than security fixes I probably never want it to change for the lifetime of the system. On the other hand, I want to be able to install drivers for new hardware or protocols and of course new application software on top of that OS, and I want them to have a stable platform to run against and to be as independent as possible so swapping out one part of the system doesn't undermine any other parts. The current Linux ecosystem with its distro model does not promote that kind of separation and safety, unfortunately.
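The clobbering risk above comes down to nothing tracking which files each make install claims. A package manager keeps a per-package manifest of owned files and refuses overlapping installs; as a purely illustrative sketch of that missing check, with hypothetical manifest files listing each package's installed paths:

```shell
# Sketch: the ownership check that plain "make install" lacks. A
# package manager records a file manifest per package and rejects
# overlaps; the manifests and helper here are purely illustrative.
conflicts() {
    # Print any paths claimed by both manifests; fail if there are any.
    overlap=$(sort "$1" "$2" | uniq -d)
    [ -z "$overlap" ] || { printf '%s\n' "$overlap"; return 1; }
}
```

With no such check anywhere in the toolchain, two source builds that both install `/usr/local/bin/tool` silently overwrite each other, which is exactly the "nuke the output" problem.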

Comment Re:Security team (Score 2) 517

Until some drone with mapped server drives gets cryptolocker and gets everyone's files encrypted

If you have a network that is wide open to "drones with mapped server drives getting cryptolocker" and causing the entire organisation to lose a day of work, the kind of scheduled scans mentioned above probably aren't going to protect you anyway.

To defeat a threat like cryptolocker you need real-time measures to prevent it operating in the first place: proper scans on incoming mail and web downloads, internal firewalls, and so on. To limit the scope of the damage if cryptolocker manages to get in somehow anyway you need least privilege access controls on your internal systems. And to restore anything it does manage to get hold of, the most important thing is to have frequent back-ups with fast recovery procedures. Scheduling a system-wide full scan so your staff can't use their laptops for 15 minutes at 10am every day is not going to give you any of those protections.

Obviously there is always a risk of some disruption if IT are responding to an ongoing incident or recovering afterwards, but if you're routinely causing significant disruption to your entire staff then there are probably better ways to achieve the results you want.
