That's good advice.
But there are plenty of things on most networks that aren't critical servers or devices you have the luxury of controlling and planning for.
If you regard security patches as essential only for those things, you're doing defense in depth wrong.
Heartbleed affected clients too, and plenty of things that aren't internet-facing services.
Or in the case of Microsoft, discontinue support for the still widely-used Windows XP. Find a vulnerability in that? Too damn bad. It'll never get fixed.
Like when Ubuntu Server 13.04 didn't get a fix for Heartbleed because support had already been discontinued after only nine months, despite the criticality of the bug and the servers seeing considerable use? All the official replies were "it's your own fault" and "change distro version immediately", which you often can't do quickly. Hardly anyone expected 12.04, 12.10, 13.10 and 14.04 to get the fix while 13.04, right in the middle, was left out - except people who read the really, really fine print and took it seriously. Shipping the security fix would have been trivial and would have saved a lot of people a lot of work; they just refused on principle.
It was probably the first time many users found out that Canonical had shortened the support period for non-LTS releases (12.10 predates that change, which is why it still got the fix).
Being salaried but working so many hours that you effectively make less than minimum wage is exploitation, pure and simple.
I thought it was called "graduate school"...
Newer versions of GNOME (3.8 and after?) rely on a DBus API of systemd's logind component, for reasons I've never seen adequately explained.
The talk of forcing all cgroup interactions to go through systemd would in effect make anything that interacts with cgroups or cpusets, such as hwloc, TORQUE, and SLURM, depend on systemd. I can't imagine that the developers of hwloc, TORQUE, and SLURM are especially happy about that.
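(To make the logind dependency concrete for anyone who hasn't run into it: a minimal sketch using dbus-python against logind's documented org.freedesktop.login1.Manager interface. ListSessions is just a convenient call to demonstrate with, not necessarily the one GNOME itself relies on.)

    # Sketch: talk to systemd-logind over DBus (assumes dbus-python
    # is installed and systemd-logind is running).
    import dbus

    bus = dbus.SystemBus()
    logind = bus.get_object('org.freedesktop.login1',
                            '/org/freedesktop/login1')
    manager = dbus.Interface(logind, 'org.freedesktop.login1.Manager')

    # ListSessions() returns (session_id, uid, user, seat, object_path) tuples.
    for session_id, uid, user, seat, path in manager.ListSessions():
        print(session_id, user, seat)

If nothing on the system answers on org.freedesktop.login1, anything written against that interface simply stops working - which is exactly the dependency people object to.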
That is not possible, but I think that a quick reboot once a month isn't too much to ask.
Well, that makes one of us.
...when will this result in a 100W Marshall head on a chip?
(Why yes, I am a guitar player! Thanks for asking.)
...you don't get to call yourself a "software engineer" or talk about others' software engineering practices.
Thanks! But too late. That machine died this time last year, after 6 years of excellent service. I moved on to new hardware.
Hopefully the xorg.conf is useful to someone else.
I've just looked up what people are saying about DebugWait, and I see the font corruption mentioned - that's just one of the types of corruption I saw!
But perhaps that was the only kind left by the time my laptop died.
Just a note to others: according to reports, DebugWait doesn't fix the font corruption for everyone. But it's reported as fixed by the time of the kernel shipped in Ubuntu 13.04, according to https://bugs.launchpad.net/ubu...
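(For anyone who wants to try it: my understanding is that DebugWait is an xf86-video-intel driver option, set in the Device section of xorg.conf, roughly like this - the Identifier is just a placeholder for whatever your existing config uses.)

    Section "Device"
        # Placeholder identifier; keep whatever your existing config uses.
        Identifier "Intel Graphics"
        Driver     "intel"
        # Reported workaround for the rendering/font corruption.
        Option     "DebugWait" "true"
    EndSection

You need to restart X for it to take effect, and as noted above it doesn't seem to help everyone.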
I stand by my view that Intel GPU support never quite reached "excellent" because of various long-term glitches, although I'd give it a "pretty good" and still recommend Intel GPUs (as long as you don't get the PowerVR ones - that surprise was very annoying and wrecked a job I was on). Judging by the immense number of kernel patches consistently over the years, it has received a lot of support and in most ways has worked well.
Getting slightly back on topic with nVidia: another laptop I've used has an nVidia GPU, and that's been much, much worse under Ubuntu throughout its life than the laptop with the Intel GPU. Some people say nVidia works well for them on Linux, but not on this laptop. I've tried all the available drivers: Nouveau, nVidia, nVidia's newer versions, etc. Nothing works well. Unity3d chugs along at about 2-3 frames per second whenever it animates anything, which is barely usable; the GPU gets very hot doing the slightest things; and visiting any WebGL page in Firefox instantly crashes X with a segmentation fault from a bug somewhere in OpenGL, requiring a power cycle to recover properly. So I'd still rate nVidia poorer than Intel in my personal experience of Linux on laptops.
Now? Intel GPU support has been excellent under Linux even back when the crusty GMA chips were all we had.
Except for the bugs. I used Linux, including tracking the latest kernels, for over 6 years on my last laptop, which had an Intel 915GM.
Every kernel version during that time produced occasional display glitches of one sort or another, such as a line or spray of random pixels every few weeks. Rare, but not bug-free.
And that's just using a terminal window. It couldn't even blit or render text with 100% reliability...
I investigated one of those bugs and it was a genuine bug in the kernel's tracking of cache flushes and command queuing.
In the process I found more bugs than I cared to count in the modesetting code.
Considering the number of people working on the Intel drivers and the time span (6 years), that was really surprising, but that's how it was.
No, it only means the normal people have invaded our territory. GET OFF MY (unix-y) LAWN..
Did you know that common kitchen knives can also be used by Billy Joe Bob to blind someone or, worse, kill them? Just wait until you manage to tick Billy Joe Bob off. This cannot end well.
Clearly, kitchen knives should not be made, either.
Look, if people are going to attack, maim, or murder someone, they've got plenty of options already. Adding one to the potential arsenal, especially one that would take significant technical know-how to turn into an actual weapon, isn't really going to change things.
Back when you had to turn such things off during climb-out and descent, that was also a problem.
You can turn off screaming children?
One can't proceed from the informal to the formal by formal means.