Car drivers will often merge into the bicycle lane to make a right turn. Some bicycle lanes are dashed to show that this is acceptable, because it's technically the legal way things are supposed to be done. For the bicyclist, however, this means that you're constantly at risk of being slammed into by drivers who are doing exactly what they're supposed to do, and the only thing you can do about it is refuse to follow the rules. You don't always hear the car coming up behind you, especially when the streets are busy and traffic is getting hectic.
I've found that the safest way to ride is almost always by doing the OPPOSITE of what you're legally obligated to do. Ride against traffic in the bike lane, making it so you are face to face with the people who are most at risk of hitting you. Make eye contact with drivers as you approach. This also helps with people who are parked and opening their doors. People are more likely to recognize you are there because they have a face coming at them, which the human brain is very adept at spotting. I hang back at intersections and wait until it's actually safe to proceed because I can't trust drivers to accept that a bicyclist has any right-of-way. Too many near misses when doing things the "right" way. The rate of close calls that I've experienced has been drastically reduced since I stopped trying to "share the road".
I get the feeling from the comment I replied to that you expect systemd to do something to solve driver issues. That's a great example of why I'm not fond of systemd: the people who support it apparently think it's going to fix all these issues that it shouldn't actually have anything to do with. You're not going to fix drivers by rewriting the userland init system.
The flaw in this statement is that systemd replaces alternatives in a way that breaks everything if you try to use one. Alternatives that existed long before systemd came about are no longer an option; it removes them. Every major piece of Linux software that decides to become dependent on systemd removes the ability to consider alternatives.
What I see is projects like GNOME, with a growing dependence on systemd, becoming unsuitable because they no longer support anything else. I see this idea that other systems should be expected to conform to systemd's architecture if they want to continue to benefit from that software.
I personally prefer to use cross-platform software. I prefer software that runs about the same regardless of the platform I'm using, and I prefer having the option to use any supported platform to run it. Now I'm afraid that software I've come to rely on is going to take that possibility away. I'm afraid I won't be able to use my preferred cross-platform applications on OS X and Windows in the future because they gained some strange dependence on systemd.
If we reach a point where a full-featured Linux desktop cannot be run without systemd, the entire idea of working on alternatives becomes moot.
VMware must have added virtual machine creation to VMware Player because VirtualBox was taking a bite out of their pie. But I haven't seen any reason to switch back, especially given VMware Player's more limited host operating system support.
I'm sitting here in awe that they turned VMware Player into a paid option. Wow. They really did not have a clue how to respond to VirtualBox.
Perhaps not right this moment, but I think consumers will want it soon. You can do reasonable stereo viewing on almost any smartphone with a Google Cardboard-type setup. A few phones have their own manufactured enclosures, like the Galaxy Note 4 with Gear VR or the iPhone 4 with the Hasbro My3D. The content for them is a little sparse, but the whole concept is still in its infancy. Lots of us grew up with the View-Master, and I don't think it will be long before we start enjoying a true advancement of that age-old technology. The Wild West had stereoscope viewers using the same optical trick; now we have programmable screens mounted in boxes that strap to your head.
Pentium 4 at 4.0 GHz
I see some old news articles about it being canceled, but I remember the hype for it.
That's a single core, 4GHz Pentium 4.
Did they give it a proper refurb before running that test, or are they using the original thermal grease and a clogged fan and heatsink?
Because I highly doubt that test machine (the ONE sample of that CPU) was actually in WORKING condition, given that the 3.8GHz model doubles its score.
That's not even getting into the newer multimedia instruction sets that the i7s have, which newer Passmark versions probably bang on.
Plus, we saw this happen. The Core line of CPUs was ridiculous in comparison to the previous Pentium 4. Not 'Core i7', but 'Intel Core Solo' and 'Intel Core Duo'.
They made the Pentium 4 look like a serious lemon.
What are those guys smoking? The systemd guys. The GNOME guys.
If I'm a GNOME on Linux user, I'm essentially being forced to migrate to a new operating system to keep using GNOME.
systemd doesn't support libcs other than glibc, and it requires GCC extensions to compile.
Linux is a fine OS kernel supported by a wide variety of userland options, and yet it looks like we're trying to homogenize everything to a degree that is frankly the opposite of what makes Linux itself an amazing piece of work. Things are becoming inextricably linked to components they previously may have INTERACTED with but did not RELY on. The alternatives are being pushed aside by the sheer depth and breadth with which these newer projects are gobbling up system responsibility.
These issues lie at such a basic level that they're poisoning the entire ecosystem.
I'm curious about how a computer is supposed to be intuitive.
Let's take a journey into the past for a moment and look at historical computing machines, what they were used for, and why they were built. I'll take as an example the artillery computer on a warship from the great war that brought the technology upon us. Differential analyzers were mechanical devices that performed calculations. When it came to the use of these devices in shipboard artillery, the interface was simplified to assorted knobs and gauges where the operator dialed in the appropriate parameters to get the necessary result. However, the action being performed, the mathematical calculation of a trajectory, was in no way simpler or more intuitive; it was buried under an appropriate interface that hid the details necessary for performing the task at hand. This had the result of making it so an artilleryman no longer needed to concern himself with a deeper understanding of the task he was performing.
I fear we've run head-long into this case where we expect our tools to do the work for us rather than allowing us to work more efficiently.
I'm seeing 'simplified' interfaces slapped onto complex machines that end up overlooking the details. I'm seeing this idea that the tool needs to do the job, that the user need not understand how the job is done. That is not a good thing.
Computers don't have the potential to change the world, they already have. Unfortunately, as a direct result of how deeply they've changed the world, we no longer feel it necessary to actually learn what we're doing.
We just want the computer to do it for us.