Now don't get me wrong, I really really appreciate all the hard work the GNU project and the FSF have contributed, but to imply that _only_ the GNU project and the Linux kernel deserve credit seems quite unfair to me.
If we call it GNU/Linux because GNU deserves credit, then we should name it GNU/Apache/Xorg/KDE/SystemD/Samba/LibreOffice/Mozilla/Linux, because all those other projects are just as critical to creating the modern, functional operating system we have today.
Or we could grow up and just call it Linux, because it's just a name after all.
My theory is that RMS and all his buddies over at the GNU project are still butthurt about Linux stealing the thunder from GNU Hurd (25 years after the fact!). If they really want their GNU OS, then they should just finish Hurd already and build their GNU package.
It's amazing how childish RMS can be sometimes. Look at how he reacted to Clang and LLVM becoming technically superior to GCC: he wrote a whiny blog post admitting that it hurts on a personal level, then in the same paragraph attacked Clang for not being open source enough because it is BSD licensed instead of GPL! Honestly, I think deep down RMS would have preferred that Apple kept Clang closed source, even though he would never say that publicly. Apple gives us something for free that they totally didn't have to give us, and apparently we should bite the hand that feeds us because they licensed it in a way that lets them keep using it in Xcode.
There are a lot of things I really like about the open source movement, but the self-righteous crap and the cliquey project leaders definitely leave a bad taste in my mouth.
The root cause of this problem and in fact most of the problems I've seen in the work environment is that most managers seem to be completely incapable of understanding the concept of a happy medium.
Honestly, I'm not a fan of the 5x10 cube. The walls and the monitor are so close that there is absolutely no way to take a vision break and focus on something 20 feet away without getting up for a coffee, so you don't do it as much as you should. I can't think of many things that could possibly be worse for your eyes. And often those walls are 20+ years old, have never been cleaned, are super dusty, and smell like B.O. At the same time, yeah, I don't like being on display with everyone constantly looking over my shoulder either.
Right now my team has our aisle set up with half-height walls facing the walkway and doors, and the desks arranged so that we look out into the aisle. The back wall is full height. That way people walking down the aisle see the backs of our monitors, not the fronts. But at the same time, I can look down the entire length of the aisle, which not only lets me see my co-workers, it gives me distant objects to focus on every now and then without even having to stand up. This is by far the best of both worlds... a happy medium, go figure!
Problem is, everything has to be black and white all the time at the office. Nothing can ever be in between with management types, because that is harder to brag about on your annual review. Either we are centralizing and standardizing on one solution for the whole company (even when that solution is a really bad choice for some things), or we are innovating and being disruptive and not working together at all, duplicating a bunch of work along the way. My suggestion that we work together when it makes sense, but don't force it, always seems to fall on deaf ears.
So Microsoft is laying off 18,000 employees in several waves that started in July this year. One of the first groups hit hard by the layoffs was QA (mostly contract workers, so they are easy to let go). Within that, the QA department responsible for testing OS security patches was hit the hardest...
So now we are having a bunch of problems with botched updates that weren't tested sufficiently, go figure!
C is great for low level stuff since it is capable of generating machine code with zero dependencies. The C standard even explicitly defines a "freestanding" (as opposed to hosted) environment, with no libc and implementation-defined entry point semantics. In fact, it is the only language in mainstream use today that has this feature (aside from assembly).
For kernels, drivers, firmware, embedded, and RTOS work it's pretty much the only choice unless you want to write everything in straight assembly. Since my current job is firmware programming, I've actually come to like the C language now that I've been forced to do a lot of heavy coding in it instead of Python.
Don't invest in an artificially scarce commodity.
You mean Bitcoin?
I know I'm going to be modded down, but it had to be said.
How, exactly, can Nvidia make games run poorly on other hardware? They don't write the games. Both AMD and Nvidia have extensive outreach programs for developers and make engineers available to game studios.
That is true, but nVidia's outreach engineers have a history of checking in code that regresses performance on competitor hardware. See what this Valve developer has to say about "Vendor A": Vendor A is also jokingly known as the "Graphics Mafia". Be very careful if a dev from Vendor A gets embedded into your team. These guys are serious business.
How can using certain benchmarks (as you suggest) make games run slower on other hardware?
That's not what I'm suggesting. I am suggesting that nVidia has a history of being dishonest with their performance benchmarks. The worst case by far was during the GeForce FX era, when they were caught shipping a driver that detected 3DMark03 and then only rendered content that was visible to the camera path instead of the whole scene to boost their benchmark scores. That was a while ago and I've been unable to find the original story on it.
AMD fanboy much?
Not at all; my desktop currently has a GeForce GTX 570 installed. When I bought it, nVidia clearly held the performance crown. That said, I really don't like the unethical business practices, and I think I might not buy from them again.
Given the fairly lame update to the Mac Mini caused mainly by the lack of choices in Intel's mobile CPU offerings (and Apple's refusal to design and stock a separate motherboard just for quad core)
Why would you be faulting Intel for this?!?
Not only that, but your argument is based on factually incorrect information. If Apple had designed the new Mac Mini around the FCBGA1364 package (high-end mobile Haswell) instead of the FCBGA1168 package (often referred to as Haswell ULT), then they could offer 2-core and 4-core Minis without any board change.
The truth is Apple's designers care more about form and power dissipation than about having a quad core Mini. Consider that the 13 inch MacBook Pro uses FCBGA1168 and the 15 inch uses FCBGA1364: just a screen size change is enough to justify a different package and different inventory in the MacBook lineup.
In fact, the motherboard layout differences between the FCBGA1364 package and the FCBGA1023 package (used on the previous Sandy Bridge Mac Mini) are minimal compared to the amount of design change needed to go from FCBGA1023 to FCBGA1168. Apple has access to Intel's roadmaps more than a year before they are public; they knew Intel would not offer quad core on FCBGA1168, and they knew it would be more work to change their design to use FCBGA1168, but they did it anyway. It was a deliberate design decision.
I no longer have faith in ANY of the conglomerates offering products all over the board.
Any conglomerate? What about 3M? They make a ton of stuff across the board and I buy a lot of their products (Scotch Tape, Post-it Notes, Scotch-Brite, Nexcare & ACE bandages...), and honestly the only product I can think of that they dropped that I used a lot was their floppy disks. Even then they didn't really drop the floppy disk line; they spun it off as a separate company (Imation), and I was able to keep buying those floppy disks until floppies were pretty much dead and I no longer had any need for them.
When a conglomerate is well managed it actually works great, the problem is a lot of tech companies have tunnel vision and don't know how to manage a conglomerate.
Not a single person I have talked to still running 8.0 was even aware of the upgrade. It's not like they made a conscious choice to stick with 8.0, they simply didn't bother to even find out.
Guess you don't actually run 8.0 anymore (or you are domain joined), because on my 8.0 system a pop-up asking me to upgrade to 8.1 has shown up every 2 hours since a Windows update a couple of months ago.
I'm really waiting for an x86 phone that can be bought in the USA.
Looks like your wish is coming true on Oct. 24: Intel Processors to Power New Asus Smartphone Hybrid for AT&T - TheStreet
There is actually one language I can think of that used to be popular and significant but is now truly dead: PL/M.
CP/M (the OS that MS-DOS is modeled on) was written in PL/M. Later versions of CP/M had most of the code rewritten in assembly for speed reasons. When the code was converted from the 8080 to the 8086 for PCs, one of the things Microsoft focused on after version 1.0 was replacing the remaining PL/M code with C code. It didn't take long before MS-DOS was completely free of PL/M.
Fast forward to today and there isn't a single modern PL/M compiler out there. Pretty incredible, really, considering that nowadays all it takes is one person deciding to spend about six months writing an LLVM frontend. The last one was PL/M-386, which dates to the 1980s; everything newer focuses on converting PL/M code to C code. I would be surprised to hear about a single new software project being started in PL/M today, and I expect the number of programmers actively writing PL/M code is a two digit number.
Amazing, when you think about it, that the language used to implement the OS from which the world's most popular OS descends is now dead.
Because every marketing guy knows that putting an 'X' in the name of your product makes it sell better. You gotta admit, WinX looks better than Win9 in forum posts.
Seems like Microsoft is Apple's biggest fanboy these days: first they run a big marketing campaign comparing the Surface to the MacBook Air, now they are trying to copy Apple's OS branding. How much do you want to bet they will even throw a big cat name on there: Windows X "Puma".
You know Microsoft, imitation is the greatest form of flattery. Which is ironic since the stratospheric Apple hype _finally_ seems to be winding down. Guess Microsoft is 3 years behind everyone else as usual.