Comment Re:Boring... (Score 1) 425

Yeah, SFU is notoriously undersupported. Better than the Cygwin junk in any case. Come to think of it, Microsoft wasted a grand chance to integrate some UNIX basics into Windows. Imagine, for example, if we were able to run X apps right out of the box. Windows is such a waste!

Comment Re:ZFS support (Score 1) 425

The difference between the BSD ports system and the Linux package management systems is that BSD ports are compiled natively. After using FreeBSD and OpenBSD for years, I found that the ports system does have its uses. Precompiled packages have the advantage that they can be installed right away; the ports system has the disadvantage that the default options selected for a port might not be the best choice, and I often found myself editing some file before compiling one. The clear advantage, though, is that those options exist at all, whereas they are often unavailable on systems that depend on prepackaging (like most Linux distros). The BSD package managers are OK, but they do lack some features often found in Linux package managers (like automatic removal of dependencies that are no longer needed). It is in any case advisable to read the FreeBSD Handbook chapters on the ports collection and the package management system. Both have their uses, advantages and disadvantages.

The BSDs have become a lot simpler to use in the past years, and progress is being made everywhere, just like in the Linux world. But if you want a "real UNIX", BSD is a much better choice. At least it doesn't claim it's not UNIX, like GNU does! ;-) Once you get OpenBSD to run to your liking, for instance, you'll know this system will run forever, and you'll know exactly which files to modify to get what you want. For instance, OpenBSD uses XDM, which is easy to write session scripts for. I have written a couple of small programs that allow me to choose the window or desktop manager after logging in; a sketch of that idea follows below. It's fun! ;-)
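For illustration only (this is not my actual program, and the paths and the WINDOWMANAGER variable are assumptions): a minimal C helper that an ~/.xsession started by XDM could exec, picking the window manager from an environment variable and falling back to a default.

    /* Minimal sketch of a window-manager chooser for an XDM session.
     * The environment variable and the default path are assumptions;
     * adjust them for the actual system. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(void)
    {
        /* e.g. export WINDOWMANAGER=/usr/local/bin/fvwm in ~/.profile */
        const char *wm = getenv("WINDOWMANAGER");
        if (wm == NULL || *wm == '\0')
            wm = "/usr/X11R6/bin/cwm";   /* assumed default on OpenBSD */

        execl(wm, wm, (char *)NULL);     /* replace this process with the WM */
        perror("execl");                 /* reached only if exec failed */
        return 1;
    }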

Comment Re:ZFS support (Score 1, Troll) 425

GCC and the whole configure/automake machinery are a nightmare to deal with (not to mention that there still is no decent documentation for the whole configure/automake process). A replacement for GCC would be a good start. BTW, for portability efforts, one should take a look at the work of Dr. Martin Richards from the University of Cambridge, UK. He invented BCPL, the more or less direct ancestor of C, and already had a portability system called INTCODE back in the late 1960s/early 1970s, which, used in this or a similar form today, would be vastly simpler and more effective than the current GNU build mechanism. Sun reinvented the wheel and called it "JVM" ... INTCODE was grand. I once had a runtime system built on it that I moved from Amiga to Windows to OS/2 within minutes: the INTCODE interpreter was just a small ANSI C program, and the runtime system consisted of INTCODE-based modules that merely had to be COPIED. Man, just thinking about it is sort of comical ...
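To make the idea concrete, here is a toy sketch (it has nothing to do with Richards' actual instruction set; the opcodes and the sample program are invented): a tiny interpreter, itself plain portable C, runs the same bytecode unchanged on any host it compiles on, which is the whole portability trick.

    /* Toy stack-machine interpreter illustrating the INTCODE idea:
     * recompile this small C program anywhere, and the bytecode
     * "modules" it runs never change. */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

    static void run(const int *code)
    {
        int stack[64], sp = 0, pc = 0;

        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++];          break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
            case OP_MUL:   sp--; stack[sp - 1] *= stack[sp];  break;
            case OP_PRINT: printf("%d\n", stack[--sp]);       break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void)
    {
        /* (2 + 3) * 7 = 35 -- the "module" that would just be copied. */
        const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                                OP_PUSH, 7, OP_MUL, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }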

Comment Re:ZFS support (Score 1) 425

What's the problem with typing "man ls" to read the manual? The BSDs at least come with all necessary manpages preinstalled. No searching for manpage packages. Every package you install automatically includes the manpages. And graphical manpage browsers like the good old "xman" have some use as well, especially when browsing for system calls ...

Comment Re:Call IBM (Score 1) 655

The point with PCs is that higher chipset and mainboard integration has led to less reliable hardware. Old 8-bit computers like the VIC-20 still run today without any problems (after more than 25 years). Systems like the Amiga were already more prone to failure (I have an A-1000 that died after more than 20 years). Old PCs, like the old AT-bus (ISA) 386/486/Pentium machines, do still work (if the CMOS backup battery can be replaced). High-speed Pentium 4 boxes often have a mainboard failure within five years. Of course this depends on how well the system is protected from heat and the like, but nowadays you definitely can't plug just anything together and expect it to work for 20 years.

IBM offers iSeries servers (OS/400) for business applications, which should have a long life as well, and pSeries (AIX) servers aren't bad either. In any case, IBM hardware should be more reliable than PC hardware, and I'm sure they offer spare-parts stocking as a service too (if it's not already included in the price). But IBM has a reputation for abandoning its products, and I don't know whether a proper contract can protect a customer from that.

Comment Why Not Just Update? (Score 1) 655

If the hardware is still running, there's technically no need to purchase new computers. You could replace the 500 MB drives with larger ones of up to 4 GB; Windows 95 can handle that, although the original release is limited to 2 GB FAT16 partitions, so you'd either split the drive or use OSR2 with FAT32. Do an inventory of all applications installed on the machines and reinstall them after setting up a fresh Windows 95, or use a tool that can copy a smaller drive onto a larger one. If the software works, it's best left as it is.

A new system might offer more speed, storage space and reliability, but the custom application might not run anymore. There are solutions like virtualization, but you could also run DOSBox on Linux (both are free). DOS applications are very picky about their runtime environment, so you should test your solution with your father's programs and data on a separate machine first.

As for durability, I'm not sure what to recommend; PC hardware has become a bit less reliable in the past years. You might try a PowerPC machine (the task doesn't require powerful hardware anyway) and run OpenBSD or something on it; DOSBox or QEMU might be suitable for running DOS apps on BSD. OpenBSD has the advantage that it never changes until you change it: there is no update ever until you apply it manually. Another option is FreeBSD, which should also work very reliably. For longevity, I would abstain from all things Windows. Recent Windows platforms like XP or Vista require activation and may fail to reactivate once Microsoft switches off its servers, and Windows updates for a particular platform might no longer be available at some point. So a UNIX-like system might be the best idea. If you use Linux, switch off the automatic update feature to avoid an update breaking the system (it does happen sometimes). I would use OpenBSD or FreeBSD for a system that needs to last a long time.
Operating Systems

Windows and Linux Not Well Prepared For Multicore Chips 626

Mike Chapman points out this InfoWorld article, according to which you shouldn't immediately expect much in the way of performance gains on Windows 7 (or Linux) from the eight-core chips Intel is bringing out this year. "For systems going beyond quad-core chips, performance may actually drop. Why? Windows and Linux aren't designed for PCs beyond quad-core chips, and programmers are to blame for that. Developers still write programs for single-core chips and need tools to break up tasks over multiple cores. The problem? Those development tools aren't available, and research is only starting."
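As a rough illustration of what "breaking up tasks over multiple cores" means in practice (this sketch is not from the article; the worker count and the summation workload are invented for the example), here is a minimal POSIX-threads program that splits one job across several threads and combines the partial results.

    /* Minimal sketch: split a summation across N_WORKERS threads so
     * the work can run on several cores, then combine the partial sums. */
    #include <pthread.h>
    #include <stdio.h>

    #define N_WORKERS 4
    #define N_ITEMS   (1 << 20)

    static double data[N_ITEMS];

    struct slice { int begin, end; double sum; };

    static void *worker(void *arg)
    {
        struct slice *s = arg;
        for (int i = s->begin; i < s->end; i++)
            s->sum += data[i];
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[N_WORKERS];
        struct slice slices[N_WORKERS];
        double total = 0.0;

        for (int i = 0; i < N_ITEMS; i++)
            data[i] = 1.0;                      /* dummy workload */

        for (int i = 0; i < N_WORKERS; i++) {
            slices[i].begin = i * (N_ITEMS / N_WORKERS);
            slices[i].end   = (i + 1) * (N_ITEMS / N_WORKERS);
            slices[i].sum   = 0.0;
            pthread_create(&tid[i], NULL, worker, &slices[i]);
        }
        for (int i = 0; i < N_WORKERS; i++) {
            pthread_join(tid[i], NULL);
            total += slices[i].sum;             /* combine partial results */
        }
        printf("total = %.0f\n", total);        /* expect 1048576 */
        return 0;
    }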

Comment Re:Of course! (Score 1) 596

it is also an echo of the "infectious licence giving away control" bullshit

No. To me, GNU is one community, not many unrelated splinter groups. Of course, every group within the community has its own identity, goals, etc., but ultimately they're all contributing to one big goal. It's a unifying approach, so to speak.

Comment Re:Costs or Price? (Score 1) 524

Their costs may drop, but are we going to see a reduction in price?

The featured article talks about exactly that. HP is currently developing a technology to print the backplane and the E-Ink film. In the long run this should drop prices below those of LCDs, and we'll get flexible plastic displays with E-Ink technology.

Does it run Linux? Yes, it does! :)
The developer kit for E-Ink comes with a display and Linux. There's also an X driver for the new E-Ink GPU already.

Color displays are also in the works, but those might take another two years or so.

Comment Re:Of course! (Score 1) 596

Well, any project is free to choose its licenses, its build tools and so on. As long as GNU is involved, it's part of the GNU "project". The purpose of GNU is not "ownership"; it is to provide a unified UNIX-like platform for applications to run on. In a sense it is like Java, except that GNU has that convoluted build process that requires compiling, and GNU does not protect you from platform dependencies (such as an application using Linux/BSD/Hurd system calls directly). At least all "GNU" applications (those that use the GPL, the build process, etc.) share one common thing: they're free open-source software. Thus the scope of GNU is far bigger than just a handful of people working directly for GNU; it encompasses all developers contributing to the GNU infrastructure. It's huge, it's swift, and it will kill Windows.

Comment Re:Of course! (Score 1) 596

In addition to my other reply: one can regard every application running on GNU as part of the GNU infrastructure, just as every application running on Windows is part of the Windows infrastructure. In that sense it is possible to consider GNU the operating system and Linux merely the kernel of that operating system. Furthermore, most GPL'd applications use the GNU build process (and build tools), the GNU libraries and the GNU compilers. Even on systems like BSD (where GNU is sometimes considered encumbering), GNU tools and applications built with GNU are everywhere. If you want to be GNU-free, you have to create your own C/C++ compiler, linker, shells, shell tools and runtime libraries in addition to the kernel. Also, if you don't wish to be associated with the FSF, perhaps you shouldn't use a license with an FSF copyright in it. But that's just my opinion, right? Who am I to talk to you, wisecrack.
