Journal: Are distros worth the headaches?

One of my (oft repeated) complaints about standard distributions such as Gentoo, Debian or Fedora Core is that I slaughter their package managers very quickly. I don't know if it's the combination of packages, the number of packages, the phase of the moon, or what, but I have yet to go even three months without having to do some serious manual remodelling of the package database to keep things going. By "keep things going", I literally mean just that. I have routinely pushed Gentoo (by doing nothing more than enabling some standard options and adding a few USE flags) to the point where it is completely incapable of building so much as a "Hello World" program, and have reduced Fedora Core to tears. That this is even possible on a modern distribution is shocking. Half the reason for moving away from the SLS and Slackware models was to eliminate conflicts and interdependency issues. Otherwise, there is zero advantage to an RPM over a binary tarfile. If anything, the tarfile has less overhead.

Next on my list of things to savagely maul is the content of distributions. People complain about distributions being too big, but that's because they're not organized. In the SLS days, if you didn't want a certain set of packages, you didn't download that set. It was really that simple. Slackware is still that way today, and it's a good system. If Fedora Core were the baseline system and nothing more, it would take one CD, not one DVD. If every trove category took one or two more CDs each, you could very easily pick and choose the sets that applied to your personal needs, rather than some totally generic set.

My mood is not helped by the fact that my Freshmeat account shows close to three hundred bookmarked programs that are fairly common and (glancing at their records) appear to be extremely popular, yet do not exist in any of the three distributions I typically use. This is not good. Three hundred obscure programs I could understand. Three hundred extremely recent programs I could also understand - nobody would have had time to add them to the package collection. But some of these are almost as old as Freshmeat itself. In my books, that is more than enough time.

And what of the packages I have bookmarked that are in the distros? The packaged versions can be many years out of date. When dependencies are as tightly coupled to particular versions as they generally are, a few weeks can be a long time. Four to five years is just not acceptable. In this line of work, four to five years is two entire generations of machine, an almost total re-write of the OS and possibly an entire iteration of the programming language. Nobody can seriously believe that letting a package stagnate that long is remotely sensible, can they?

I'll finish up with my favorite gripe - tuning - but this time I'm going to attack kernel tuning. There almost isn't any. Linux supports all kinds of mechanisms for auto-tuning, either built-in or as third-party patches. And if you look at Fedora Core's SRPM for the kernel, it becomes obvious almost immediately that those guys are not afraid of patches or of playing with the configuration file. So why do I invariably end up adding patches to the set for network and process tuning, and re-crafting the config file to eliminate impossible options, to strip out debug/trace code that should never be enabled on a production system (and should be reserved solely for the debug kernels they also provide), and to clean up stuff that they could just as easily have probed for? (lspci isn't there as art deco. If a roll-your-own-kernel script isn't going to make use of the system information the kernel provides, what the hell is?)
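To illustrate the kind of probing I mean, here is a minimal sketch - the PCI-ID-to-option table is purely hypothetical and would need to be vastly larger in practice - of a script that turns lspci output into a kernel config fragment:

    #!/usr/bin/env python
    # Minimal sketch: derive kernel config options from the hardware
    # that lspci reports. The ID-to-CONFIG table is illustrative only.
    import subprocess

    # Hypothetical mapping from "vendor:device" PCI IDs to kernel options.
    PCI_TO_CONFIG = {
        "8086:100e": ["CONFIG_E1000=y"],    # Intel PRO/1000 NIC
        "10ec:8139": ["CONFIG_8139TOO=y"],  # Realtek RTL-8139 NIC
    }

    def probe_pci_ids():
        """Return the vendor:device ID pairs reported by 'lspci -n'."""
        out = subprocess.check_output(["lspci", "-n"]).decode()
        ids = []
        for line in out.splitlines():
            # Lines look like: 00:19.0 0200: 8086:100e (rev 02)
            for field in line.split():
                if len(field) == 9 and field[4] == ":":
                    ids.append(field)
        return ids

    if __name__ == "__main__":
        fragment = []
        for pci_id in probe_pci_ids():
            fragment.extend(PCI_TO_CONFIG.get(pci_id, []))
        # Print a fragment suitable for merging into the kernel .config.
        for option in fragment:
            print(option)

Nothing exotic there, which is rather the point: the information is already sitting in lspci, waiting to be used.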

Journal: Highly Secret Open Source Projects

Nothing in this world will ever be more confusing than projects that are:
  1. Released as Open Source on public web sites
  2. Bragged about extensively on those web sites - especially their Open Sourceness
  3. Never to be mentioned or referenced in any way, shape or form by anyone else

Pardon me for my obvious ignorance of the ways of the world, but it would seem obvious, even to the most demented, that once something has been posted on a public site, other people WILL find out about it - from search engines if by no other means.

It would also appear that secrecy and Open Source are mutually exclusive - if you publish the source under a GPL or BSD license, it's rather too late to start whining when others poke around the code. I'm not talking about people distributing closed-source software and having others try to reverse-engineer or decompile it. That's different. I'm strictly talking about code where the source is open to everyone, where the license is explicitly stated, and where the license is - beyond all doubt, reasonable or otherwise - one of the standard Open Source licenses that we all know and love/hate/have-a-strong-opinion-on.

So what gives? Why do we have cases of individuals or organizations who obviously want to take advantage of the Open Source model but who do everything in their power to violate that same model (and possibly even their own licensing scheme)?

I'll offer a suggestion, and those guilty of the above offense will likely take even greater offense at this. I believe it is because Open Source has become the "in thing". It's "hip" to release something Open Source. It's fashionable. It's highly desirable. In some circles, it might even be considered sexy. So what's wrong with any of that? When these are the only reasons, there is a LOT wrong with it. When Open Source ceases to be open, has even less to do with the source, and is used solely as a substitute for some perceived genital defect, it ceases to be Open Source. I'm not sure what you'd call it, but it has nothing to do with any community that has even the vaguest understanding of either openness or freedom.

So what should these people do? I'm not going to say that they need to do anything at all, other than be honest. If these programs are "invite only" or to be circulated only amongst friends, then get them the hell away from the public part of the web and use a .htaccess file to restrict who can get them. Or put them on a private FTP site where you can control who has the password. Or only e-mail them to people you like.
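For the web server case, a minimal sketch - assuming Apache, with purely illustrative paths and names - is a .htaccess in the project directory plus an .htpasswd listing the invitees:

    # Minimal sketch: restrict this directory to invited users.
    # The AuthUserFile path is illustrative - point it at your own
    # .htpasswd (created with the htpasswd utility).
    AuthType Basic
    AuthName "Invite only"
    AuthUserFile /home/example/.htpasswd
    Require valid-user

It isn't strong security, but it is at least honest about who is invited.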

Why? There's one very good reason why. If you advertise something as Open Source, offer it as Open Source, post it as Open Source, license it as Open Source, but deny the entirety of Open Source civilization the rights explicitly or implicitly granted by doing so, purely because they're not your type, then they aren't the ones in the wrong. If you offer someone a hamburger but then give them a slice of pizza, they aren't being ungrateful swine if they point out that's not what you offered.

This particular resentment has been brewing in me for some time, but some projects on Freshmeat recently got closed to editing and then willfully broken by the software developers concerned. Why? So that nobody would bother them. Get a few thousand extra eager eyes looking at the code and you needn't worry about being bothered, although you might have to start screening out all the screaming F/OSS fans who want a glimpse of the next megastar.

I guess I'm posting this today, right now, at a time that has traditionally (well, since the Saturnalia of ancient Rome, at least) been associated with sharing more than any other, because the Grinch is not merely alive, well and extremely evil - he's now burning the houses down as he leaves.

Journal: Treasure-Seekers Plunder Ancient Treasure

The lost treasure of Dacia is the target of treasure seekers in search of an estimated 165,000 kg of gold and 350,000 kg of silver that were hidden shortly before the Romans destroyed the region. Tens of thousands of solid gold artifacts have already been located and smuggled out, apparently after bribing Government officials and police.

There are many schools of thought on this sort of thing. There are those who would argue that the treasure was hidden quickly, so no archaeological information is lost, provided some examples remain to be studied. Museums and galleries often end up with stolen objects anyway (the Getty Museum and the Louvre being recent examples), so in all probability the items aren't going to be lost to society. Besides which, many countries treat their national heritage disgracefully, so many of these stolen items might actually be treated vastly better than they would have been.

I don't personally agree with the methods there, but Governments have shown themselves totally incompetent at protecting either national heritage or world heritage, so if there is to be any heritage at all, it is going to have to receive protection from somewhere else.

Another school of thought is that such excavations should be performed by trained archaeologists, who can document everything in detail, who are trained in the correct way to preserve every last iota of information, and who can ensure that nothing is lost.

Again, there is a lot to be said for this approach. Except that archaeologists are poorly funded and have a tendency towards naivety when it comes to dealing with people (Seahenge, Dead Sea Scrolls, etc) or indeed the information they collect (Seahenge, Dead Sea Scrolls, etc). Their interpretations often fail to properly document sources and are prone to speculation where little evidence exists. Archaeologists simply don't have the means to carry out the kind of excavation required, and wouldn't necessarily have the skills required even if they did.

The third option is to leave the stuff where it is. The world has moved on; let it rest in peace. We already have a lot from that period of time, so why would we need a few hundred thousand items more?

This line of thinking ignores that everything has a story to tell. It assumes some sort of equivalence between ancient art (where everything was unique and took time and skill to make) and modern art (where everything is mass-produced on a production line). It also assumes that such ancient artifacts take up space that we need. The world is a BIG place. Things can be moved. Or built around, as happened with the Rose Theatre in London. We're not even restricted to the two dimensions of the surface.

The last option is a mix of the above. Archaeologists rarely need the original object; museums never do. We can etch the surface of an object to within a few tens of nanometers, can identify the composition of a dye or paint an atom at a time, and can read long-erased writings from trace amounts of residual molecules.

This approach would argue that whilst archaeological context is the ideal, vast amounts of information could very easily be extracted from any collected item, if anyone could be bothered to do so - certainly far more than necessary for a museum to exhibit ancient history that would otherwise be lost, and probably far more than would be needed by archaeologists to produce extremely detailed conclusions and infer the vast majority of information they'd have collected if they'd dug the items up themselves.

In the end, I want the maximum possible information to be preserved and the artifacts themselves to be protected as best as possible. Plundering is probably not the best way to achieve this. At the Dacia site, for example, anything that is not gold or silver is likely being destroyed. But if it ends up with anything being salvaged at all, that will be an improvement, as bad as it is.

Better solutions are needed, but it is doubtful any will be developed in time to save those things that need such a solution to survive.

Journal: ...And here's an overkill I produced earlier...

Probably not many non-UKians are familiar with the cult children's TV show "Blue Peter". To cut a long story short, it is one of several arts, crafts and high adventure TV shows in the UK intent on destroying the world economy by building fully-working space shuttles out of cardboard, glue and sticky-back plastic. And here is one I prepared earlier.

They also hand out badges to children between the ages of 5 and 16 for significant achievements. You might get a blue badge for running a bring-and-buy garage sale that completely rejuvenates the local economy. A gold badge might go to a kid who swims unaided through flood waters during a hurricane to rescue little old ladies.

As a mark of respect for the wielders of The Badge, many places offer discounts or free entry to the heroes, inventors and artists who have achieved these heights. Well, that changed nine months ago.

Nine months ago, the worst crime imaginable occurred. People were caught selling their Blue Peter badges on eBay! Arguably, the badges belong to these people, so they have a right to sell them. Right? Well, that's where it gets tricky. The badges DO belong to those individuals, but a badge carries a lot more than just some painted steel: it carries a measure of respect.

The solution to the problem was unveiled on Monday, June 19th. There is to be an identity card (with a holographic image of the person awarded the badge) that goes along with the badge. With the hologram, it no longer matters if a badge gets sold, as nobody else will look like the person in the photo.

Of course, you know where this is going, don't you? Holograms will eventually be replaced or extended by biometric data. Not horrible, surely? Well, the UK has been fighting hard to introduce a national ID card - and losing - for some time. This would allow the Government to pervert a glorious medal of honor into a scheme where national ID cards are seen not as a mark of being watched, but as a mark of achievement.

By getting kids to WANT national ID cards for themselves, the problem would be easily solved. By getting kids to beg for more of their personal data to be carried, the Government won't have to fight to get national ID installed; they'll have a fight to prevent exposure of FAR too much personal data.

Five-year-olds barely have the wherewithal to learn to read. We do NOT need them to have to combat identity theft and spin doctoring at the same time.

Journal: When is a web design good?

One big problem with a lot of web sites is that they are poorly designed, if they're designed at all. They operate on a very limited number of platforms - some will only work with specific versions of IE on specific versions of Windows. Some are hopelessly cluttered with every possible feature web browsers have to offer, creating a mess that is quite unusable. Some concentrate on the "feature of the day". This used to be backgrounds (making the text unreadable). Later generations made the page hopeless by relying on plugins and embedded scripts of one kind or another. Web 2.0 - the fad of the moment - is the latest way people can seriously mess up what would otherwise be a superb site.

Is it all bad, though? No, all these things are mixed blessings. People have used them all in very effective and productive ways, making sites far more navigable, far more readable and far more elegant. In general, this is because the person behind the idea has actually thought it through and designed the site correctly.

Funny how this journal entry comes up as Slashdot moves to the new CSS, and a Web 2.0 story is on the front page... Well, no, it's not a coincidence. Slashdot's new look & feel is a perfect example to draw on, as it's something we're all using to access this journal entry. The new l&f is undoubtedly crisper and cleaner than the previous format. I liked the old look - it was extremely functional - but the new layout has some definite improvements, in my humbly arrogant opinion.

HOWEVER, every feature present offers two chances for problems: it might be implemented incorrectly in the web page, or implemented incorrectly in the browser. And as "novel" features aren't necessarily going to be retained if they don't prove useful in practice, every feature present also risks being unusable in future browsers. This is not to say you should never use new features, only that you should be aware of the risks involved and work to mitigate them sensibly. You should also not assume browsers function correctly, and should make suitable provision for when they don't.

(It is not possible to test a page on every web browser in existence, but if the features are imported into the page dynamically, it should be easy enough to select alternatives for specific browsers when a problem is identified.)
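As a rough sketch of what that selection might look like on the server side (assuming a WSGI-style host; the user-agent markers and script names are purely hypothetical), the page's scripts can be served through a small gate that substitutes a plainer alternative for browsers with known problems:

    #!/usr/bin/env python
    # Minimal sketch of per-browser fallback for dynamically imported
    # page features. User-agent markers and filenames are illustrative.
    PROBLEM_BROWSERS = {
        "MSIE 5": "menu-plain.js",    # hypothetical: fancy menu breaks here
        "Konqueror": "menu-plain.js",
    }

    def serve_script(environ, start_response):
        """Serve the fancy script, or a plain fallback for known-bad browsers."""
        agent = environ.get("HTTP_USER_AGENT", "")
        filename = "menu-fancy.js"
        for marker, fallback in PROBLEM_BROWSERS.items():
            if marker in agent:
                filename = fallback
                break
        with open(filename, "rb") as f:
            body = f.read()
        start_response("200 OK", [("Content-Type", "application/javascript")])
        return [body]

The same page markup works everywhere; only the imported feature changes when a specific browser is known to get it wrong.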

Most of this is common sense, but it drives me nuts that so many pages on the Internet today DO violate every ounce of common sense for the sake of looking better to a select few, when they could look better in just the same way to everyone at very little expense.

Journal: Tenth Planet - For Real!

Hot on the heels of the discovery of 2003 EL61 comes another planetary body in our solar system. Designated 2003 UB313 and nicknamed Lila, this planet is at a distance of 97 AU and is at least the size of Pluto, possibly up to twice as big. Based on the current definitions, this would make it an Official Planet. NASA's Jet Propulsion Laboratory has more information and pretty pictures.

Journal: Ice on Mars

A significant disk of ice has been found on the inside of a large impact crater. The ice is very visible and very much on the surface. According to the BBC, "The highly visible ice is sitting in a crater which is 35 km (23 miles) wide, with a maximum depth of about two km (1.2 miles)." The European Space Agency has more details and bigger images.

Journal: Bright Trans-Neptunian/Kuiper Belt Object Found

Right now, details are very sketchy on this new discovery, so I'm putting it in the journal rather than trying to post it as a story. Essentially, astronomers have found something that European teams call 2003 EL61 and American teams call K40506A.

There are questions on how reflective the object is, which means we don't have that much information on how big it is or how far away it is. The guesses by astronomers, at this point, are pretty speculative, according to the BBC, which is tracking this breaking story.
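(For the curious: the standard rule of thumb ties an object's diameter D to its absolute magnitude H and its geometric albedo p, roughly D = (1329 km / sqrt(p)) x 10^(-H/5). The same apparent brightness can therefore mean a small, shiny object or a large, dark one, which is exactly why the size estimates are all over the place until the reflectivity is pinned down.)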

NASA has published a wild guess as to the orbit, in Java.

The other known super-large (1,000 km or bigger) Kuiper Belt objects are:

  • Sedna (diameter unknown, less than 1500 km)
  • 2004 DW (diameter probably about 1500 km)
  • Quaoar (diameter 1200 km, +/- 200 km)
  • Ixion (diameter 1065 km, +/- 165 km)

Journal: Telefantasy Missing Episodes

As many television (and film) fans are aware, many early - and not so early - recordings have been destroyed over time. Ignorance has played a large part in this. The BBC, for example, famously claimed that nobody wanted to watch black & white material any more.

We now know, of course, that this is completely untrue, and that the owners of said companies would have been able to figure this out if they'd bothered to go out and ask anyone.

However, not all hope is lost. Most, but not all. Every so often, a recording that has been missing, believed destroyed, surfaces. Once in a great while it is a large haul; most times it is a single episode of some popular series.

Three missing episodes have surfaced in recent years. Most famously, episode 1 of the Doctor Who story "The Crusade" was located in New Zealand. Not so famously, but just as significantly, episode 2 of "The Daleks' Master Plan" has been recovered. This was found in January 2004, although very little about it is publicly known. I have yet to see any comments even on the condition of this latest discovery.

In another series, The Avengers, a missing episode has turned up. "The Girl on the Trapeze" is a first-season story featuring Dr Keel (played by Ian Hendry); Steed (played by Patrick Macnee) does not appear in it. This is only the second story to have been recovered from the first season. As such, it is practically like gold dust. Accounts suggest that it's in good shape, but there is absolutely no word as to who is going to do the restoration work (if any is needed). Nor are the companies most involved with The Avengers (such as A&E) even remotely skilled in this area. This simply isn't their field. Unfortunately, it's not as if you can get another master tape if you destroy the only known surviving one.

This brings me to my next point. Restoration work of any kind is a specialist field. However, the number of specialists in it is very small, the resources they generally get are limited, and they have to split the time between restoration and doing stuff that pays them money.

This is a stupid, half-baked way to go about doing things. If collectors are out there hoarding tapes, it's not terribly surprising. I'd not willingly hand over something unique, valuable and precious in the eyes of a great many people to people I'm not certain can do the job, but who I am certain will try in ways that could be very destructive.

Now to my final point. It must be obvious (3 finds in 7 years is a pretty high rate) that other missing material exists. Very, very little from the British ABC, Thames TV, the BBC, etc, from before 1980 still exists. They even trashed their copies of the Apollo 11 moon landings!!!

What have they done to rebuild their collections? Well, they've threatened finders with copyright lawsuits. They've sent rather garbled letters to various TV studios (but never followed up on any of them). That's about it. No "finder's fee". The BBC will let the finder keep the original once it's been duplicated, but other TV and movie companies don't even do that.

In short, nobody is exactly making a determined effort to uncover those episodes still out there that might be salvageable. The ones who could - the corporations - are happy making loads of money off their newer products. Old lines don't really interest them, money-wise.

Smaller groups, and individual fans, have almost zero hope of finding anything, unless they happen to live near a TV station. Some Doctor Who stories have been discovered by accident in forgotten parts of archives, or even in trash bins outside. Nor does the average person have the money it would take to do the detective work needed to track down any episodes in the hands of collectors.

What hope, then? Probably none - unless TV companies realise that although the past won't make them as rich as Bill Gates, it is still desirable to viewers and would still make more money than doing nothing.

Journal: Portland, Oregon

A short diary entry, for a change. I recently moved to Portland, Oregon, as it's probably the most high-tech place in the US outside of California. (It's also a lot cheaper.)

However, there really aren't as many jobs as all that, and those that do exist are massively over-subscribed. The State has one of the nation's worst unemployment rates, and most new jobs are in low-end service industries.

PDX is an interesting place (if you like books - or think you are a book) and is probably one of the most progressive cities in America. It still falls a little short of perfection, however. As well as the lack of jobs, I've noticed a lot of polarization. You really don't need a resume if you've got a street address.

On the more progressive front, it seems to have a very strong Linux following (proof of enlightenment? Or maybe some other window manager). Public transport in and around Portland is excellent and seems better run than in much of Europe or Britain. The Internet cafes are definitely good. It even has a pet volcano (Mt. Tabor) inside the city limits.

Journal: Explorer 2000 digital cable boxes

The Scientific Atlanta "Explorer 2000" series of digital boxes is intriguing: a mini-SPARC processor, USB, FireWire and Ethernet ports, an incredibly slow OS (although it's pretty stable - I've only crashed it once), and controls that don't work as expected.

As systems go, the hardware is great, but the coding is horrible. For example, you can't control volume through the box. Huh?!? The signal goes through it, so why not?

What are the ports for? Apparently, nothing. That part of the code was never written.

Poor MPEG encoding upstream is made even worse by the box's MPEG decoding. Artifacts are commonplace on digital boxes. Anti-aliasing and other smoothing techniques escaped the makers completely.

The more recent "upgrades" are worse, having boot-up times comparable with Windows 2000 on a low-end Pentium.

Enough of the rant. Does anyone know how to upload Linux into this beast of burden, in particular in a way that lets it still work?

Journal: Closed standards

With the introduction of PCI-X 2.0, the world has yet another closed standard. Only individuals who are employed by paid-up member organizations of PCI-SIG are able to read that specification. What does this mean?

Well, the first thing it means is that it's a lot harder to get Open Source/Free software that takes advantage of the new hardware. After all, member organizations may well write Open Source/Free drivers, but how complete are they? Do they make best use of the capabilities? Are there capabilities that aren't implemented? We don't - and won't - know. There's no way to know.

PCI-X 2.0 is only the latest in a long line of such TecSpec Substance Abuse. ISO and CCITT (now ITU) are notorious for limiting access to specifications, Microsoft claims not to have any, and SCO just sues those who might have them.

As well as programming, I design. But I can't design if I have nothing to design with! Without a spec, I cannot do anything for or with closed standards for software or hardware.

It is no coincidence that there are no Open Source X.400 e-mail servers. X.400 is certainly a more comprehensive standard than SMTP, and it certainly supported attachments long before SMTP servers and clients did. It's a very powerful system. It's also a very dead system. Power is irrelevant when nobody can use it. All the power in the world attains nothing if it's locked away, out of reach of those who need or want it. People go for the alternatives.

Intel tried to lock up the specs for their processors and chipsets, so rival companies produced incompatible versions. Those rival companies are bashing huge chunks out of Intel's market, and Intel can't claim that back - their system is (by definition) equally incompatible with those of their rivals.

The danger for PCI-X is this: if a rival standard is developed that's about as fast, preferably cheaper to implement and - above all - open, PCI-X will be as dead as PS/2. Which, incidentally, was also a closed standard. And expensive.

Thus, I throw open a challenge. There's no "prize", other than bragging rights (and the possibility of world domination). Sorry. The challenge is to produce a fully open specification for a bus that competes with HyperTransport and PCI-X on performance and capability, while being cheaper than either to implement.

It doesn't matter if this spec is never implemented. What matters is whether the mere possibility of a threat to the closed model can do for hardware what the IETF, Linux, *BSD, the FSF, et al, have done for software: be present enough to terrify the dinosaurs, and force them to move their mind-sets 250 million years forward into the modern era.

Journal: Sisters of Mercy

Once upon a time, a long, long way away, there was a group called Metallica. A member of said group complained about Napster, yet admitted that he had trouble with e-mail.

On the other side of the galaxy, there was another group: "The Sisters of Mercy". This group consisted of an Eldritch voice, a drummer second only to the guy from Def Leppard (hey! that's what their web pages say!), and assorted guitarists over time.

Their drummer, one Doktor Avalanche, is apparently a military-grade, heavily-armoured synthesizer system linked to an aging ISA-era PC.

Their knowledge of technical stuff is fascinating, with a basic familiarity with real-time systems, hardware compatibility issues, high-availability system architectures, UNIX and DOS.

This goes a bit beyond the "can't figure out e-mail" stage, and is strongly indicative of geeks who hack in the audio spectrum as well as in hardware and software.

Whether you like "The Sisters of Mercy" or not is not the point. Imagine an age in which bands and labels were technically aware. Where they could understand the difference between a bit-bucket and a KFC basket. Where they understood what technology did, because they did it.

We're not going to see such an age in music, simply because there are way too many airheads with way too much money. What we can do, though, is cheer on those who do make the effort, and remind the monotonous, slavish Voice From The Deep that it is the ignorant who are complaining, and the wise who simply learn.
