Comment postgres is not oracle (Score 1) 372

I'm a big fan of postgres and have been for many years. I consider it to be an excellent RDBMS.

That said, postgres is not a drop-in replacement for Oracle. Oracle has a long list of enterprise features that postgres lacks, too many to cover here.

One important thing which springs to mind is the lack of index clusters in Postgres, even in the most recent versions. The postgres equivalent (CLUSTER table USING index) is just not the same thing at all, or even close. This by itself could easily cause some complex queries to take more than 5x longer in postgres.
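
For anyone unfamiliar with the difference, here's a rough sketch of the postgres side (table and index names are invented, connection details are whatever your environment uses). The key point is that CLUSTER is a one-time physical rewrite of the table in index order, not something the database maintains as rows come in the way an Oracle index cluster does.

    # Sketch only -- assumes a local database "mydb" with a table "orders"
    # and an index "orders_customer_idx"; adjust to your own schema.
    import psycopg2

    conn = psycopg2.connect("dbname=mydb")
    conn.autocommit = True  # CLUSTER takes an ACCESS EXCLUSIVE lock; don't hold it in a long transaction
    cur = conn.cursor()

    # One-time physical rewrite of the table in index order.
    cur.execute("CLUSTER orders USING orders_customer_idx;")

    # Rows inserted after this point are NOT kept in index order; you have to
    # re-run CLUSTER periodically, whereas an Oracle index cluster stores
    # related rows together as they are inserted.
    cur.execute("CLUSTER orders;")  # re-clusters on the previously chosen index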

Another important thing is that postgres has no equivalent to RAC or other clustering technologies available in commercial RDBMSes. Hot standby is not the same thing.

There are plenty of other examples along the same lines.

The Media

PCWorld Magazine Is No More 164

harrymcc writes "After slightly more than 30 years, PCWorld — one of the most successful computer magazines of all time — is discontinuing print publication. It was the last general-interest magazine for PC users, so it really is the end of an era. Over at TIME, I paused to reflect upon the end of the once-booming category, in part as a former editor at PCWorld, but mostly as a guy who really, really loved to read computer magazines."

Comment Re:Makes sense (Score 1) 566

If you're on a tiny system however this is problematic.

I grant that tiny systems can be a good reason to use proprietary binary formats. On genuinely tiny systems, the CPU overhead of compression can be excessive.

The definition of "tiny" is changing, though. Cell phones used to be "tiny" insofar as they had really slow CPUs and little RAM. Now even low-end smartphones have 1 GHz out-of-order dual-core processors, for which the overhead of compression would be negligible.
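
If you'd rather check than take my word for it, a quick-and-dirty benchmark shows what a gzip round trip actually costs on your own hardware (the payload below is just a stand-in; substitute a real saved-state file if you have one):

    import gzip, timeit

    # Stand-in for a small saved-state blob.
    payload = b"<state><user id='42'/><settings volume='80' theme='dark'/></state>" * 50

    runs = 1000
    t = timeit.timeit(lambda: gzip.decompress(gzip.compress(payload)), number=runs)
    print("payload: %d bytes, avg compress+decompress: %.1f microseconds"
          % (len(payload), t / runs * 1e6))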

Comment Re:Makes sense (Score 3, Insightful) 566

I disagree. I'm an old enough programmer (in my 40s) that I started my career working with proprietary binary formats, and I remember the good reasons why they were abandoned. Where I work, the older someone is, the less likely they are to favor binary formats for structured data (this argument has come up a lot recently).

I'll repeat one or two of the arguments against using proprietary binary formats.

If you wish to save space, conserve bandwidth, etc., then a binary format is not a good way of accomplishing that. The best way of saving space and conserving bandwidth is compression, not a custom binary format! Binary formats are still very large compared to compressed XML, because they still contain uncompressed strings, fixed-width ints that are mostly leading zero bytes, repeated ints, and so on. If space or bandwidth is the concern, use compression.

And if you do use compression, then also using a binary format gains you nothing. Binary formats do not compress down any further than human-readable formats that encode the same information; you won't gain even a few extra bytes on average by binary-encoding before compressing. So if you care about space or bandwidth, compress; a custom binary format adds nothing on top of that.

Of course, compressed formats are binary formats. However, the compression formats you will actually use are extremely common, are easily identified from the magic number at the beginning of the file, and have decompressors available on almost every platform. Gzip, bzip2, and zip tools are installed by default on the MacBook Pro I got from work; they're almost everywhere. That is not the case for a custom binary format you create yourself. Also, compression can be turned on and off: if you want to sniff packets for debugging, you can turn compression off for a while.
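
To be concrete about the "easily identified" part: the common formats announce themselves in their first few bytes, so a trivial sniffer can tell you what you're looking at before you reach for a decompressor. This sketch covers only gzip, bzip2, and zip, and the file name is hypothetical:

    def sniff_compression(path):
        """Guess the compression format from the file's leading magic bytes."""
        with open(path, "rb") as f:
            head = f.read(4)
        if head.startswith(b"\x1f\x8b"):
            return "gzip"
        if head.startswith(b"BZh"):
            return "bzip2"
        if head.startswith(b"PK\x03\x04"):
            return "zip"
        return "unknown (possibly uncompressed text)"

    print(sniff_compression("saved_state.dat"))  # hypothetical file name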

Here's a different way of putting it. You can think of common compression algorithms (such as bzip2) as mechanisms for converting your files into the most compact binary representation available with no programming effort from you. It does not help those algorithms if you also try to do binary encoding yourself beforehand.
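
A five-minute experiment makes the point better than I can. This sketch (toy records, made-up field names) encodes the same data as JSON text and as a hand-rolled fixed-width binary format, then gzips both; run it and compare how much the gap between the raw sizes shrinks once compression is applied.

    import gzip, json, struct

    # The same structured records, encoded two ways.
    records = [{"id": i, "reading": i * 10, "name": "sensor-%d" % i} for i in range(1000)]

    # Human-readable encoding (JSON here; XML would behave much the same).
    text_bytes = json.dumps(records).encode("utf-8")

    # Hand-rolled binary encoding: two 4-byte ints plus a length-prefixed name.
    chunks = []
    for r in records:
        name = r["name"].encode("utf-8")
        chunks.append(struct.pack("<iiB", r["id"], r["reading"], len(name)) + name)
    binary_bytes = b"".join(chunks)

    print("raw text:          ", len(text_bytes))
    print("raw binary:        ", len(binary_bytes))
    print("compressed text:   ", len(gzip.compress(text_bytes)))
    print("compressed binary: ", len(gzip.compress(binary_bytes)))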

There are a few odd exceptions where it's best to use binary formats. Small embedded devices may lack the hardware to perform compression. HTTP/2.0 might be another exception, because the data transmitted is usually less than 100 bytes, so adaptive compression wouldn't work well, and it wouldn't be possible to compress across HTTP requests because HTTP is (in theory) a stateless protocol.

Now though, even private internal saved state never seen by a human is done in XML for bizarre reasons.

There are reasons other than human-readability to use XML. Using XML means you gain an ecosystem of tools: all kinds of parsers, generators, code generators, validators, editors, pretty-printers in your IDE, network packet sniffers that can parse and pretty-print it, etc., on a wide variety of platforms. You lose all of that if you roll your own binary format, and gain nothing in return if you compress the data in production.
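
As a tiny illustration of the "ecosystem for free" point: every mainstream language ships an XML parser in its standard library, so reading a saved-state blob takes a couple of lines and no custom decoding code. The element names here are invented for the example:

    import xml.etree.ElementTree as ET

    # A saved-state blob that might otherwise have been a custom binary format.
    doc = ET.fromstring("<settings><cache size='64'/><logging level='debug'/></settings>")

    for elem in doc:
        print(elem.tag, elem.attrib)  # e.g. cache {'size': '64'}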

Also, private internal state is seen by a human on rare occasion. What happens if parsing the file fails? Someone will need to look at it.

Comment Microfiche or microfilm (Score 1) 329

I'd bet you could pick up some used microfilm or microfiche equipment from an old library, newspaper, or business. That equipment was standard during the 1970s, and I'd guess there's still a lot of it around.

You can store hundreds of pages on a single small roll of microfilm.

Canon still makes equipment to scan microfilm into digital formats.

Comment Re:Breaking news (Score 2) 298

Breaking news - you're a clueless git who no more understands the situation than my keyboard does. But that doesn't stop you from typing platitudes,

Speaking of clueless...

I like non fiction submarine books (for example), Amazon figures this out... and I'll never see a sale price on a submarine book again. I ordered the DVD of A Certain Scientific Railgun last week, and today the manga was a higher price than it was two weeks ago.

Nope. Amazon's prices fluctuate often, based upon supply and demand. You saw that, and then you wrongly inferred that they were discriminating against you, and charging you higher prices based upon your prior behavior.

you're a clueless git who no more understands the situation than my keyboard does

You might consider growing up before posting.

Comment Re:Resolution (Score 1) 397

Vista reverts to "fractional scaling", where it simply does a bilinear upscale of the application window, resulting in a blurry, god-awful mess where nothing was rendered natively.

I have to say I've never seen this on either XP or Win7. Perhaps it helps to have exactly 200% scaling, so everything can be enlarged by a whole factor. But as far as I can see, programs that aren't scaling-aware (such as the command prompt window) are just rendered unscaled. Maybe it is because I have Aero turned off.

Comment Re:It's clever, no? (Score 4, Insightful) 577

Coal plants have already been shutting down due to the fact that natural gas is cheaper. Since we've been building natural gas plants, our carbon emissions are down to 1990s levels. Funny thing: we never even ratified Kyoto, yet we did better than most (all?) countries in reducing carbon.

Comment fluorescent lighting (Score 1) 532

I still hear people complaining about fluorescent lighting despite the fact that CFLs have electronic ballasts that run at extremely high frequencies. I could understand it with the old, old lights that used magnetic ballasts, but CFLs? Really? Seriously? People can see 40,000 Hz flicker on a properly working tube? It's not like a CRT monitor, where the tiny phosphors let me see the scanning. LEDs flicker way more than I ever noticed fluorescent lights doing, and to make matters worse, LEDs are used in many more places! I notice the flickering from taillights on newer cars, gadgets, dimmable LED replacement bulbs, etc.
