Top Ten Algorithms of the Century

brian writes "'Great algorithms are the poetry of computation,' says Francis Sullivan of the Institute for Defense Analyses' Center for Computing Sciences in Bowie, Maryland. He and Jack Dongarra of the University of Tennessee and Oak Ridge National Laboratory have put together a sampling that might have made Robert Frost beam with pride--had the poet been a computer jock. Their list of 10 algorithms having 'the greatest influence on the development and practice of science and engineering in the 20th century' appears in the January/February issue of Computing in Science & Engineering. If you use a computer, some of these algorithms are no doubt crunching your data as you read this."
  • I'm not surprised that the Fortran compiler is included. Like it or not, Fortran is the grand-daddy of high-level languages for science and numerical apps.

    Like it or not, "The Fortran Optimizing Compiler" is not an algorithm. There are numerous Fortran compilers available, and each has its own algorithms (note the 's') for performing optimization. It's like saying "what's the best vehicle of the 20th century" and answering "interstate freeways". Yes, they help transportation, but...
  • Are any of these algorithms related to things like compression algorithms (MP3, Zip, LZW, etc.), encryption algorithms (DES, RSA, Blowfish, etc.) or search algorithms (like the one Google uses, for instance)?

    Not really. They are much more important than that. Essential for putting man on the Moon, solving air flows to optimize the shape of a jet fighter, protein folding calculations, you know, this kind of stuff. These algorithms made it possible to solve large equations back in the 60's (imagine the processing power back then...)
  • Ducks, runs, hides.
    That's an interesting rant about the FFT, but it seems to me that the Fourier Transform was invented by Gauss in the 1800's, while the *Fast* Fourier Transform is what was invented in the 60's. Given that you use the two interchangeably, I don't know how much of your comment to believe.
    --
    No more e-mail address game - see my user info. Time for revenge.
  • Actually, the commercial outright suggested using 2 tablets for faster relief.

    And showed 2 tablets falling into the water, etc.

    Sales did nearly double, however. True story! :)
  • HAH! Now that's progress! It's funny, it's a self-extracting archive, but it has no compression. I did a rar of the .ocx it extracted, and it went to 45K. Even as a self-extracting .exe it was only 63k
    ---
  • Although not yet fully realized, Shor's algorithm for factorization blows the lid off encryption, and is an excellent example of quantum parallelism and interference.
  • To me, it's a glaring omission to not have any string matching algorithms. In other words, where are the regular expressions?!? One of the most impressive algorithms I have ever seen is related to regular expressions. There are two types of regular expression algorithms. One follows directly from the regular expression, and the complexity of the regular expression directly correlates with the running time of the algorithm. With the second type of algorithm, the running time is proportional to the length of the string and the length of the regular expression. In other words, you can always predict how long it will take to run, no matter how complicated. The first kind is the NFA, or non-deterministic finite automaton. The second kind is the DFA, i.e. deterministic finite automaton. Anyway, it is really easy to write an NFA, but writing a DFA is super complicated; it involves this awesome thing called the E-closure. The algorithm to produce a DFA via the E-closure is the impressive algorithm I think should have been on the list.

    I heard about this algorithm in the book Compilers: Principles, Techniques and Tools (The Dragon Book, Second Edition). It doesn't give the algorithm a name, though. Anyone who is interested in regular expressions, parsers, lexers, compilers, interpreters and validators should definitely check out this book.
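
    For what it's worth, that construction is usually called the subset construction. Here is a minimal sketch of it in Python (my own illustration, not code from the Dragon Book): the NFA is a dict mapping each state to {symbol: set of successor states}, with '' standing for an epsilon edge.

    from collections import deque

    def epsilon_closure(nfa, states):
        # all states reachable from `states` using epsilon edges alone
        closure, work = set(states), deque(states)
        while work:
            s = work.popleft()
            for t in nfa.get(s, {}).get('', set()):
                if t not in closure:
                    closure.add(t)
                    work.append(t)
        return frozenset(closure)

    def nfa_to_dfa(nfa, start, alphabet):
        # each DFA state is a frozenset of NFA states
        start_set = epsilon_closure(nfa, {start})
        dfa, work = {}, deque([start_set])
        while work:
            current = work.popleft()
            if current in dfa:
                continue
            dfa[current] = {}
            for a in alphabet:
                move = set()
                for s in current:
                    move |= nfa.get(s, {}).get(a, set())
                target = epsilon_closure(nfa, move)
                dfa[current][a] = target
                if target not in dfa:
                    work.append(target)
        return start_set, dfa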

  • The travelling salesman problem can be tackled with simulated annealing, which is the Metropolis algorithm for Monte Carlo applied to optimization.
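
    For concreteness, here's a toy version of that idea in Python (my own sketch; the move type and cooling schedule are illustrative, not from the article): 2-opt moves on a tour, accepted or rejected by the Metropolis rule.

    import math, random

    def tour_length(tour, dist):
        return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    def anneal(dist, steps=100_000, t_start=10.0, t_end=1e-3):
        n = len(dist)
        tour = list(range(n))
        random.shuffle(tour)
        best = cur = tour_length(tour, dist)
        for k in range(steps):
            t = t_start * (t_end / t_start) ** (k / steps)   # geometric cooling
            i, j = sorted(random.sample(range(n), 2))
            tour[i:j + 1] = reversed(tour[i:j + 1])          # 2-opt style move
            new = tour_length(tour, dist)
            # Metropolis rule: always accept improvements; accept worse moves
            # with probability exp(-delta / t).
            if new <= cur or random.random() < math.exp(-(new - cur) / t):
                cur = new
                best = min(best, cur)
            else:
                tour[i:j + 1] = reversed(tour[i:j + 1])      # undo the move
        return best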
  • What's with all the numerical algorithms? Isn't this supposed to be about computing rather than engineering? The algorithm/data structure that comes to mind for me is the B-Tree. Database systems would be a lot less useable without it. "Your fingers are soaking in it now."
  • I don't use the two interchangeably. Maybe I worded the post unclearly.

    Gauss invented the Fourier transform *and* the FFT before Fourier invented the Fourier transform. Fourier never figured out the FFT, as one poster has pointed out. If that was unclear from my original text, then sorry about that.

    Gauss basically came up with the FFT out of necessity... he was processing a ton of planetary observation data by hand.

    Anyway, you say you "don't know how much of my comment to believe", but that's why I included references, eh. The FFT has been invented many times in the 20th century, all of them rediscoveries of what Gauss did. Check any signal processing book that has historical notes.

    -N.
  • by Anonymous Coward
    Understand that this list was compiled by a magazine published under the auspices of the American Institute of Physics. So of course the "top ten algorithms of the century" are all going to be numerical computation nonsense plus quicksort as the token non-science entry :-) What an absurd bias.

    The magazine printed this big poster of these top-ten algorithms, and a numerical computation person across the hall from us has had it pinned to her door for a month now. We've had so much fun mocking it these last few weeks.

    So here's a short list of algorithms that, I guarantee, are far more important than Fast Multipole. :-)

    1. A*. It's just absurd that A*'s not on their list. A* is the basic deterministic search algorithm behind game playing, planning, scheduling, knowledge representation, and a large chunk of symbolic AI. Why should it then be on a top-ten list for science and engineering, you ask? Because A* is the smarts behind the symbolic integration, root finding, closed-form symbolic sum determination, algebraic simplification, and other automated mathematics procedures found in MACSYMA, Mathematica, MathCad, Maple, and now your handy-dandy HP calculator. Mathematicians and science theorists for the last fifteen years have relied heavily on A* to help them find solutions and prove difficult theorems underlying all of science.
    2. B+ Tree Manipulation. Revolutionized operating systems and databases.
    3. Dynamic Programming. And they call themselves a scientific computing group. Dynamic programming has done more for engineering than practically the entire list of theirs combined.
    4. RSA and DES. You're gonna tell me that Krylov is more important than these?
    5. Z-Buffering. Which made graphics economical.
    6. Knuth-Morris-Pratt String Matching. The guts behind information retrieval systems.
    7. Graph Algorithms (MST, Shortest Path, etc). They didn't list a single one. Yet these algorithms are vital to networking, databases, an astounding number of computer science fields, and much of the infrastructure of the internet. (A minimal shortest-path sketch follows after this list.)
    The published list is just idiotic.
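
    To make item 7 concrete, a minimal Dijkstra shortest-path sketch in Python (my own illustration; the graph is a dict of node -> list of (neighbor, weight) pairs with non-negative weights):

    import heapq

    def dijkstra(graph, source):
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float('inf')):
                continue                      # stale queue entry, skip it
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float('inf')):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist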
  • by drew ( 2081 ) on Sunday June 11, 2000 @12:33PM (#1010094) Homepage
    the significance of the fortran compiler is not necessarily because of any particular optimizing techniques that the compiler used. Rather, the invention of the optimizing fortran compiler is significant because it was the first time that a compiler was written that could be as efficient or more efficient than a reasonably experienced programmer doing his own hand-rolled assembly. The compiler was written specifically with the goal of being efficient at numerical operations while still using normal mathematical notation (i.e. w = x + y + z; as opposed to add w, x, y; add w, w, z;). This allowed programmers to be much more productive without sacrificing execution speed. This was a vastly innovative concept at the time, as the reigning view was that no good programmer would use a compiled language, it being too much slower than writing the program in straight assembler.

    it was also instrumental in the vast popularity of the (IBM?) computer it was originally developed for and sold with.

    disclaimer: this is all stuff that i remember from my programming languages class last year. it may not be completely correct, as i don't really know anything about fortran other than what i learned in that class...
  • The truly optimal encoding isn't Huffman coding -- although it's inspirational. Arithmetic coding has beautiful properties that allow for partial bit codes.
  • Muh!

    How To Write A Letter
    1) Get Pen
    2) Get Paper
    3) Apply pen to paper
    4) Move pen according to the following movement pattern:
    5) (etc)

    The above is certainly an algorithm. But try porting it to X-Windows. Just because an algorithm relies on certain other things being present (pen and paper in this case, specific computer architecture in the other) doesn't make it less of an algorithm.

    Everyone who's complained about the Fortran compiler, stop it: an algorithm takes inputs and produces outputs based on those inputs in a deterministic way. Quicksort is an algorithm, even though it *contains* other algorithms, such as those for iterating through a list and comparing elements. The Fortran compiler is an algorithm, even though it contains many smaller algorithms.
  • by kdgarris ( 91435 ) on Sunday June 11, 2000 @12:38PM (#1010097) Journal
    This one was widely used back in the 80's:

    10 PRINT "RADIO SHACK SUCKS!!!!!"
    20 GOTO 10
  • Over half of those algorithms are patented by UNISYS! I think they are charging $1000 per use of each algorithm (that's per use, not per application!!)


    --
    "And is the Tao in the DOS for a personal computer?"
  • You forgot the most widely used algorithm in the most widely used OS on the planet.... wait();

    The most important one is sleep(-1). It's tricky to implement, but when used properly, it turns all other algorithms into O(1). NP-Completeness, my ass!
  • That said, I was also shown a video of sorting algorithms in CS class. The only thing that made the video worth watching was that the algorithms made all sorts of weird noises.
    Would that be Sorting Out Sorting, produced on some archaic DEC machine?
  • by Signail11 ( 123143 ) on Sunday June 11, 2000 @10:44AM (#1010101)
    Here's my commentary on some of the algorithms selected. I've included brief descriptions/references to further information where useful.

    The Metropolis Algorithm for Monte Carlo: Based on the thermodynamic analogy of crystals forming in cooling solids. As opposed to the naive optimization algorithm, the Metropolis algorithm sometimes allows a step to be taken even when it worsens the objective function, with a probability given by an exponentially decaying "entropy" function. _Numerical Recipes_ has a nice description, although their example is quite abysmal. The GNU Scientific Library (sourceware.cygnus.com/gsl) has an implementation of the algorithm, referred to as simulated annealing.
    ...
    The Fortran Optimizing Compiler: compiler techniques in general have improved enormously since the 1950s... this selection doesn't really make much sense to me, as this field has been more about continuous and gradual improvement rather than the development of some breakthrough optimization technique. If anything, the pioneering work of Zhang and Roberts establishing the computational infeasibility of a "universal optimizer" is a much more interesting result.
    QR Algorithm for Computing Eigenvalues: Elegant method for determining the values lambda_i for which A x_i = lambda_i x_i for a square matrix A. The eigenvalues and corresponding eigenvectors (the vectors belonging to the nullspace of (A - lambda_i I)) are in a certain sense the natural frequencies of a matrix. Determining the eigenvalues without solving the characteristic polynomial (as with the QR method) is critical for avoiding numerical analysis problems. (A toy QR-iteration sketch follows at the end of this comment.)

    Some notable algorithms which seem to be missing:
    -Sieving techniques for the decomposition of large composites into prime factors
    -RSA/other asymmetric key exchange algorithms
    -Huffman codes, LZ sliding-window compression techniques, arithmetic encoding
    -MDS matrices, error correction codes, cyclic redundancy checks
    -The linear feedback shift register
    -Predictor/corrector methods, adaptive numerical integration and approximation of ODEs and PDEs
    -Lenstra-Lenstra-Lovasz lattice basis reduction algorithm
    -Berlekamp's Q-matrix algorithm for the factorization of polynomials
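
    For the curious, here is the toy QR iteration mentioned above (my own sketch in Python, assuming NumPy is available; production eigensolvers first reduce to Hessenberg form and add shifts, which this omits):

    import numpy as np

    def qr_eigenvalues(A, iterations=500):
        A = np.array(A, dtype=float)
        for _ in range(iterations):
            Q, R = np.linalg.qr(A)
            A = R @ Q          # similarity transform: eigenvalues are unchanged
        return np.diag(A)      # the diagonal converges to the eigenvalues for many matrices

    # e.g. qr_eigenvalues([[2.0, 1.0], [1.0, 3.0]]) -> approximately [3.618, 1.382]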
  • (sorry if this is a bit off topic)

    What other algorithms would you folks suggest?

    I do a lot of computer graphics work, so the FFT is among my top 10 for that field. Marching cubes, phong shading, environment mapping, bilinear interpolation, BSP & portal PVS are also on top of my list.

    What a perfect nerd question: what's your favorite algorithm, baby?!

    magic

  • Here's another. IIRC you can thank the Simplex algorithm for Linear Programs for Kentucky Fried Chicken.
    In the old days, chicken was expensive. Then Purina (and probably other feed companies) discovered Linear Programming. The chickens have certain nutrient requirements which must be met by some combination of ingredients in the chicken feed. Various ingredients have different amounts of nutrients at prices determined by the current market. Solving the Linear program finds (one of) the cheapest ways to make chicken feed. The feed companies helped set up poultry farmers in a high volume low margin business. Incidentally, solving a Linear Program also gives the cost of each constraint.
    Some of the algorithms I don't recognize, but they all seem to make major (orders of magnitude) differences in what can be computed.
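
    For the curious, the feed-blending story above is a textbook linear program. Here is a hedged toy version in Python using SciPy's linprog (all ingredient names, prices and nutrient numbers below are invented for illustration):

    from scipy.optimize import linprog

    cost = [0.30, 0.45]                  # $/kg of corn, soy (made-up prices)
    nutrients = [[0.09, 0.40],           # protein fraction per kg of each ingredient
                 [3.3,  2.4]]            # energy (Mcal/kg)
    required = [0.18, 2.9]               # minimum protein and energy per kg of feed

    res = linprog(
        c=cost,                                          # minimize total cost
        A_ub=[[-a for a in row] for row in nutrients],   # flip >= constraints into <=
        b_ub=[-r for r in required],
        A_eq=[[1.0, 1.0]], b_eq=[1.0],                   # ingredient fractions sum to 1 kg
        bounds=[(0, None), (0, None)],
    )
    print(res.x, res.fun)    # cheapest blend and its cost per kg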
  • Why does everyone speak about their favourite childhood algorithm as an "omission" from the top ten list? If they added your algorithm, it would be a top eleven list. If there were truly an omission, I would expect to see nine algorithms under the heading "top ten".

    "Someone left out error-correcting codes!"
    "Someone left out crypto!"
    "Someone left out the following list of algorithms you learn in the third year of a CS program!"
    "Someone left out LLL!"

    Well, whoop-dee-do. If we added all of your "GLARING" omissions, we'd have a top 100 list. The problem here is that there are too many really good, beautiful, poetic algorithms out there, and no objective way to compare them.

    Someone pointed out that this list was biased towards numerical computation and someone else defended this, saying that they wanted practical algorithms anyways, which is a bit silly, because different people have different practical needs. Some of us really don't have any use for the Fast Fourier Transform. Some of us think that an O(N^(log2 7)) matrix multiplication algorithm is kinda cute, but don't really give a rat's ass about it.

    I suggest that this list does not achieve its goal. Perhaps it publicizes a couple of numerical methods that people might not have heard of (except LLL), but it certainly doesn't live up to the title "Top Ten Algorithms of the Century". I don't think such a list could really exist. If it could, perhaps Donald Knuth wouldn't be spending so much time on that series of books he's so passionate about...

    Donny

  • I'm amazed none of you low-level hardware hacker guys have got this one - of the sorts that have been mentioned (QS, Insertion, Merge, Selection etc) - Bubble is the only one which doesn't need extra memory. All the others need a target buffer, while Bubble works in place.

    Or am I wrong? I'm not sure....please feel free to correct.
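
    To answer the question with code: yes, bubble sort only ever swaps adjacent elements in place, though for the record insertion and selection sort are in-place too; a plain merge sort is the one that wants a target buffer. A minimal sketch (my own, in Python):

    def bubble_sort(a):
        n = len(a)
        for i in range(n - 1):
            swapped = False
            for j in range(n - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]   # swap in place, no extra buffer
                    swapped = True
            if not swapped:      # early exit makes nearly-sorted input cheap
                break
        return a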
  • Firstly, the heap sort. When I was learning about computers I studied sorting methods. The idea that inputting your random data into a linked list allowed you to just read it out in the right order was a revelation.
    The next important algorithm that I came across but unfortunately can't describe properly here is used to produce molecular shapes from X-ray diffraction patterns. This helped mankind to understand DNA.
    My third most important algorithm is the state engine that drives the TCP/IP stack. This is what's bringing my burblings to you now.
    I'm happy with quite a few of the algorithms in the top 10 list - fourier transforms in particular but some of them seem rather esoteric.
  • I was disappointed by this list - I thought it was too narrow, too shallow, and overly biased towards numerical algorithms. For a start, I would claim for number one algorithm of the century the Turing Machine algorithm [umn.edu] (Turing, 1936) which made symbolic computation (and thus computers) possible in the first place.

    I was also disappointed not to see a symbolic logic algorithm such as the Resolution Theorem Prover [mit.edu] (Robinson, 1965) or Circumscription (McCarthy [umass.edu], 1980). I'd like to have been able to point to something of Jon Barwise [indiana.edu]' Situation Semantics, but couldn't think of a specific algorithm to highlight.

  • What about good old Huffman coding? Seems like all compression algorithms came from that first insight. Even better, it perfectly illustrates the entropy measure of information.
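
    For concreteness, a small Huffman construction sketch (my own illustration in Python): repeatedly merge the two least frequent groups and push the merged bits onto the front of their codes.

    import heapq

    def huffman_codes(freq):
        # freq: dict symbol -> count; returns dict symbol -> bit string
        heap = [(f, i, [sym]) for i, (sym, f) in enumerate(freq.items())]
        codes = {sym: '' for sym in freq}
        heapq.heapify(heap)
        while len(heap) > 1:
            f1, _, syms1 = heapq.heappop(heap)   # two least frequent subtrees
            f2, i2, syms2 = heapq.heappop(heap)
            for s in syms1:
                codes[s] = '0' + codes[s]
            for s in syms2:
                codes[s] = '1' + codes[s]
            heapq.heappush(heap, (f1 + f2, i2, syms1 + syms2))
        return codes

    # huffman_codes({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5})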
  • By definition, an algorithm terminates.
  • by delmoi ( 26744 ) on Sunday June 11, 2000 @10:15AM (#1010121) Homepage
    But, not that interesting. It's just a list, no details. I suppose they figure we already know what this stuff is; I sure don't (other than the FFT, quicksort, and optimizing compiler). I would have liked to see a more in-depth description of what each algorithm did and what its impact had actually been...
  • Double the sales? If people only followed the directions they should repeat until they run out of shampoo. It's an infinite loop!

    Meanwhile, we can take over the world?

  • by Eric Green ( 627 ) on Sunday June 11, 2000 @05:21PM (#1010125) Homepage
    RSA public key encryption has to be the #1 most influential algorithm in the Internet age. Yeah, the DoD junkies care more about computing missile trajectories (one of their algorithms on their "top 10" list, regarding celestial bodies, is applicable towards that), but without RSA public key encryption, e-commerce would not exist. At the time that e-commerce was first envisioned, RSA public key encryption was the *ONLY* real public key encryption algorithm (elliptic curve encryption has now entered the picture), and it is at the heart of the SSL (Secure Socket Layer) encryption used to secure e-commerce.

    And finally, DES *has* to at least be an "honorable mention". DES, despite being dated on the day it was created with its 56-bit keys, was the first publicly-implemented Feistel-class symmetric cipher, and brought S-boxes and other encryption techniques now commonplace out of the world of spooks and into common parlance. All ciphers since DES have been either inspired by DES or have been a reaction to what people view as flaws in DES. If that isn't "influential", I don't know what is!

    -E

  • You forgot the most widely used algorithm in the most widely used OS on the planet....

    wait();

  • I haven't followed the link, but I hope the winner was:

    1. Turn on TV
    2. Watch Until bored
    3. Change channel
    4. Go back to 2
  • Berlekamp's algorithm is useful for syndrome decoding, certain applied aspects of communication error detection protocols, and is closely related to the far more useful Berlekamp-Massey algorithm for determining the linear complexity of a bit sequence (used in Reed-Solomon ECCs). Besides, most cryptographic applications are out of reach of Berlekamp's original algorithm; one would need to use Shoup or the Hogland-Warfield probabilistic algorithms for that.
  • I can see how all of these got in here, but I would expect that the greatest algorithms of all time would last for a bit longer than that. It's not as if the fortran compiler reached its pinnacle of evolution in 1957 and has remained relatively unchanged since.

    Other entries like the matrix computation methods, fast fourier transform, etc are still in wide use today because we haven't found (or there aren't) any better algorithms we can use.

    Maybe its just me...

    --
    Eric is chisled like a Greek Godess

  • Dictionary of Algorithms, Data Structures, and Problems [nist.gov] is a pretty good "dictionary" of algorithms. Another good place, if you know the name of the algorithm, is of course Google [google.com]...
  • by grappler ( 14976 ) on Sunday June 11, 2000 @10:51AM (#1010133) Homepage
    Only one of these was developed in my lifetime.

    On an unrelated note - I only recognize three of the ones on their list. I would have liked to see a description of what each one does, what impact it had, and perhaps how it works. That would make for interesting reading.

    --
    grappler
  • FFTs have been used for everything. Someone once said -- find a way to turn an O(n^2) alg to O(n lg n) time, and they will beat a path to your door finding uses for it.

    Also, we should mention (although I've been skimming previous comments, so maybe this is redundant) that Gauss used the FFT to compute Fourier series -- he didn't make a big deal out of it, as he no doubt considered it a straightforward application of basic principles. Smart man, that.
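
    A minimal recursive radix-2 sketch (my own, in Python) of where the O(n log n) comes from: split into even- and odd-indexed halves, solve each, and combine with the twiddle factors. Input length must be a power of two.

    import cmath

    def fft(x):
        n = len(x)
        if n == 1:
            return list(x)
        even = fft(x[0::2])
        odd = fft(x[1::2])
        out = [0j] * n
        for k in range(n // 2):
            t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]   # twiddle factor
            out[k] = even[k] + t
            out[k + n // 2] = even[k] - t
        return out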
  • by harshaw ( 3140 ) on Sunday June 11, 2000 @10:55AM (#1010137)
    hmm...

    Don't forget Diffie-Hellman(sp?) key Exchange, DES, etc.

    These crypto algorithms pretty much made secure e-commerce of the 90's possible.
  • I ported that algorithm to the UK and "Laskys", our local equivalent of Electrode Hut.

    Laskys, being a bit more up-market, sold HP-85s with built-in thermal printers. If you used Basic's normal "PRINT" function not "DISPLAY" (Rocky Mountain Basic!), then you could spew out rolls of thermal paper. Friends would compete for the shortest marquee display programs that would print large letters sideways onto this stuff, and could hopefully be typed in and run before the shop's sellrats could turf us off.

    Not much of a BREAK key on a HP-85, as I remember...

  • Where's the Kalman filter ? We couldn't have got to the Moon without it.

    Equally, we'd never have ICBMs and an arms race without it, but when did morality ever stop geeks inventing stuff ?

  • They forgot things like hash tables, binary trees, and so on. Of these, they only mention quicksort.

    As in everyday algorithms used all the time that actually make commonplace software work well.

    I'm surprised that data compression hasn't been mentioned, nor public key cryptography.

    What about the algorithms used in networking? Surely TCP/IP has a greater impact than some obscure matrix multiplication.
  • CSS encryption scheme for the smartest implementation of the most secure algorithm in the entire history of CS!
  • You should try to get in contact with the university library. This one is an IEEE publication and either they have it themselves or they can arrange things through another library for you.

    I looked a bit around in my region and I can get articles either via the local university library for about 10 cents copy fee per xeroxed page or delivered via email as TIFF files from the JASON [uni-bielefeld.de] server for about $1.50 per article. This link [uni-bielefeld.de] will send a test article to your email address. You then need a uudecode utility to turn the attachment into a binary, and the result (a self-extracting ARJ for DOS) can be unpacked with the unarj utility that is available for UNIX too.

    I believe you should find similar offers.

  • The inspiration for the "random sort" that you talk about _must_ be the "bogo-sort" entry in The Jargon File: http://www.tuxedo.org/~esr/jargon/html/entry/bogo-sort.html [tuxedo.org]
  • I think you miss the point... the list was for the most influential algorithms in science and engineering. My guess is that they mean classical science and engineering rather than computer science/engineering. IMO, there should always be a distinction between these fields. Public-key cryptography serves no purpose in the classical sciences (other than protecting your sensitive data on radioactive isotopes, etc. :)

    Eric
  • compilers perform a series of iterative steps and then terminate. an algorithm.
  • 1946: The Metropolis Algorithm for Monte Carlo is historically important and I can appreciate such great algorithms as The Decompositional Approach to Matrix Computations, QR Algorithm for Computing Eigenvalues and Quicksort Algorithms for Sorting.
    However I don't think 10 algorithms is enough to describe the greatest achievements of computer science. Binary search, Binary Tree Manipulations, Data Structure algorithms such as Hashing, BTrees, Heap Manipulations, Queues, Stacks, various forms of Linked Lists. Searching algorithms, heuristic bound searching algorithms such as Chess game algorithms, Dijkstra, Travelling Salesman (unsolved). Recursive algorithms. etc...
    The list is far from exhaustive, but the 10 algorithms listed are definitely not the top-priority algorithms (except quicksort)
  • by headLITE ( 171240 ) on Sunday June 11, 2000 @10:56AM (#1010161)
    Well, there wasn't much of the CS we know in the 1950s.

    As for that FORTRAN thing, I guess that was mentioned because it was the *first* optimising compiler, ever, and it served as an example to every compiler that was made ever since. Talk about influence now...

    Oh, and of course I love Binary Space Partitioning. That's the most important algorithm ever. (so what, DOOM is old, but it's still fun and at least it has an atmosphere...)

  • There's a great paper about pessimal algorithms. These two guys from DEC look at slowsort and reluctant search. The idea is that they provably terminate, but take the longest time they can to do so, without ever wasting a computation.
  • I don't know much about the Fortran algorithm, but since Fortran was the first compiled language, I imagine the ideas about compilation and optimization were pretty revolutionary at the time. We may have better ways of doing it nowadays, but it got the whole ball of wax rolling.

    Dana
  • I believe the Fortran compiler was the first fully functional (non prototypical) optimizing compiler. It's not really the language we're interested in anyway because most compilers look the same once you get past the token recognition stage. It's the fact that the Fortran compiler was the first to implement modern compiler theory that makes it special.
  • Are any of these algorithms related to things like compression algorithms (MP3, Zip, LZW, etc.),[...]
    Not really. [...]

    I'm pretty sure I saw an FFT when I was looking at the ISO reference MP3 encoder source (or maybe that was LAME, I forget). I think MPEG1 and JPEG do as well. I don't know if MPEG2 does. None of them have much to do with LZW... except the optimising FORTRAN compiler being the first step away from assembly for "serious work" did allow later languages like C... of course if it hadn't been FORTRAN, it would have been something else.

    I suspect anytime you see a spectrum EQ or a spectrum band display there is an FFT, so even if MP3 doesn't have one, WinAmp, and the WinAmp-like things for Unix, probably have them.

    It is a shame they didn't list the Kalman filter. Great for turning messy Nth order control problems into, er, messy less than Nth order problems :-) It is also why a cheap-ass servo in a $100 CD player can be positioned as well as the expensive steppers in the original $900 CD players (it really is a big chunk of the cost reduction there). It also does interesting stock predictions for some trading firms. Oh, and it works great to figure out where to shoot a flamethrower too.
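
    For the curious, the predict/correct loop boils down to very little code in one dimension. A toy sketch (my own, in Python; the noise constants are made up) that tracks a roughly constant value from noisy measurements:

    def kalman_1d(measurements, process_var=1e-4, meas_var=0.5):
        estimate, error = 0.0, 1.0            # initial guess and its variance
        history = []
        for z in measurements:
            error += process_var              # predict: uncertainty grows a little
            gain = error / (error + meas_var) # how much to trust the new measurement
            estimate += gain * (z - estimate) # correct toward the measurement
            error *= 1 - gain                 # uncertainty shrinks after the update
            history.append(estimate)
        return history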

  • It's an infinite loop if you read it like a moron.

    Somebody should have stopped at half a cup eh? It's only a joke!

  • A compiler takes a set of input (source code) and produces an output (machine code) based on an ordered series of steps.

    I dunno, sounds a lot like an algorithm to me...

  • Somebody mentioned Error Correction algorithms, which brought to mind one of the few algorithms I've ever heard of that really hasn't been superseded since its debut: The Reed-Solomon Error Correction system.

    My memory of it is a bit shaky (I studied it somewhat a few years back when I picked up a copy of Digital Audio Processing), but it essentially specifies a method by which an arbitrary number of bits in any block can be in error and the algorithm itself will A) Detect that error in the bitstream and B) Correct that error, if the number of misread bits is within the specifiable boundaries of the algorithm. So you have situations where merely n extra bytes appended to an m byte block will automatically correct for some amount of error *anywhere* within that block.

    Also a nice touch is that one can easily specify a specific number of bits which may have identical values--it turns out that CD's cannot write more than eleven 0's in a row--the head loses the ability to follow the "groove". No problem at all for this code.

    This is essentially *the* system by which arbitrary-quality error correction is implemented. I'm not saying that the selection of algorithms was in error--but Reed-Solomon does deserve some kind of honorary mention.

    Oh well, at least it gets a better treatment than the Discrete Cosine Transform, which (literally) was *rejected by the NSF because it was considered too simple*. The DCT is at the core of both JPEGs (it compresses 8x8 blocks of YUV-domain pixels into a quantizable domain) and MP3s (it compresses each of 32 frequency subbands into 18 frequency coefficients in a quantizable domain).

    Yours Truly,

    Dan Kaminsky
    DoxPara Research
    http://www.doxpara.com
  • by DanaL ( 66515 ) on Sunday June 11, 2000 @11:06AM (#1010190)
    BUBBLE SORT!!!!!!

    Seriously, it's the first sort most of us learn, and it has the cutest name.

    One of the profs who teaches an algorithms course at my university always shows his class a program that displays performance graphs as various sorts chew through large data sets. Invariably, most of the students cheer for bubble sort, even though we know it will lose. Favouring the under-dog, I guess :)

    Dana
  • I normally wouldn't bother to reply, but most people are missing that the list is ordered by date, not some arbitrary "importance" ranking!
  • by quonsar ( 61695 ) on Sunday June 11, 2000 @11:09AM (#1010195) Homepage

    You forgot the most widely used algorithm on the planet....

    10 IN
    20 OUT
    30 GOTO 10

    ======
    "Rex unto my cleeb, and thou shalt have everlasting blort." - Zorp 3:16

  • by werdna ( 39029 ) on Sunday June 11, 2000 @11:41AM (#1010196) Journal
    The following algorithm, though peculiarly simple, was single-handedly responsible for DOUBLING sales of shampoo:

    1. Lather
    2. Rinse
    3. Repeat

    I seem to recall that Nolan Bushnell did a similar thing in the instructions for Pong. Does anyone recall what those were?
  • by Woodblock ( 22276 ) on Sunday June 11, 2000 @11:41AM (#1010197) Homepage
    What the hell? How can they possibly leave off the most influential algorithm in modern times: Amazon's One-Click shopping.

    Seriously, I guess this list shows the value of unpatented computer algorithms. Any of these can be reimplemented, and are actually innovative, while dispensing software patents to such silly ideas as saving credit card info.
  • Bubble sort actually kicks butt for mostly-sorted lists. If you have a list which you know is almost in order but couldn't be bothered to apply one of the better general algorithms to it, use bubble sort.

    I saw a code example in x86 assembler once that implemented bubble sort. It was written by one of the guys who did Tomb Raider - I believe it's buried somewhere in tr.exe and actually does something useful.

  • And why would you leave out Diffie-Hellman asymmetrical key encryption algorithms? Any hope you have of privacy now or in the near future is going to be based on this.
  • Didn't the person who wrote that go on to found Computer Sciences Corporation (CSC)?

    Sheesh...now just another glorified IT company.
  • To me, it's a glaring omission to not have any string matching algorithms. In other words, where are the regular expressions?!?

    Two reasons:

    • It's an engineering list, so if it doesn't directly help build bridges or blow them up, it ain't eligible
    • Regular expressions are an interesting bit of math that predates the computer. A lot of NFA/DFA work was done around the turn of the century. I think much of it before the turn of the century (i.e. late 1800s!).

    Which I found pretty bizarre, since I'm not aware of any good non-computer use of DFAs and NFAs, but I'm not much for non-computer things anyway...

    Note, if you poke a bit farther in the dragon book there is an algorithm to construct a DFA directly from a regexp, without making the NFA first. I think the NFA->DFA route also makes a more compact DFA though (and no, I don't recall the name either, and I just coded one up last year).

    Also note, the dragon book is definitely not the end-all source of NFA/DFA/regex knowledge. There is another book by Aho (and I think Ullman) that talks about more things, like it being an intractable problem to find the minimal regex to match a given set of symbols (exponential in both space and time).

  • by hey! ( 33014 ) on Monday June 12, 2000 @05:55AM (#1010205) Homepage Journal
    Quicksort is elegantly small, great for demonstrating the power of recursive algorithms in a comp sci class, and performs very well in the average case.

    However, it has a really bad worst case, which happens to be fairly common: when your data to be sorted is already more or less in order. Then it performs about the same as bubble sort [O(N^2)]. It also uses O(n) levels of recursion in the worst case, which may cause stack overflow in some cases.

    For that reason I personally wouldn't put it in a library routine, because some programmer will eventually use it to sort an array of keys, collect a few more keys, and re-sort the array. I'd only use it where I was fairly sure I was collecting keys whose order had considerable entropy.

    Quicksort may be influential because of its simplicity (I believe it is even easier to understand than bubble sort) and widely used because it is easy to code, but it is hardly the best all around sorting algorithm.

    My personal favorite sorting algorithm is Shell sort; this is not to say it is the best algorithm in existence, but it is easy to code and performs just a tad slower than Quicksort in the average case. Most importantly the spread between worst case and best case is small. It is also well suited for internal sorting. The only drawback to this algorithm is that it is not stable, that is to say that two keys of equal value will sometimes end up in a different relative order to the one they started in. This is easy to fix, however.

    Quicksort works because it gets rid of large amounts of entropy fast. Where entropy is low (e.g. the keys are already ordered), this advantage is neutralized. It is possible to code hybrid algorithms that use Quicksort at the outset to divide the key file into a number of smaller sets, and then use a different algorithm when the key files become relatively small. I've heard that some people combine Quicksort with Linear Insertion sort when the keyfile gets to be ten keys or less, because while Linear Insertion Sort is generally slow [O(N^2)], its simplicity makes it faster on very small key files. Perhaps starting with Quicksort and switching to Shellsort or some other algorithm at some key file size (perhaps 100 or so) would result in a large decrease in worst case times and similar performance in the best case.
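
    A sketch of the hybrid described above (my own, in Python, not from any library): median-of-three pivoting defuses the already-sorted worst case, and small ranges fall through to insertion sort.

    def insertion_sort(a, lo, hi):
        for i in range(lo + 1, hi + 1):
            key, j = a[i], i - 1
            while j >= lo and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key

    def quicksort(a, lo=0, hi=None, cutoff=10):
        if hi is None:
            hi = len(a) - 1
        while lo < hi:
            if hi - lo < cutoff:               # small range: insertion sort wins
                insertion_sort(a, lo, hi)
                return
            pivot = sorted((a[lo], a[(lo + hi) // 2], a[hi]))[1]   # median of three
            i, j = lo, hi
            while i <= j:                      # Hoare-style partition
                while a[i] < pivot: i += 1
                while a[j] > pivot: j -= 1
                if i <= j:
                    a[i], a[j] = a[j], a[i]
                    i, j = i + 1, j - 1
            quicksort(a, lo, j, cutoff)        # recurse on the left part...
            lo = i                             # ...and loop on the right part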
  • The Viterbi Algorithm is a fundamental building block for digital communications over noise-limited channels. Most of you probably use it every day:
    • All telephone modems over 9600 baud
    • Hard disk read channel on modern hard disks
    • All digital cellular phones
    • Digital satellite TV receivers
    • Microwave line-of-sight links (between phone exchanges)
    • GPS receivers
    • ADSL modems
    • Cable modems
    • and much more

    This algorithm is used for efficient decoding of convolutional error correction codes and for untangling signals corrupted by multipath dispersion. It was developed by Andrew J. Viterbi (founder of Qualcomm [qualcomm.com]) while at NASA, for receiving the ultra-faint signals transmitted by deep space probes. (A toy sketch of the underlying dynamic program follows below.)



    An introduction to the Viterbi Algorithm [netcom.com]



    Technically speaking, it is an efficient implementation of the Maximum Likelihood Sequence (MLS) detector, just like the FFT is an efficient implementation of the discrete Fourier transform.
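
    A hedged sketch of the underlying dynamic program (my own, in Python): a toy HMM-style trellis rather than a convolutional-code decoder, but it is the same "keep only the best path into each state" idea.

    def viterbi(obs, states, start_p, trans_p, emit_p):
        # V[t][s] = probability of the best path ending in state s at time t
        V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
        back = [{}]
        for t in range(1, len(obs)):
            V.append({})
            back.append({})
            for s in states:
                prob, prev = max((V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                                 for p in states)
                V[t][s] = prob
                back[t][s] = prev
        last = max(V[-1], key=V[-1].get)        # best final state...
        path = [last]
        for t in range(len(obs) - 1, 0, -1):    # ...then trace predecessors back
            last = back[t][last]
            path.append(last)
        return list(reversed(path))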

    ----

  • Anyone wishing he was there? In 1946, trying out one of his/her first algorithms on an ENIAC or similar bigmomma computer, when assembler would be considered a high-level programming language? Being a pioneer, having your name mentioned more like an ubiquitous piece of standard code or part of a CPU ("the ALU is usually connected to the Haggar through a FIFO...") :o)

    OTOH (OT): I am entertaining the idea of making a little FORTH compiler that would also be some sort of OS. Anyone with some good references/resources that could help (ok, I know, read the Linux kernel source...)

  • by mvw ( 2916 ) on Sunday June 11, 2000 @02:14PM (#1010221) Journal
    Their list of 10 algorithms having 'the greatest influence on the development and practice of science and engineering in the 20th century' appears in the January/February issue of Computing in Science & Engineering.

    Rather use this link [computer.org] - it has the missing explanations.

    And this link [sciencenews.org] explains integer relations.

  • That's an OK list I suppose, except...

    FFT should be much higher, it is so very very useful and important.

    The Fortran compiler??? Is this a joke?

    Where are the compression algorithms? LZW, Wavelets, etc. Where are the error correction algorithms?!

    Compression, error correction, and the FFT are like the holy trinity of modern computing. Everything from CDs, DVDs, and spacecraft orbiting Mars and Jupiter uses compression and error correction. How could they deny the importance and sheer power of these algorithms?!

  • I'm glad you said this. It makes me feel better about the list. Still I wonder what Emmett was smoking when he wrote (on the main page):

    "If you use a computer, some of these algorithms are no doubt crunching your data as you read this.

    Really?? I do my share of fancy math, though I'm not in the Big Leagues, and I'd be surprised if any of these algorithms are crunching *my* numbers (except maybe in some NSA basement supercomputer, where they are just now learning that my SSN + selected other ID numbers translates into the punch line of a dirty joke in EBCDIC)
  • I can see how all of these got in here, but I would expect that the greatest algorithms of all time would last for a bit longer than that. It's not as if the fortran compiler reached its pinnacle of evolution in 1957 and has remained relatively unchanged since.

    I'm guessing that the algorithm for optimizing Fortran that was developed in 1957 paved the way for current optimizing compilers, and would bet that many of them use the same techniques that this one did.

    Of course, I know less than nothing about compiler technology, so... :-) Good to see that FFT made the list, though. That's a pretty useful algorithm, to say the least.
    --
  • Because I seem to have implemented all of the first 8 algorithms before, either in CS class or in the course of working as a research intern - all except the Fortran compiler. I find it silly that a particular language compiler is considered to be a top ten algorithm. I have no quarrels with the choices except for this one. Seems to me that the list is heavily biased towards scientific and numerical applications, and therefore the Fortran compiler. What about operations research and, say (as an example), algorithms for solving the Travelling Salesman Problem? Or what about innovative (eeks - hate that word) data structures such as, say, kd-trees that facilitate and suggest the development of a whole class of extremely fast algorithms in geometry?
  • "Great algorithms are the poetry of computation"

    Funny, I thought Perl was :-)

    (note to moderators: it's a joke, ok?)
    nuclear cia fbi spy password code encrypt president bomb
  • 1) 2000: The Microsoft End User Licence.
    Turns network admins into Linux users.
    ___
  • Damn. I can see the bubble sort program I wrote in the 5th grade didn't make it ... :)

  • by orpheus ( 14534 ) on Sunday June 11, 2000 @11:25AM (#1010257)
    I really don't care for their choices at all. A lot of them are more like general approaches than algorithms, and I'm not at all sure they are the most influential. I think they are supposed to be "the cleverest of the common fancy methods"

    Simple algorithms for common problems are much more widely used, and have far more impact and influence, but try telling *them* that!

    I hope these links help. (Warning: many are technical) If anyone has personal favorites that are less dry than many of these, please post!.

    10. 1987: Fast Multipole Method. A breakthrough in dealing with the complexity of n-body calculations, applied in problems ranging from celestial mechanics to protein folding. [Overview [washington.edu]] [A math/visual approach [duke.edu]]

    9. 1977: Integer Relation Detection. A fast method for spotting simple equations satisfied by collections of seemingly unrelated numbers. [Nice article with links [maa.org]]

    8. 1965: Fast Fourier Transform. Perhaps the most ubiquitous algorithm in use today, it breaks down waveforms (like sound) into periodic components. Everyone knows this one (or should) [Part II of my personal favorite FFT and wavelet tutorial [iastate.edu]]

    7. 1962: Quicksort Algorithms for Sorting. For the efficient handling of large databases. [Definition [nist.gov]][Basic Method [princeton.edu]][Mathworld [wolfram.com]][More technical explanation [syr.edu]][A lecture with animations and simulations [sunysb.edu]]

    6. 1959: QR Algorithm for Computing Eigenvalues. Another crucial matrix operation made swift and practical. [Math [unc.edu]] [Algorithm [berkeley.edu]]

    5. 1957: The Fortran Optimizing Compiler. Turns high-level code into efficient computer-readable code. (pretty much self-explanatory) [History and lots of info [unc.edu]]

    4. 1951: The Decompositional Approach to Matrix Computations. A suite of techniques for numerical linear algebra. [matrix decomposition theorem [wolfram.com]] [Strategies [syr.edu]]

    3. 1950: Krylov Subspace Iteration Method. A technique for rapidly solving the linear equations that abound in scientific computation. [History [utk.edu]] [various Krylov subspace iterative methods]

    2. 1947: Simplex Method for Linear Programming. An elegant solution to a common problem in planning and decision-making. [English [anl.gov]] [Explanation with Java simulator [anl.gov]] [An interactive teaching tool [hofstra.edu]]

    1. 1946: The Metropolis Algorithm for Monte Carlo. Through the use of random processes, this algorithm offers an efficient way to stumble toward answers to problems that are too complicated to solve exactly. [English [ic.ac.uk]] [Code [syr.edu] and Math [syr.edu]] [Math explained [ic.ac.uk]]
  • I love Monte-Carlo simulations - it's the only way to do research ;-)

    And it nicely adapts to implementation across Beowulf clusters.

    Now you know why we asteroid Catastrophists talk in probabilities all the time - we just play games with dropping random asteroids onto the Earth and call it work.
  • Public-key cryptography is certainly one of the most significant information-technology discoveries of this century. It's been described as the most significant breakthrough in codemaking and -breaking of the past 2000 years.

    But I'm guessing it's not on the list because it's not an algorithm. It's a lack of an algorithm. It's simple enough to multiply two large primes together. The reason public-key crypto works is that there is no known algorithm to turn the product back into its constituent primes in reasonable time.
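
    A toy illustration of that asymmetry (my own, in Python; the primes below are just the ones either side of a million, nothing special):

    def naive_factor(n):
        d = 3
        while d * d <= n:        # trial division by odd numbers
            if n % d == 0:
                return d, n // d
            d += 2
        return n, 1

    p, q = 999_983, 1_000_003    # two smallish primes
    n = p * q                    # multiplying them is instantaneous
    print(naive_factor(n))       # recovering them is already a search; at real
                                 # key sizes (hundreds of digits) it is hopeless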

    Maybe that missing algorithm should have been listed as #0...

    Jamie McCarthy

  • The list is definitely biased towards the more numerical side of CS. I would personally like to see search algorithms, graph algorithms, advanced data structures, etc. Here is an assortment of more realistic CS stuff (in no specific order); I tried to put in stuff not talked about in previous posts.

    1. A^* search and variants. Simulated annealing, heuristic repair, etc. You know, GOFAI search algorithms.

    2. Minimum Spanning Tree algorithms. Shortest path algorithms. Graph partitioning algorithms.

    3. Data structures: heap (the usual one, and variants such as binomial and Fibonacci), tree structures (BST, red-black, AVL, B+, BSP, quad-, oct-, etc.), disjoint sets, hash tables.

    4. I don't wanna admit it, but perhaps back propagation algorithm for ANN :)

    5. LR(1) parsing algorithms, without which you can't write decent compilers!

    6. Round-Robin scheduling algorithm. Where're you heading without pre-emptive multitasking? :)

    7. Median-of-medians. Power of recursion!

    8. LCS (Longest common subsequence), demonstrative of dynamic programming. And you surely used some diff program... (a short sketch follows after this list)

    9. RSA and similar public key crypto-algos.

    Hope you like this slightly alternative supplement ;)
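
    As promised, item 8 as the textbook dynamic program (my own sketch in Python): fill a table of answers for every pair of prefixes, then walk it backwards to recover one subsequence.

    def lcs(a, b):
        m, n = len(a), len(b)
        best = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if a[i - 1] == b[j - 1]:
                    best[i][j] = best[i - 1][j - 1] + 1
                else:
                    best[i][j] = max(best[i - 1][j], best[i][j - 1])
        out, i, j = [], m, n                 # trace back one optimal answer
        while i and j:
            if a[i - 1] == b[j - 1]:
                out.append(a[i - 1]); i -= 1; j -= 1
            elif best[i - 1][j] >= best[i][j - 1]:
                i -= 1
            else:
                j -= 1
        return ''.join(reversed(out))

    # len(lcs("ABCBDAB", "BDCABA")) == 4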
  • Granted, that was a list of many fine algorithm discoveries, but is it a correct list? The many notable omissions that slashdotters here are bringing up (my personal fave that didn't make the list: Reed Solomon encoding) would indicate not.

    Moreover, didn't all the "Top Ten" lists from other fields come and go last December?

    So how good can the best computer algorithms be, when it still takes them 6 months to generate a "Top Ten" list, and the list is incorrect?
  • Before I see one more complaint about "why isn't algorithm X on the list", please note that these are supposed to be the algorithms of greatest influence in SCIENCE and ENGINEERING. Great CS algorithms don't necessarily make the cut (because they aren't used as much in the development and practice of science and engineering).

    Also note the GREATEST INFLUENCE part; recent algorithms or clever algorithms don't necessarily make the cut (because influential algorithms usually need time to build up influence, and sometimes the most influential ones are the simplest and not always the most clever).

  • Without them, Quake is impossible. :)

  • Huffman encoding is exactly the same thing as Shannon-Fano encoding, which predates it by at least 10 years.

    No, it isn't. Huffman generates optimal codes. Shannon-Fano does not. Yes, Huffman is essentially a "tweak" of Shannon-Fano (the code generation is done in the reverse order), but it's a very powerful tweak that turns a "pretty good" algorithm into an "optimal" algorithm. (given certain constraints. I know arithmetic encoding is better in many ways.)

    Bresenham's line drawing algorithm is nothing but solving the equation of a line, only with repeated subtractions instead of a division. (Faster on processors with slow floating point.) Including this would be ridiculous.

    All algorithms are nothing but performing a bunch of simple computations to achieve some bigger result. Bresenham's algorithms have become "staples" of computer graphics though.

    Incidentally, Bresenham's algorithm is useful even on machines that don't have slow floating point. Bresenham produces "nice looking" lines. Simple linear interpolation produces ugly lines. If you don't know what I mean, try drawing some diagonal lines in GIMP using the pencil and a 1x1 brush. See how ugly they are? That's because GIMP uses linear interpolation, rather than Bresenham. I hate to say it, but even MS Paint produces nicer looking lines, simply because it uses Bresenham. (and if you think how it looks isn't important... sorry, but that's the way it is in graphics...)

    I think you mean Binary Splay Partition algorithms, which truly also are in widespread use. But in such a list it would be better to include some of the real raytracing algorithms.

    No, I meant Binary Space Partitioning trees aka BSP trees, which are used for all sorts of things in graphics including polyhedral rendering systems, raytracing optimization, polyhedral set operations, and collision detection. Yes, there are other techniques for achieving all of these, but BSP trees show amazing versatility.

    Incidentally, I've never heard of "Binary Splay Partition algorithms". Are you talking about binary splay trees?

    KMP search is hardly in use except in very special cases. There are more efficient algorithms around.

    Yes, I made a mistake. KMP and Boyer-Moore are very similar (in much the same way that Shannon-Fano and Huffman are similar, actually...), but Boyer-Moore is clearly the superior algorithm. I think Boyer-Moore and KMP have become intertwined in my head, because I learned both of them at the same time. (perhaps also because "the BM algorithm" has nasty connotations...)

    Okay, strike my vote for KMP, and add one more vote for Boyer-Moore...

    In any case, the list I posted was not intended to be "the 9 best algorithms of the 20th century". It was more like "the 9 best non-numerical algorithms that came to me as I was posting to Slashdot". So take it with a grain of salt. I know I missed some great ones in there... I didn't have any tree balancing algorithms in the list, for example. Skip-lists are also pretty cool, and probably deserve a mention, despite the fact that almost no-one seems to use them. Systems algorithms like process scheduling algorithms, caching algorithms, and even the elevator algorithm (used for scheduling requests to "seekable" devices like hard drives) probably deserve a mention too. And how about database algorithms, like the various join algorithms? Or the numerous AI algorithms? A "top 10" list is really too short. We really need a "top 100" list...
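
    Since Boyer-Moore came up just above, here is a sketch of its simplified Horspool variant (my own, in Python): compare at the current alignment and skip ahead using a bad-character table built from the pattern.

    def horspool_search(text, pattern):
        m, n = len(pattern), len(text)
        if m == 0 or m > n:
            return 0 if m == 0 else -1
        # distance we may safely shift, keyed by the character aligned with
        # the pattern's last position (last pattern character itself excluded)
        shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
        pos = 0
        while pos <= n - m:
            if text[pos:pos + m] == pattern:
                return pos
            pos += shift.get(text[pos + m - 1], m)   # unseen char: jump the whole pattern
        return -1

    # horspool_search("the quick brown fox", "brown") == 10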
  • The Fortran Optimizing Compiler?!?! I don't consider that an algorithm, because... well it doesn't FEEL like an algorithm! Look, it just isn't an algorithm, ok?

    *sigh* I've come up with a reason why. It's because it's machine dependent - an i386 Fortran compiler and a PDP11 compiler are completely different.
  • by Metzli ( 184903 ) on Sunday June 11, 2000 @10:31AM (#1010278)

    Why are you surprised that the list is "heavily biased towards scientific and numerical applications"? The list comprises the algorithms that they believe have had 'the greatest influence on the development and practice of science and engineering in the 20th century.' I would _expect_ this list to be almost exclusively biased towards "scientific and numerical applications."

    I'm not surprised that the Fortran compiler is included. Like it or not, Fortran is the grand-daddy of high-level languages for science and numerical apps.

  • No. The asymptotic speedup is still present; adding matrices is easy, but multiplying them is significantly harder. The method of matrix multiplication described recursively reduces the sizes of the matrices to be multiplied at the expense of more adds. For dimensions over 2000 or so on a modern computer, the improved method wins out.
  • It seems to me that Dijkstra's algorithm should be on there. It's certainly as important as quick-sort...
  • by Hrunting ( 2191 ) on Sunday June 11, 2000 @10:33AM (#1010288) Homepage
    Does anyone care to translate these algorithms to something us lesser geeks might know? I recognize the quicksort, Fourier, and Fortran algorithms, but the rest are rather over my head.

    It'd be interesting to see how these relate to the algorithms that computer jockeys see as king (obviously, the list is biased towards science and numbers). Are any of these algorithms related to things like compression algorithms (MP3, Zip, LZW, etc.), encryption algorithms (DES, RSA, Blowfish, etc.) or search algorithms (like the one Google uses, for instance)?

    What would a computer geek's top ten algorithms list look like? Would they be scientifically/technologically important or more socially important (like MP3, RSA, or Google)?
  • This algorithm decreases the amount of time needed to multiply two matrices from O(N^3) to around O(N^2.81) (later refinements do even better) by breaking the matrix up into little pieces that are multiplied by each other (or broken into smaller pieces with the same process) and then added together to get the result.

    The primary speedup in this algorithm is that matrix additions are much cheaper for a computer to complete than matrix multiplications, and at each level of the recursion one of the eight half-size multiplications can be replaced with several additions, leaving only seven.

    The other benefit of this algorithm is that it divides the work into subproblems that can be dispatched to alternate processors and then recombined centrally.

    Unfortunately, due to the large overhead costs of this method it only works well on very large data sets, such as 1000x1000 or larger arrays. Once you get below this size it is best to use the standard matrix multiplication.

    Here is a section of code from a matrix multiplication algorithm using this method I wrote for an algorithm analysis class (Java; the seven products x1..x7 are the usual Strassen products):

    if (this.length == 2) /* 2x2 base case */
    {
        int x1, x2, x3, x4, x5, x6, x7;
        int x1a, x1b, x6a, x6b, x7a, x7b, x2a, x3b, x4b, x5a;

        x1a = a11.add(a22).getElement(0,0);   // x1 = (A11 + A22)(B11 + B22)
        x1b = b11.add(b22).getElement(0,0);
        x1 = x1a * x1b;

        x2a = a21.add(a22).getElement(0,0);   // x2 = (A21 + A22) B11
        x2 = x2a * b11.getElement(0,0);

        x3b = b12.sub(b22).getElement(0,0);   // x3 = A11 (B12 - B22)
        x3 = a11.getElement(0,0) * x3b;

        x4b = b21.sub(b11).getElement(0,0);   // x4 = A22 (B21 - B11)
        x4 = a22.getElement(0,0) * x4b;

        x5a = a11.add(a12).getElement(0,0);   // x5 = (A11 + A12) B22
        x5 = x5a * b22.getElement(0,0);

        x6a = a21.sub(a11).getElement(0,0);   // x6 = (A21 - A11)(B11 + B12)
        x6b = b11.add(b12).getElement(0,0);
        x6 = x6a * x6b;

        x7a = a12.sub(a22).getElement(0,0);   // x7 = (A12 - A22)(B21 + B22)
        x7b = b21.add(b22).getElement(0,0);
        x7 = x7a * x7b;

        c11 = new Matrix(1);
        c12 = new Matrix(1);
        c21 = new Matrix(1);
        c22 = new Matrix(1);

        // Recombine the seven products into the four result quadrants.
        c11.setElement(0, 0, x1 + x4 - x5 + x7);
        c12.setElement(0, 0, x3 + x5);
        c21.setElement(0, 0, x2 + x4);
        c22.setElement(0, 0, x1 + x3 - x2 + x6);
    }


    --
    Eric is chisled like a Greek Godess

  • The other advantage of the FFT is that it can be performed in place, in the memory occupied by the data set, if you don't care about overwriting your data. This is extremely important for DSP implementations where memory is constrained.

    Interestingly enough, the so called "butterfly" factorization of the transform matrix which is at the heart of the FFT was discovered by Gauss, but never used in the context of Fourier analysis. It was rediscovered independently.
    --
  • I plan to make a bunch of money on the recursive version of this algorithm:

    void useProduct(int *productLeft) {
        if (!*productLeft) {
            buymoreProduct(productLeft);
            return;
        }
        lather();
        rinse();
        useProduct(productLeft);
    }

    (c) 2000 AP, Protected Under US Patent#: (patent pending)
  • Given that the focus of this list is Computational Algorithms (rather than Analytical Algorithms), I believe some implicit biases must exist: the algorithm must be elegant to implement and efficient to run on a computer.

    But beyond that, there are algorithms that "mediate" between our world and the world of the computer. Chief among these is the most fundamental problem of Computer Graphics: What is the best way to draw a line or a curve onto a rectilinear array of pixels?

    The problem is VERY non-trivial! "Good" lines must look the same if drawn from A to B as they do when drawn from B to A. Many early line drawing routines failed miserably at this task, forcing drawing programs to sort the line endpoints before calling the drawing algorithm.

    Bresenham's Line Algorithm was the first to meet these constraints (when properly implemented, of course) in a manner that was also EXTREMELY computationally efficient. Not only did this algorithm give you "good" lines, it also gave you "fast" lines!

    The problem gets worse with curves: In addition to directional symmetry, they must also exhibit reflective (not rotational) symmetry. That is, an arc drawn from 1 o'clock to 3 o'clock must be identical to all of its mirror reflections: The horizontal reflection from 9 o'clock to 11 o'clock, the vertical reflection from 3 o'clock to 5 o'clock, and the combined reflection from 7 o'clock to 9 o'clock. Deviations from such symmetry are extremely noticeable to the human eye.

    For the generalized ellipse (which trivially includes the circle), an algorithm that meets the above requirements will have an extremely valuable secondary effect: The algorithm need only be used, at most, to draw just one quadrant of any closed ellipse, since the other three quadrants can be drawn by reflecting the original arc. For a circle, only a single OCTANT (1/8 of a circle) is needed!

    While other circle, arc and ellipse algorithms preceded Bresenham's, again none was simpler, more accurate or faster.

    The underlying problem has to do with errors: When is it appropriate to turn on this pixel instead of that one? The "easy" approach is to work in floating point, with fractional pixel coordinates, then round to decide which pixel to turn on. And this, in turn, would seem to demand the use of trigonometry in the case of a circle! Clearly, doing floating point and trig just to draw a line or circle is impractical, yet that is exactly what many of the earliest hardware accelerators did!

    Bresenham's algorithms managed to draw accurate and precise lines and circular arcs, accurate to within half a pixel, without floating point math.
    Now that's what I call one hell of an algorithm.

    More complex curves, including arbitrary curves, lacked an "adequate" solution until the advent of the B-spline and NURBS techniques. While extremely efficient, they resemble elephants compared to the cheetah of Bresenham's.

    And, of course, Bresenham's algorithms don't generalize well to meet the contemporary need for things such as alpha blending and dithering in the color space. But the best of such algorithms still have a bit of Bresenham at their core.

    A very good summary of the derivation of Bresenham's Line Algorithm may be found here [ohio-state.edu].

    Bresenham's Algorithms (along with many other graphic algorithms such as flood fill and "XOR" cursor drawing) are fundamentally what made "Personal Computing", that is, interactive computing with a GUI, possible with inexpensive hardware.

    Bresenham's Algorithms (especially the Line Algorithm) are a glaring omission from the list.
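
    For reference, the core of the line case really is tiny. Here is a minimal integer-only sketch of the first-octant case (my own naming; setPixel() is a hypothetical stand-in for whatever the frame buffer provides), showing roughly how the error term described above gets carried in integers:

    /* First-octant line only: x0 <= x1 and 0 <= slope <= 1.  The full algorithm
       covers the other octants by swapping and mirroring coordinates. */
    static void drawLineFirstOctant(int x0, int y0, int x1, int y1) {
        int dx = x1 - x0, dy = y1 - y0;
        int err = 2 * dy - dx;              /* decision variable: tracks the half-pixel error */
        int y = y0;
        for (int x = x0; x <= x1; x++) {
            setPixel(x, y);                 /* setPixel() stands in for the frame-buffer write */
            if (err > 0) {                  /* the true line has crossed the midpoint... */
                y++;                        /* ...so step up one scan line */
                err -= 2 * dx;
            }
            err += 2 * dy;
        }
    }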
  • While it's true that many of these algorithms are the fastest possible ways of doing things (aside from minor tweaks on a case-by-case basis), it is wrong to say that this will always be the case. I know this seems contradictory, but it makes sense if you consider that our methods of computing may soon undergo fundamental changes. Quantum computing has produced algorithms which are asymptotically faster than their conventional counterparts, so much so that even a 1 MHz quantum processor may be several orders of magnitude faster than a 1 GHz conventional processor on these specialized tasks, so far including factoring huge numbers (encryption) and extremely fast data searching. I think that quantum computing will fundamentally change the way we look at high-performance algorithms, because it requires much more specialization at the hardware level. Any thoughts on this?
  • by crow ( 16139 ) on Sunday June 11, 2000 @10:37AM (#1010313) Homepage Journal
    I think they left out two important categories of algorithms. They should have had at least one data compression algorithm (LZW, wavelet, or something like that) and an encryption algorithm (DES or RSA).
  • Retro seems to be exactly what I'm looking for, hoping that the learning curve is not too steep. (I see he went from assembler to Forth pretty quickly in his rewrite... I am interested in the part where you have the core in assembler that will interpret Forth).

    Thank you very much for the link!

  • Bubble sort actually kicks butt for mostly-sorted lists. If you have a list which you know is almost in order but couldn't be bothered to apply one of the better general algorithms to it, use bubble sort.

    Bubble sort does beat a general-purpose sort on a mostly-sorted list, so if you care about performance you would prefer it there to quicksort, mergesort or radix sort. Actually, though, insertion sort does even better in this kind of situation, so use it instead of bubble sort.
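
    A quick sketch of both, for comparison. With an early-exit flag, bubble sort finishes a nearly sorted array in close to one pass, but insertion sort moves each out-of-place element straight to its slot instead of bubbling it along one position per pass (plain Java, method names are mine):

    /* Bubble sort with an early-exit flag: it stops after the first pass that makes no swaps. */
    static void bubbleSort(int[] a) {
        boolean swapped = true;
        for (int pass = 0; swapped; pass++) {
            swapped = false;
            for (int i = 0; i < a.length - 1 - pass; i++) {
                if (a[i] > a[i + 1]) {
                    int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                    swapped = true;
                }
            }
        }
    }

    /* Insertion sort: each out-of-place element is shifted directly into its slot,
       so a nearly sorted array costs barely more than one scan. */
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {       /* shift larger elements one to the right */
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }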


  • Yikes, an awful lot of those were scientific/numerical computation algorithms. The only "pure CS" algorithm on the list was quicksort.

    And the Fortran optimization one is... well, that's not really an "algorithm", now is it? It's a whole bunch of algorithms, and presumably each Fortran compiler does different things to optimize. Again, the numerical/scientific bias is obvious though... who else would use Fortran today? :-)

    Some algorithms (and data structures) I'd nominate are (note that I'm purposely avoiding numerical algorithms here, because the article already lists more than enough good ones):

    • Dijkstra's Algorithm
    • Heaps and Heap sort
    • Huffman Encoding
    • Extendible Hashing
    • Converting a regular expression into a minimal DFA
    • Digital trie
    • Bresenham's Line Algorithm
    • Binary Space Partitioning trees, and related algorithms (rendering, set operations, etc.)
    • Knuth-Morris-Pratt (KMP Search)
  • by redhog ( 15207 ) on Sunday June 11, 2000 @10:37AM (#1010320) Homepage
    I don't see random sort anywhere in the list?

    Oh, for those who don't know: It's the winning entry in a competition held at IBM. The competition was about the slowest sorting algorithm.
    --The knowledge that you are an idiot, is what distinguishes you from one.
  • by magic ( 19621 ) on Sunday June 11, 2000 @10:41AM (#1010326) Homepage
    Here's one: the FFT algorithm allows transformation between positional space and Fourier space (for a 1-D signal, like audio, this is the transformation between a "time"-based and a "frequency"-based representation) and vice-versa in O(n * lg n) time for n a power of 2.

    Why is this important? Convolution, the primary operation involved in computing all kinds of neat filters, can be performed in O(n^2) time in the time domain, but O(n) time in the frequency domain (i.e., it's a lot faster to perform image filtering in Fourier space). The problem is that the ordinary Fourier transform takes O(n^2) time, so you can't get into and out of Fourier space cheaply enough to justify performing the filtering there.

    With the FFT, filtering can be performed in O(n*lg n) time, since you can hop into Fourier space quickly, perform the filter, and hop back (the constants work out to this being useful after n ~= 100,000, which is a reasonable point for audio and most images).

    The FFT is a great example of an algorithm that suddenly tipped the asymptotic scales and revolutionized an entire discipline.
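
    To make the bookkeeping concrete, here is a rough sketch of convolution by that route: pad to a power of two, transform, multiply pointwise, transform back. The FFT here is a simple recursive one for clarity, and the method and array names are my own:

    /* Simple recursive radix-2 FFT over separate real/imaginary arrays.
       The length must be a power of two; invert == true gives the (unscaled) inverse transform. */
    static void fft(double[] re, double[] im, boolean invert) {
        int n = re.length;
        if (n == 1) return;
        double[] evenRe = new double[n / 2], evenIm = new double[n / 2];
        double[] oddRe  = new double[n / 2], oddIm  = new double[n / 2];
        for (int i = 0; i < n / 2; i++) {                  /* split into even and odd samples */
            evenRe[i] = re[2 * i];     evenIm[i] = im[2 * i];
            oddRe[i]  = re[2 * i + 1]; oddIm[i]  = im[2 * i + 1];
        }
        fft(evenRe, evenIm, invert);
        fft(oddRe, oddIm, invert);
        double sign = invert ? 1.0 : -1.0;
        for (int k = 0; k < n / 2; k++) {                  /* butterfly combine */
            double ang = sign * 2 * Math.PI * k / n;
            double wr = Math.cos(ang), wi = Math.sin(ang);
            double tr = wr * oddRe[k] - wi * oddIm[k];
            double ti = wr * oddIm[k] + wi * oddRe[k];
            re[k] = evenRe[k] + tr;          im[k] = evenIm[k] + ti;
            re[k + n / 2] = evenRe[k] - tr;  im[k + n / 2] = evenIm[k] - ti;
        }
    }

    /* Linear convolution by the convolution theorem: transform, multiply pointwise, transform back. */
    static double[] convolve(double[] a, double[] b) {
        int n = 1;
        while (n < a.length + b.length - 1) n <<= 1;       /* pad to a power of two */
        double[] ar = new double[n], ai = new double[n];
        double[] br = new double[n], bi = new double[n];
        for (int i = 0; i < a.length; i++) ar[i] = a[i];
        for (int i = 0; i < b.length; i++) br[i] = b[i];

        fft(ar, ai, false);
        fft(br, bi, false);
        for (int i = 0; i < n; i++) {                      /* pointwise complex product */
            double r = ar[i] * br[i] - ai[i] * bi[i];
            double s = ar[i] * bi[i] + ai[i] * br[i];
            ar[i] = r;  ai[i] = s;
        }
        fft(ar, ai, true);                                  /* inverse transform (unscaled) */

        double[] out = new double[a.length + b.length - 1];
        for (int i = 0; i < out.length; i++)
            out[i] = ar[i] / n;                             /* divide by n to finish the inverse */
        return out;
    }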

    -m
