Education

Learn The Language Of Math 170

N. Megill writes "While mathematics is not "closed source" in the same way that some computer operating systems are, it can take years of hard work to acquire the background needed to understand advanced abstract mathematical proofs. This is because they are usually presented at a very high level that hides most of the detail, often making them beyond the grasp of a non-mathematician (even a very smart one such as a computer programmer). The Metamath project breaks down mathematical proofs into the finest possible level of detail and builds mathematics from the ground up. Like Linux From Scratch, it can appeal to those who like seeing things built up from first principles. Metamath does not claim to teach you mathematics, just as reading the kernel source code does not teach you how to use Linux, but there can be a certain satisfaction in just knowing it is there."
  • Third post for Bob

  • What I need is OOP from the basics..

    Bah.

    I have a shotgun, a shovel and 30 acres behind the barn.

  • This is like saying the best way for newbies to understand a C++ program is for them to decompile it and look at the assembly!

    There is no way to short-circuit the "years of hard work" involved in understanding higher-level mathematics. I know it's the holy grail of CS geeks everywhere to streamline the hard work of understanding proofs, but mathematical decompilation is not the answer.
  • In Linux, there is a difference between using it and knowing how it works. But you cannot use mathematics without understanding it, and you cannot understand math without also knowing how to apply it.

    So unless there is something here that most graduate-level courses are missing, I don't see how you can possibly learn any faster than by studying at a university.
  • They make us learn all that discrete crap in one semester! Years of studying my ass! If Achilles kills Agamemnon then at least one Trojan is the son of Priam and there exists a Greek who slept with Helen. hahaha!
  • "Even a very smart one such as a computer programmer"...

    Ahhh ha ha ha ha ha ha ha

    Ha ha ha

  • I'd hate to see something like, say, the definition of a derivative, or the solution to the wave equation, taken down to the simplest possible functions (I'm assuming nothing more complicated than subtraction...). That would make War & Peace look like a pamphlet.

    There are times when being able to see everything all at once isn't the best way of learning something. If I lay out the complete blueprints for the space shuttle, you're not going to have that much better an understanding of it (unless, of course, you already know how to deal with situations like that).

  • Gee, why is there an entire header file called "math.h"? How are files manipulated? How do codecs work? What's one of the areas you should focus on if you want to become a programmer?

    Mathematics.

  • you could be raking in the big bucks for designing web pages, web page catalogs, and first person shoot-em-ups, rather than uselessly expanding the boundaries of humanity's knowledge by intensely studying math.

    Sheesh, can you get any more open source than Math? It's taught in every school, textbooks are everywhere, it just takes time.
  • Are there any jobs going that would require me to learn and apply this ?

    One of the things that has always put me off "real" mathematics is the degree to which its language is arbitrary - for a programming geek this is maddening.

    Anyone know how widespread this Metamath thing is (in or out of Academia) - as with any language if I'm going to invest my own time in learning this I would prefer to have a clearer idea of how long it's likely to be around, what calibre of people currently contribute to it, and how rigorous the review process is.

    All else aside it does look fun though

  • ... that the number of times you mention open source and Linux in a Slashdot submission is directly proportional to the likelihood that the story will be accepted.
  • I wonder if it is anything like Metamoderation for Mathematicians

    Math Trolling: x+y=z*troll if x=fp

  • by ReverendGraves ( 233320 ) on Monday April 23, 2001 @09:22AM (#271194) Homepage
    There is no way to short-circuit the "years of hard work" involved in understanding higher-level mathematics. I know it's the holy grail of CS geeks everywhere to streamline the hard work of understanding proofs, but mathematical decompilation is not the answer.

    I disagree, strongly. Mathematical proof comes down to three things: knowing definitions, the ability to think laterally, and a compositional style which is both terse and precise. This was driven into my skull by constant repetition by my advisor at University, while I was working on my maths degree. Learn the definitions. Memorise them! That's ninety percent of the work. The remaining ten percent can be split evenly by the acquisition of precise compositional style and by the ability to recognise the application of definitions -- this being the lateral thinking I mentioned above.

    Claiming that writing and understanding mathematical proofs is hard is absurd. It's fantastically easy to write and understand proofs. The difficulty lies in making the lateral cognitive leap from some postulate to a theory dealing with the postulate. The style required to write proofs can be taught quickly, likely in a matter of hours. Learning the definitions can take a lifetime -- if this MetaMath project provides a bottom-up breakdown of mathematical axioms, theories, laws, and maxims, then it serves as a valuable aid. I know that such a tome, electronic or not, would be irreplaceable in my collection... not to mention replacing a shelf full of bright yellow maths texts.

  • yeah, math is arbitrary...

    how big is an int in c++? are they little or big endian? hm...
  • From the page: Notice If you wish to bookmark or reference Metamath, please use "http://metamath.org/". The location of other Metamath pages may (and probably will) change over time.

    If you're gonna kwhore - at least get it right!

  • Building up from the most trivial foundations is really neat. I suspect plenty of students in a Calc class would be unable to identify the axioms their work rests on. It's like half our society today: folks use lights but never understand them, it's gotten much harder to fiddle around with your car, etc...

    Wonderful to see one area where folks are making it possible for those interested to actually trace stuff back to its roots.

  • CS majors should know discrete math anyway! You need to know it for proving program correctness, and you *definitely* need experience when you're doing inductive proofs.
  • Yes, it has -- at the beginning of the century some people tried to explain math with the language of logic. Their milestone book was called "Principia Mathematica" and it took them two pages to prove that 1 + 1 = 2...

    Of course, they managed to do so only at page 56 (or so) in the book. Let's see how this metalanguage will measure up to that :)

    cheers: f2f

    ps: i'm sure someone will be able to give more detail about the people who wrote the principia
  • I wanted to moderate this, but there was no "Stiller"[1] rating for stuff that you think is maybe supposed to be funny but is so simultaneously off the wall and parodically accurate that it creeps you out....

    Ellen

    [1] See "Ben Stiller's The Pig Latin Lover and Parallel Universe Theory", Princeton Engineering Anomalies Research Laboratory Weekly, 2012 January 23.
  • by jsse ( 254124 )

    I see a lot of overlapping work with Symbolic Computation Group's [uwaterloo.ca] Maple [maplesoft.com].

    I worked on Maple many years ago, and it's more like a programming language to me than a symbolic representation of mathematics. Why bother doing the same thing over again with less completeness?

    To be honest, Maple is like hell to me (probably because I don't like programming mathematics), but I just don't want to see people create another hell.

  • I always love it when you're reading a proof and it says something like "...after a bit of algebra, we obtain the following form...". This always seems to require a page or two of intense algebra computations plus a few non-intuitive manipulations for good measure. Nothing like good old mathematical understatement.

    This reminds me of the old math joke:
    A math professor is working on a problem at the blackboard for a while and isn't getting anywhere. Another math professor comes by, stares at the problem for a few minutes, and announces that the solution is trivial.
    The first professor says, "I don't see it - show me," and so the second professor proceeds to explain the solution over the next 40 minutes. "Yeah, it's trivial," finally agrees the first professor.

  • Not to mention the fact that algorithm design and complexity analysis are direct applications of high-level mathematical logic to the Science of Computing. Every computer scientist should have a firm mathematical background, if for no other reason than that the field rests on the shoulders of centuries of academic mathematics.

    Note that I see a strong separation between the titles of Computer Scientist, Computer Engineer, and Software Engineer. Many software engineers can get by with a minimal knowledge of mathematics. Fewer computer engineers can do so. And fewer still computer scientists can do so... computer science is the theoretical application of mathematics to the problems of computing, not the design of some new killer app. That's software engineering.

    But, then, I'm biased. I have bachelor's degrees in both maths and computer science, concentrating on the theory of computation and number theory. It should be no surprise that I push a mathematical outlook on my colleagues.

  • Well... they should be Mathematicians... but speaking as someone who is both a Mathematician and a Programmer (mathematician by degree, programmer by trade) I can honestly say that more often than not they aren't....

    Most programmers haven't seen or worked with math much beyond basic discrete algebra. This is because the programming world has evolved into this script-kiddie playground, where persons with relatively little programming knowledge and technical ability can use RADs and IDEs to produce software.

    Now I am not saying we aren't without our wizards (and a lot of them are currently in the free-software and open-source movements ;-) ... but it has been my experience that the vast majority of "programmers" out there are really quite novice...
  • >The difficulty lies in making the lateral cognitive leap from some postulate to a theory dealing with the postulate.

    Thank you. This is exactly the problem I had in school with math, and what I see a lot of people having difficulty with in many areas.
  • This was driven into my skull by constant repetition by my advisor at University, while I was working on my maths degree. Learn the definitions. Memorise them! That's ninety percent of the work. The remaining ten percent can be split evenly by the acquisition of precise compositional style and by the ability to recognise the application of definitions -- this being the lateral thinking I mentioned above.

    If 90% of the work you spent on learning mathematics was memorizing definitions, then I think your undergraduate university wasted a lot of your time.

    I am a graduated math major as well; I called myself a "math major" because AFAIAC, it's for life :)
  • When I took calculus III a few years ago, I asked the question "when is it not okay to reverse the order of integration in iterated integrals?"

    The instructor replied, "This is not something that we can or should get into."

    At the time I thought that it was a serious copout on the instructor's part. A couple of years and several analysis courses later, I learned when it is not okay to perform such an operation. I was taking a class in Lebesgue measure theory at the time, and the amount of machinery needed to understand why it would not work in certain cases was by no means trivial. I doubt that any attempt to explain what was happening to anybody without a few tools in measure theory would be anything but hopeless.

    It might be good to try and bring simple math to the masses but I doubt it will be worth anything for higher level mathematics.

    I am by no means a brilliant mathematician. On my best day I might be described as competent. So I am not being a snob about this, just realistic. In order to understand many of the concepts used you have to have worked out a few problems in gory detail. Also, I doubt that I would have understood as much as I do without the help of some very good instructors explaining to me what was going on.
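    For the curious, the textbook counterexample (standard fare in analysis courses, not something I worked out myself) is f(x,y) = (x^2 - y^2)/(x^2 + y^2)^2 on the unit square: the function is not absolutely integrable near the origin, Fubini's theorem does not apply, and the two iterated integrals really do disagree:

```latex
\int_0^1 \!\int_0^1 \frac{x^2 - y^2}{(x^2 + y^2)^2} \, dy \, dx = \frac{\pi}{4}
\qquad \text{but} \qquad
\int_0^1 \!\int_0^1 \frac{x^2 - y^2}{(x^2 + y^2)^2} \, dx \, dy = -\frac{\pi}{4}
```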
  • If I lay out the complete blueprints for the space shuttle, you're not going to have that much better an understanding of it (unless, of course, you already know how to deal with situations like that).
    Almost everyone already knows how to deal with addition, subtraction, multiplication, and the like. By breaking down something that they don't understand into something that everyone understands, it may become easier. I wouldn't want to look at extremely complex equations this way, though, because you lose yourself in the scale of the problem.
    =\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\=\ =\=\=\=\=\
  • Most mathematical papers at least have some text to say what is going on. The Metamath proofs have none at all, just multiple sets of maths with no description. Even the best mathematician can make little sense of a proof without knowing what the symbols are supposed to mean. For example, take the equation H Phi = 0. This can mean lots of different things depending on what the symbols are supposed to represent. If H and Phi are both real numbers, then it simply means that either H is zero or Phi is zero. But if H is the Hamiltonian operator and Phi is a vector in Hilbert space, then the above equation is the Wheeler-DeWitt equation for quantum gravity, which is something a lot different.

    All of the above is just a long way of saying: Metamath people, how about putting some smegging commentary and descriptions on the proofs? Plus, why not have each symbol hyperlink to its definition? It's in hypertext, so why not use it?

  • Kurt Goedel will have a field day with this!
  • it's just 10% memorizing the board positions of all of your winning outcomes, 80% memorizing every previous move leading up to the win, and 10% lateral thinking...

    something that would provide a bottom-up breakdown of the entire game tree of chess would be irreplaceable in my chess book collection, and would replace all of the great books on classic openings, endgames, and overall strategy!

    how easy is that!
  • healthy heterosexual jism.

    Actually, my handle is spelt "jsm".

    jsm

  • Actually, I don't think even that can be _proven_
  • After taking a brief look at the site, I realise that this is not a "number theory for programmers (idiots)".

    Even though the Axioms are laid out like a man page, and the steps are very simple, it still reads as kanji to me.

    Even man pages are not useful when you do not understand the symbology or the context of the syntax.

    Maybe one day either I will have the revelation to put everything together... or someone will actually write a site made for the common man, that is for the common man.
  • Learn the definitions. Memorise them! That's ninety percent of the work.

    Anything 90% of which is memorizing definitions is unlikely to be interesting. Or useful, come to think of it.

    Kaa
  • Inductive reasoning is certainly valid in real-life examples. You want to show that something holds for the base case, and that if it holds for n it also holds for n+1. I agree that it's not really necessary (because you know beyond a shadow of a doubt it's correct without doing a proof), but it's something that one should know as a computer scientist, and it's definitely valid.
  • While mathematics is not "closed source" in the same way that some computer operating systems are, it can take years of hard work to acquire the background needed to understand advanced abstract mathematical proofs. This is because they are usually presented at a very high level that hides most of the detail, often making them beyond the grasp of a non-mathematician (even a very smart one such as a computer programmer).

    Oh, come on. Is it possible that once, maybe just this once, it doesn't really make sense to try and pin the Open Source Medal on this to make it even more 133t and k3w1? But hey, while you're at it, you should probably mention that you can use mathematics on Linux, too, and that it runs perfectly on both Gnome and KDE, no less! And I heard that ESR uses it, like, every day!

    That aside, the concept of "open vs. closed" applies to mathematics about as much as it applies to C++. Mathematics is a tool. There is nothing that says that the inventor of a revolutionary mathematical equation must then make her knowledge available to the world. She would be just as free to form a company that charges insanely high consulting fees to process data using her secret formula as she could post it on her webpage for all the world to use freely.

    While it's true that the vast majority of all mathematical knowledge is freely available to anyone who cares to look at it, that doesn't make the concepts of "open source" and "closed source" applicable to mathematics itself.

  • Proofs and "mathematical rigor" are ridiculously overused in teaching.

    This is a ridiculous extension of the current misconception that mathematical simplicity and conceptual simplicity go hand in hand, when the opposite is true.

    Imagine if children were taught arithmetic by way of formal logic... only a handful of geniuses would be able to make change! Yet go to university and you'll see professors trying to build understanding the way they would build proofs: introduce axioms, show higher relationships that emerge from these axioms, later introduce tricks of the trade for practical use, and then, maybe, once (or rather, if!) the student shows a good understanding, talk about applications. You have to take it on faith that years of painful, boring proofs will lead to interesting and useful ideas.

    It's sheer idiocy, and terrifically destructive of promising young minds. Not only does it drive people away from math, it encourages absurd semi-religious attitudes that math somehow has independent existence that we are discovering, rather than being a language invented by man to describe the universe around us.

    Possibly my favorite book is "Mathematician's Delight", which shows the correct method of teaching math: explain applications first (to build interest), practical method second, and then, maybe, if the student shows understanding and interest, talk about proofs that the math works.
    ---

  • Doesn't BigBlue do something like this?

    At least ChessMaster8000 does, and it's pretty good.
  • That would be "Principia Mathematica" by Russell and Whitehead.

    See here. [stanford.edu]

  • by Spling ( 317404 ) on Monday April 23, 2001 @09:42AM (#271221)

    As a full-time mathematician I'm half-pleased to see the existence of this project, but I'm not really very thrilled with it. It's the big concepts that attract me to maths, the ideas and the big picture, not the nitty-gritty of putting together lots of little logical steps. So as a matter of taste, I don't think that Metamath presents a very appealing view of maths, and I suspect that some people will be actively put off maths by this. If not then fine, but this is my personal feeling.

    Another reservation I have about this is its concentration on axiomatic set theory. This is a subject which tends to draw a lot of attention from the non-mathematics community, in popular science books and so on. In fact it's quite far out of the mainstream of mathematics (a sociological observation rather than a value judgement). I think that the importance of set theory as a "foundation" and "universal language" for mathematics has been far overstated.

    This point of view on set theory is actually increasingly prevalent among the theoretical computer science community - at least, the part of it that I come into contact with. There are various structures from mathematical logic that are far more applicable to computer science than sets are: for instance, the lambda-calculus and categories. Metamath is very reductionist in its approach: it takes the smallest building blocks and shows in minute detail how they can be put together to obtain familiar objects. In contrast, the more popular modern approach is to try and describe things from the top down, e.g. one might look for abstract mathematical structures which resemble the collection of datatypes in a particular programming language.

    So it's kind of nice to see this here, but it's not the face of mathematics I'd choose to present.

  • It's like half our society today: folks use lights but never understand them, it's gotten much harder to fiddle around with your car, etc...

    You think that happened by "accident"? What about how in 1900, 90% of the world was self-employed with hardly any taxes at all, but in 2000 less than 4% of the population was self-employed, and taxes are huge and confusing...

    It is the Illuminati, my friend.

    FNORD!!!
  • Okay, so math is hard. We know this already. But posting every stinking proof we can write isn't a good teaching tool, though it's great for human achievement and such. My math education was a pain for one major reason; it was harder than it had to be. The lion's share of professors I ran into didn't know how to teach. This means that, instead of getting explanations, I got dense texts like Rudin (ask your local math geek) pretty much read back to me. While it *is* hard, and that can't be gotten around, it works a great deal better if you explain things instead of the drone of "definition, theorem, proof" that passes for teaching in some universities. It's great mathematics - and piss poor teaching.
  • It depends what axioms you use; certainly 1+1=2 can be proven from PA or ZFC. Any system where it couldn't be proved would probably be too weak to be of any interest.
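    (For what it's worth, a machine-checked version of this is a one-liner in a modern proof assistant. A minimal sketch in Lean 4, where the kernel accepts the proof by simply computing both sides:)

```lean
-- For natural-number literals, 1 + 1 reduces to 2 by definition,
-- so reflexivity closes the goal.
example : 1 + 1 = 2 := rfl
```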
  • As David Hume might have put it: mathematical proofs "tell us nothing new, because they are ultimately tautologies."
  • by _ska ( 114561 ) on Monday April 23, 2001 @09:46AM (#271226)
    Ok, for what it is worth, I have two maths degrees, and am working on my third (Ph.D.). I am afraid either you (likely) or your advisors are sorely mistaken.

    If writing and understanding mathematical proofs is so easy, why are there entire graduate courses dedicated to one or two proofs?

    Some mathematics is very easy to understand. Some mathematics requires years of background to understand.

    After all, much of what mathematics is really about is building abstractions on top of each other. Our only hope of obtaining deep understanding of some maths is to build powerful enough abstractions. This power has a price, and it can take years of study to understand them properly.

    Understanding proofs once you have the correct framework is often easy. Creating that framework, especially if it is new work, can be very difficult. Of course it is easy when you have someone to hold your hand through the whole thing, but that is pedagogy, not mathematics. "Lateral cognitive leap" doesn't mean much, except as employment of the currently vogue buzzword "lateral thinking". Mathematics involves many approaches; you may call some of them 'lateral' if you wish -- but it isn't a notably useful characterization...

  • By the time a student reaches the university level, it is presumed that they have an interest in math. If they are still looking for more application, then they should be taking physics, engineering, chemistry, economics, etc.

  • There is a physics-is-math cult composed of nerd physicists and mathematicians who think they are free to create physics simply by manipulating spacetime equations using what-if scenarios. This is an absurd way of doing physics because these people don't have the slightest clue as to the actual physical processes and mechanisms that give rise to the phenomenon we call spacetime curvature. Math does not create physics. Physics is about particles, their properties and their interactions. Everything else is either abstract or voodoo. So things like wormholes, black holes and time warps are pure crackpottery, glorified mathematical toys (I think of them as math hacks) invented by grown-up nerds for the sole purpose of impressing their peers and amazing a mystified lay public.

    Nasty Little Truth About Spacetime Physics [gte.net]

  • This is because they are usually presented at a very high level that hides most of the detail, often making them beyond the grasp of a non-mathematician

    After an amount of time, one forgets what it is like to be a newbie. To linux, complex mathematics, even to life. This can only be overcome by spending time refreshing your memory by hanging out with users (user-testing), elementary math students, and children.

    One example I have to relate occurred in my freshman calc class a number of years back. The professor had grown accustomed to spending his day discussing high-level mathematical proofs with his colleagues, and forgot what it is like to not know everything, and still be learning. Our assignment was to calculate the volume of a 3-dimensional curved object whose primary shape was a triangle.

    When it came time to figure out the area of the triangle, he went through a 20 minute proof in front of the class which involved something like 8 different variables. Afterwards, I stood up and volunteered the formula I learned in high school, 1/2 (base x height). The professor was so shocked that I came up with the same answer in 2 lines of algebra, that he was unable to complete the rest of the calculus computation.

    I'm no genius, but was there really any reason to risk losing his students when the triangle wasn't the base of his lesson plan?

  • Let's see if I can paraphrase Godel correctly: Any formal language which is powerful enough to describe basic arithmetic, is also powerful enough to describe theorems which cannot be proven or disproven within the constraints of that language.

    So any higher math will necessarily require a meta-language, e.g. English, in order to prove its theorems.

    What is a proof? According to the famous mathematician Errett Bishop it is any completely convincing argument.

  • Formal mathematical proofs are usually only useful to mathematicians. For most people it suffices to tell them that the theory is true and has been proved to be so. I don't think it's going to help people to slog through abstract mathematics at unprecedented levels of detail. I think the public would be much better served with concise regular English descriptions of mathematical concepts and perhaps the how and why of certain mathematical proofs. Techies need math, they don't need to try to be mathematicians.
  • Evil, evil slashdot. The server is almost dead. grrrrrrrrr..... (I'm an LFS developer)
  • When I took calculus III a few years ago, I asked the question "when is it not okay to reverse the order of integration in iterated integrals?"

    The instructor replied, "This is not something that we can or should get into."


    This is a perfect example of why we don't necessarily need to know every gory little detail of most mathematical theorems we work with. I'm not saying that there's no use to this Metamath website, just that if a complete treatment of every math subject were required at all times, then students would hardly be able to calculate anything and math would probably never get off the ground at elementary school.

    --
    Matthijs
  • I suspect plenty of students in a Calc class would be unable to identify the axioms their work rests on.

    That's why you take real analysis and topology.

  • There's this paper proposing structured proofs (http://www.research.compaq.com/SRC/personal/lamport/pubs/lamport-how-to-write.ps [compaq.com]) that might be a good read.
    In short, what the paper proposes is to write proofs in a top-down manner, so that you can easily see the outline of the proof without going into details unless you need them to understand it. He also says the method should better prevent errors. (And every self-respecting mathematician and computer scientist should know who Leslie Lamport, as well as Donald E. Knuth, is.)
  • Scientists often employ semi-empirical formulas and approximations without much regard for the underlying assumptions or physical or mathematical bases for the formulas. Something like this could help improve the rigor of work in the physical sciences, and help prevent costly mistakes (such as, showing that such-and-such experiment can't work because it's based on ideas which form an incompatible set of assumptions).
  • I'm not a mathematician. But I'm considered fairly adept at programming, because I have a strong background in that other often-overlooked area of expertise, logic. One of the things I learned when I was studying logic was not to make fucking overbroad generalizations. Such statements are easily spotted -- they include phrases like "vast majority" and "it has been my experience".

    If you were to list each programmer you've met by their name, and provide an accurate index of their mathematical abilities, and a gauge of their "script-kiddie"-ness (is there a qualitative measure of that, by the way?), and then compare it against the unstated but probably large number of programmers whom you know nothing about, then we would have a valid basis for whether or not your argument was anything but a total canard.

    Or we could just dismiss your argument on the basis of its specious generalizations. Which I usually do.
  • by Merk ( 25521 )

    I took Engineering Physics in university so I'm no stranger to advanced math. I know the fundamental theories they're breaking things down to, but this is still some hard stuff to understand. For instance:

    Axiom ax-2 4

    Description: Axiom of Distribution. One of the 3 axioms of propositional calculus. It distributes an antecedent over two consequents.
    Assertion
    Ref Expression
    ax-2 |- ((j -> (y -> c)) -> ((j -> y) -> (j -> c)))

    If the goal is to really make this stuff understandable I think they need to provide some much more basic examples in the style of: If you have 3 apples and I give you 2 more apples, how many apples do you have?

    I think for the above axiom an equivalent would be:
    j: it is raining
    y: the ground is wet
    c: there are puddles on the ground
    If it is raining then if the ground is wet there are puddles on the ground.
    If the above is true then:
    If it's true that if it is raining then the ground is wet, it's true that if it's raining then there are puddles on the ground.

    This basically says: if it's true that rain implies (wet ground implies puddles), then whenever rain implies wet ground, rain also implies puddles.
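    Something like this quick brute-force check (my own throwaway sketch, nothing to do with the Metamath site) at least confirms that ax-2 is a tautology:

```python
from itertools import product

def implies(a, b):
    """Material implication: a -> b."""
    return (not a) or b

# Try every assignment of the three propositional variables.
for j, y, c in product([False, True], repeat=3):
    antecedent = implies(j, implies(y, c))               # j -> (y -> c)
    consequent = implies(implies(j, y), implies(j, c))   # (j -> y) -> (j -> c)
    assert implies(antecedent, consequent)               # the axiom itself

print("ax-2 holds in all 8 cases")
```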

    Then again, it is a Monday.

  • no, this site is not going to help a person with no math background to understand mathematics...just like many of the "informative" links on slashdot wouldn't help a complete newbie understand any given operating system or computer language. This site is pretty cool for those who have the necessary foundations.

    but even then, mathematics is (unlike programming) one of those fields that has a threshold -- a limit to how much a given person will ever be able to understand. And that limit varies for everyone. If you have little to no mathematical background, or if you never got beyond basic college calculus, don't go expecting too much from a shortcut like this.

  • I agree with you, but I have a name to live up to.

    Seriously, what I meant was related to something one of the few good math teachers I have had told me (I'm double majoring CS/Math, so I had plenty). That is the concept of never accepting what you are told unquestioningly. This, of course, can be applied to many areas of life in general. But this is the very foundation of discovery. We take it for granted that 1+1=2, just like people used to take for granted that the Earth was at the center of the solar system, for example.

    This was all just a wind-bagged way of saying that any discovery, especially those scientific in nature, cannot be obtained by reading someone else's proofs/essays/code/whatever. Those who are not interested enough to do the proof themselves have questionable amounts to gain from seeing it on paper or their monitor.
  • Rudin's Principles of Mathematical Analysis is a great book, but only after you have studied the basics elsewhere :-). For that, the university's lecture notes have proven sufficient for me. After that, Rudin is really enjoyable reading, even if it requires some extra thought. I like rather dense textbooks. I dislike most American textbooks I've met (not math ones, though) that contain so much useless crap that they're boring to read, and it is hard to find the actual topics among all the examples and hand-waving. Teaching-by-examples especially often sucks. Examples may sometimes be good for introducing a topic, or as practical illustrations after one, but there should not be too many of them, and topics should not be disguised in the examples. For example, Daniels' Digital Design from Zero to One was awful. (Coursebook here on a course compulsory for all "IT" students.)
  • Deep Blue does not exhaustively search the entire move tree. It uses a recursive search, but trims it to manageable levels using heuristics designed by humans. For the opening, it has a database of classic chess openings to follow depending on the play of the opponent. For the midgame, it has a database of practically every master-level game on record for those common board positions. For the endgame, raw calculation is probably enough to solve it outright.

    If chess gets to the point where a computer can literally find a way for either white or black to win no matter how the other plays, before the game even starts, that will kill a lot of interest in chess. I suspect, though, that if neither player makes a mistake the game will always be a draw, like in tic-tac-toe.
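    (For the curious, a toy sketch of the search idea; the moves/apply_move/evaluate callbacks are hypothetical stand-ins for the chess-specific parts, and this is my illustration, not Deep Blue's actual code:)

```python
def alphabeta(state, depth, alpha, beta, maximizing, moves, apply_move, evaluate):
    """Depth-limited minimax with alpha-beta pruning.

    moves(state)         -> legal moves (hypothetical callback)
    apply_move(state, m) -> successor state
    evaluate(state)      -> heuristic score: the human-designed part
    """
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    if maximizing:
        best = float("-inf")
        for m in legal:
            best = max(best, alphabeta(apply_move(state, m), depth - 1,
                                       alpha, beta, False,
                                       moves, apply_move, evaluate))
            alpha = max(alpha, best)
            if alpha >= beta:   # prune: the minimizer will never allow this line
                break
        return best
    best = float("inf")
    for m in legal:
        best = min(best, alphabeta(apply_move(state, m), depth - 1,
                                   alpha, beta, True,
                                   moves, apply_move, evaluate))
        beta = min(beta, best)
        if alpha >= beta:       # prune: the maximizer has a better line elsewhere
            break
    return best
```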

    --
    "May the forces of evil become confused on the way to your house"

  • Yeah, medicine is useless.

    Do you have any idea how much memorization medical students have to do? It's why I'm not a medical student. To many it's interesting though, and it's definitely useful.

    --
    "May the forces of evil become confused on the way to your house"

  • Ah, the call of the "pure" mathematician: "You want to apply this stuff? To hell with you, philistine!"

    True pure mathematicians, unconcerned with application, are insane. They don't recognize mathematics as derived from real-world observation, they are just happy little computers, playing with meaningless symbols to solve meaningless puzzles, upset when you point out that there is no such thing as a self-evident truth to make an axiom of.

    Practical applications may not be their motivation, but it's the justification for their pay, and the origin of their field. We, the economic animals that value utility enough to put food on the table, shouldn't defer to their irrationality.

    The one advantage of using these economically insane individuals is, of course, that they come cheap. But while it makes sense to take advantage of them, we must be careful not to let them spread their insanity with their products.

    For mathematicians, give me a Donald Knuth any day, who mixes practical work and theory in his life, for the improvement of both, and is eager to change notations and rules to make them easier to work with and learn and into a better reflection of reality. He has, and spreads, a healthy appreciation of useful math, not a degenerate disdain for "applied" (sullied, tainted) math.
    ---
  • I think I'll go over to my HP Logic Analyzer and press the "Don't Care" button.
  • The original poster may well have been referring to the denser books by Rudin, like say Real and Complex Analysis or Functional Analysis, or even Function Theory in the Unit Ball of C^{n}. :-) (All of which are, nonetheless, fine books.)

  • Actually, this is the kind of thing that set Goedel off in the first place. His work proved that any such system admits true statements that you can't prove within the system.

    That doesn't make such a description language useless, just fundamentally incomplete.
  • ...which begs the question, to whom are the puzzles meaningless? Does everyone have to defer to someone else's judgement about whether their work has meaning?

    Back to the main debate, Hardy used to argue that pure math like number theory has no relevance in the applied world, but coding theory is a highly applied branch of mathematics whose roots lie in number theory that didn't have an application when it was developed. Hell, Reed-Solomon codes are solvable by a method that Ramanujan developed, with no "applied" motivation, 50 years before the codes were invented. So who are you or I to say that the pursuit of pure mathematical knowledge is insane?

    For details (ad nauseam), see:

    Levinson, N., 'Coding Theory: A Counterexample to G. H. Hardy's Conception of Applied Mathematics', 1970.

    I don't have the journal title handy...
  • I took Engineering Physics in university so I'm no stranger to advanced math. I know the fundamental theories they're breaking things down to, but this is still some hard stuff to understand. For instance:

    Axiom ax-2 4
    Description: Axiom of Distribution. One of the 3 axioms of propositional calculus. It distributes an antecedent over two consequents.


    I agree totally. I had the same problem - I knew what they were talking about, because I've taken propositional calculus, but I still spent a couple of minutes trying to figure out their obfuscated writing style. They introduce the symbols used in the proofs, but not the language they use to explain them, and if this is intended for math "laymen", well, the average joe doesn't know what an antecedent or a consequent is, folks!

    Cyclopatra
    "We can't all, and some of us don't." -- Eeyore

  • It's the big concepts that attract me to maths, the ideas and the big picture, not the nitty-gritty of putting together lots of little logical steps.

    As someone who has a BS in math and is an active programmer, I disagree strongly. The key to understanding mathematics is understanding the underlying patterns, the grammar. Reading all of those "fuzzy" math books that do not get to the nitty-gritty details just gives the average person a false sense of understanding.

    I'm so glad to see Norman Megill continue with his project. I was one of the first users of the C version of his program in '93. As someone who was struggling with Abstract Algebra at the time, his stuff breathed fresh new understanding.

    If you are the type who pulls out the assembler to _really_ understand what the program is doing... then metamath is for you.

  • I think the worst problem with the way mathematics is taught (either by a teacher or a book) is the lack of visualization. Indeed, there's an almost religious commitment to rigidity and formalism, when a lot of the time all we need is a little insight, and then all the symbols will make sense.

    Take abstract algebra, for instance. It's really arcane and odd; what makes a group? Closure, Associativity, Identity, and Inverses. Most people believe that these things cannot be visualized, and so go on and on with really abstract lines like (AB)C=A(BC), eA=Ae=A, A(-A)=e, etc. But these things *CAN* be easily visualized; it just takes the slightest bit of creativity. It's easy to teach these things with pictures, where you're just interfacing with people's intuitive understanding, rather than trying to go through a difficult semantics/symbol layer.

    I've taught young kids how to identify groups (think abelian); It's really not a problem, provided you have their attention.
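    To make the four axioms concrete, here's the sort of brute-force checker I might hand to a student (my own sketch; Z mod n under addition is just a convenient finite example):

```python
from itertools import product

def is_group(elements, op):
    """Check closure, associativity, identity, and inverses by brute force."""
    elements = list(elements)
    # Closure: every product lands back in the set.
    if any(op(a, b) not in elements for a, b in product(elements, repeat=2)):
        return False
    # Associativity: (ab)c == a(bc) for all triples.
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a, b, c in product(elements, repeat=3)):
        return False
    # Identity: some e with ea == ae == a for every a.
    ids = [e for e in elements
           if all(op(e, a) == a == op(a, e) for a in elements)]
    if not ids:
        return False
    e = ids[0]
    # Inverses: every a has some b with ab == ba == e.
    return all(any(op(a, b) == e == op(b, a) for b in elements)
               for a in elements)

n = 5
print(is_group(range(n), lambda a, b: (a + b) % n))  # True: Z/5 is a group
```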

  • Would you believe I got a Ph.D. in mathematics from the University of Michigan without ever expressing a proof in terms of axiomatic set theory? I suppose if you have some real math education, you would. I see all sorts of people posting that "this is useful only for real mathematicians." I have news for you....it's not even useful for all of us!

    If I need to know the proof of something, I go look it up somewhere that I know will have the right level of abstraction, and depend on me to know the real basics. This may involve looking up one or two subreferences, but they too will be more loosely stated. I don't need links to formal theorems that 2+2=4, thanks.

    Where I see this having some applicability (and presumably the source of Wolfram Research's ire, cited above) is with machine understanding. To the extent that it is possible to make a machine "understand" higher mathematics through this sort of reductionism, an archive like this could be rather useful. I am pessimistic about this approach, though. Did anyone notice that there are no nontrivial results in the archive not pertaining to logic/set theory? That's because of the absolutely daunting amount of work in creating such a characterization.

    I tremble at the thought of seeing, say, Wiles' proof of the Fermat conjecture in a form like this. No human mind could hold it all. So what use is this, really, to humans?

  • Has this book's copyright ever expired, and if so, did some unconstitutional law make it de-expire? The three volumes' dates of publication are 1910, 1912 and 1913.
  • Wow,

    Are there any jobs going that would require me to learn and apply this ?

    For some people jobs aren't the motivational force in their life, I know it's hard to believe, but some people value knowledge and intellectual exploration over the pursuit of money.

    That being said: yes, having a strong grounding in math does help with programming. Maybe no company would require you to learn and apply it (though I would run to a company that was more about research-flavored things like this if I could), but it could make you a better programmer.

    You summed it up nicely yourself. It does look like fun.

  • I agree; you're right, integration of knowledge is a higher-order problem. A friend of mine getting his PhD in Physics at Berkeley related a number of papers to me that said that even graduate physicists would resort back to Aristotelian models of the world (forced, natural, and animate motion, but mostly "forced motion") when confronted with problems that didn't match the ones they tackled in books.

    But I think people who had a clear grasp through intuition and pictures would be better equipped to tackle the integration challenges.

    One of my students [speakeasy.org] came to class [taoriver.net], and I asked him, "How's math going?" He replied, "Good, I just did great on a test on the Pythagorean theorem." I said, "Oh really? Did you show the teacher the proof I taught you?" He sort of looked puzzled, and said, "Hunh?" And I said, "Yeah, remember, 'Asquared + Bsquared + 2AB yadda yadda...'?" He said, "That's the Pythagorean Theorem?!"

    The thing is, he knew this proof that I had shown him left and right, forward and backwards, inside and out. We'd gone over it several times. But since I didn't call it "The Pythagorean Theorem," he didn't have that link, and hadn't linked it up.

    I also asked him, "If you have a spaceship at (5,3), and a missile headed toward it at (1,1), what's the distance between them?" He couldn't answer it. Then I gave him a triangle and asked for the length of the hypotenuse. He could do it. But he wasn't able to integrate the two ideas together until I manually showed him how. I remember having the same difficulties myself, a long time ago.
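    Spelled out, the integration step he was missing is tiny (a throwaway sketch using my made-up coordinates from above):

```python
import math

ship, missile = (5, 3), (1, 1)
a = ship[0] - missile[0]   # horizontal leg of the right triangle: 4
b = ship[1] - missile[1]   # vertical leg: 2
print(math.hypot(a, b))    # hypotenuse = sqrt(4**2 + 2**2) = sqrt(20) ~= 4.47
```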

    I think as humans, we're just really bad with our internal communication/thought and cross-referencing. It takes a certain degree of feeling like you have "ownership" of an idea, like you are holding it in your hand and are going to wield it like a weapon against all the other ideas and situations in the world. "Knowing how to get the length of a hypotenuse, how can we approach the problem of the distance between two points (positions specified by orthogonal vectors)?"

    I guess the thing is to make sure to ensure that students build a framework of interconnected ideas. I think the constructivist school of thought is a good idea; I wonder if there is a way to teach this a little more explicitly.

  • From the site (to be precise, http://users.shore.net/~ndm/java/mmexplorer/mmset.html):

    You will need a browser that supports the "font face" HTML command and has access to the Symbol font. [...] The formula "j R j" should show up as "phi arrow phi". If you see "jRj" or if you see some kind of dark diamond between two phi's then you will not be able to view these pages properly.

    This is a particularly bad way of displaying mathematical formulae, because the meaning of the text depends in a very messy way on the layout (i.e., what font it is in). It shouldn't be the case that just looking at a formula in a different font renders it completely meaningless.

    The pleasant way to use mathematical symbols online is using Unicode [unicode.org]. The unicode character set, which is supported by all common web browsers including Netscape 4, contains all the symbols a mathematician could want (indeed, arguably, all the symbols anyone could want), such as GREEK SMALL LETTER PHI, RIGHTWARDS ARROW, DIAMOND OPERATOR, LEFT NORMAL FACTOR SEMIDIRECT PRODUCT etc..

    If a browser doesn't have a particular symbol, the user will see a mark that shows that a character is missing. What they won't see is characters which are semantically different, like "R" instead of RIGHTWARDS ARROW. If the user saves the page as a text file, the maths symbols will still be present and retain their meaning.
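    A quick illustration (my own throwaway sketch): with Unicode, the character itself carries the meaning, whatever font the reader happens to have installed.

```python
import unicodedata

# phi, rightwards arrow, diamond operator -- the name is attached to the
# codepoint, not to any particular font.
for ch in "\u03c6\u2192\u22c4":
    print(ch, hex(ord(ch)), unicodedata.name(ch))
# Output:
# φ 0x3c6 GREEK SMALL LETTER PHI
# → 0x2192 RIGHTWARDS ARROW
# ⋄ 0x22c4 DIAMOND OPERATOR
```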

    For more complicated mathematical expressions, the way to go is MathML [w3.org]. However, most browsers other than Mozilla can't support this yet, though you may be able to get plug-ins. Nevertheless, anything has to be better than encoding semantic information through font choice.

  • Unfortunately, the web page I had with all the visualizations on it went down about a year and a half ago when I moved to Seattle.

    Call me up or otherwise contact me [taoriver.net] to hassle me about getting them online if you'd like to see them; I'll try remembering to dig them up.

    In brief, the binary operator looked like a doctor's stethoscope, with the two prongs attached to points, and the long prong pointing to a final point. The picture (and icon) for closure was a sort of wiggly loop (the set) with the stethoscope (3 lines and an arrow) having its two inputs coming from within the set, and the output going back into the set.

    Associativity looked like... uh... hard to describe with words. It looked like a U with a U under it, and an upside-down U with an upside-down U under it, attached in the middle; very hard to describe.

    Identity: Draw the set with all the points lined up on the edge, and take one special element and put it on the interior. That's the icon form; for the complete picture, you add in the binary operator, one end on the identity, one end on an element, and the output going to the element.

    Inverses... I'll skip to commutativity, since it's the easiest to draw; it looks like an X. The two elements on the top can switch places (the elements reversed on the bottom). It's a little rounded to show the "trajectory" of the two elements; that they cross over.

    This is not very fruitful; I'll just have to put them back online... =^_^=

  • by edp ( 171151 ) on Monday April 23, 2001 @12:12PM (#271282) Homepage

    Replying to various comments here:

    I don't see where it was suggested Metamath is for newbies, or for newbies to understand math. There are several ways in which something like Metamath can provide value to non-mathematicians. For one thing, they can use it to see that there are links from a theorem all the way down, even if they do not study all those links and comprehend them as a whole.

    Some posters have commented that Metamath doesn't make math understandable. That isn't its purpose. You can make something completely verifiable without making it understandable -- and that is the only way to work with the largest structures. A demonstration that a large office building will not fall down depends upon ensuring that each structural component does its job, and a single person might actually check each and every component and thus know the building will not fall down, but that person could still end up not understanding the building as a whole.

    While I would not recommend somebody learn C++ by studying the compiler's generated assembly language, I would recommend that an experienced expert in C++ learn more, from time to time, by examining the compiler's generated assembly language. I know that studying generated assembly code has occasionally provided a useful insight into the high-level language specification. Similarly, studying the details is a useful occasional part of a professional mathematician's experience.

    Metamath can (or will be able to) answer questions that are difficult to answer otherwise -- What axioms does (a proof of) such-and-such a theorem use? Do we know of a proof that does not use a certain axiom?

    Today's proofs are subject to errors. Occasionally an error slips by an author and the referees. Eventually, papers may be submitted with links to Metamath-verified proofs, essentially eliminating the possibility of error.

    Further in the future, databases like Metamath will aid the construction of proofs by supplying already-proven lemmas, so the author does not have to expend time proving something already published or reviewing publications to find it.

    Maple and Mathematica are useful tools for some mathematicians, but they are flawed because they have gaps. They sometimes offer deductions that are false, because they make unwarranted assumptions about domains or absences of singularities, or whatnot. E.g., some software engineer may have poured a table of integrals into the software but neglected to make Mathematica prove a table entry is applicable to a particular user's situation before applying the entry. We need something like Metamath that is complete to serve in those times when mathematical proof is desired.

    While nobody travels every road that is connected to our highway system, a key characteristic of the system that makes it useful is that it is connected. Thus each user finds a route to their destination, even though they don't understand where every road leads to or comes from.

  • Programmers are not automatically smarter than non-programmers. Programmers can often be thick as two short planks.

    Speaking as someone trained as a physicist and working as a programmer, it's a real mistake to underestimate those from the diametrically opposite disciplines.

    K.
    -
  • This is like saying the best way for newbies to understand a C++ program is for them to decompile it and look at the assembly!

    Actually, often the best way to understand a language is to look at the assembly. When I first learned C (15-20 years ago), I looked at the assembly to see exactly what the expressions meant.

    I often think that one of the reasons we see so many bad programmers out there is a lack of experience in assembly. The way software curriculums are laid out is ass-backwards. They start out with the very highest level of abstraction and work their way down. No other engineering discipline does it that way. Electronics, for example, starts with the fundamental components: capacitors, resistors, etc. If electronics were taught the same way, students would start with plugging cards into PCs!

    Someday I hope we start teaching software students correctly by starting with assembly language and working their way up.
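    (The same exercise carries over to high-level languages, too. A quick sketch of my own using Python's built-in disassembler to peek at the "assembly" the interpreter actually executes:)

```python
import dis

def f(x):
    return 3 * x + 1

# Prints the bytecode: the loads, the multiply, the add, the return.
dis.dis(f)
```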


    --

  • Visualization and "understanding" are all well and good, but visualization and human thinking are prone to certain types of error. There is value in obtaining correct results without those errors, and the only certain way to do that is with rigidity and formalism. Therefore you must have something like Metamath in mathematics. Not for everyday use to be sure, but available when needed.

  • "Learn the definitions. Memorise them! That's ninety percent of the work."

    Get a grip on yourself!! I too have a math degree, and if you spent 90% of your time memorizing definitions, I am not surprized you have difficulties seeing how they relate to their uses. Mathemetics does require solid knoledge of its foundations, yes, but that does not mean memorization, it means understanding, and there's quite a bit of difference between the two. Writing proofs is not "easy." (t is fun, however) Otherwise you would not have entire courses devoted to teaching how to do proofs (and yes, dear, i am talking about graduate level courses.). Writing proofs isn't a matter of knowing the mathematical shorthand. And a single theorem's proof can contain within it innovations whence spin of entire field of study. Learn the history of mathematics, and you will see this. Look at the many popular unproven theorems, "fantastically easy"?!?!?! How many hundreds of years were spent on the 5th Postulate? Math is proofs!

  • The point was that if they aren't concerned with utility, it doesn't matter to them whether the problems they solve allow nuclear reactors to be designed, have no apparent use, or improve network communications - it's just another meaningless, but interesting, puzzle. While they shouldn't necessarily be concerned with applications themselves, they should be making their solutions as easy to absorb as possible for those who will make practical use of it.

    To me, "you should pay me because someone will probably come up with a use for what I do, even if I don't know what" is a lot more acceptable attitude than "you should pay me because I am doing something that I appreciate aesthetically, despite the fact that you don't understand it and thus can't appreciate it, because of your bad taste and low intelligence."

    I agree that every branch of math that solves new problems (no matter how abstract) is likely to have applications, but that doesn't excuse hostility to application-centered minds, which I see implicit in the structure of educational materials. Applied math isn't naturally as hard as they make it, by teaching it through proofs.

    They are producing new math, which is good, but they are obfuscating it and the old math (into forms suited to the purposes of researchers, not students or practical users), which is bad. We should take care to only allow the good out, and thus maximize their utility, because they don't care to do it themselves.
    ---
  • That's completely pointless, though. The reference to source code is not apt. The "source code" is already there, and it is for people who have the capability of using it (mathematicians). Metamath translates mathematical proofs into a lower-level "language"; this would be like translating source code into (say) BASIC. Is that useful? Are there people who are going to be willing to try to understand the (now much more voluminous) source code when translated into BASIC who do not understand higher-level languages? I doubt it. Translating highly dense source code that makes use of many advanced programming techniques, libraries, specific aspects of the hardware environment, etc. into kajillions more lines of generic, easier-to-read pseudo-code (or whatever), so that someone can hypothetically read the "source code" and see exactly how it works without understanding higher-level programming concepts, seems like a waste of time. If you want to explain to someone who doesn't have the necessary background in a subject how something complicated works, you do so in English; you don't try to explain every little detail. If someone has a burning need to understand every little detail of the proof of Fermat's Last Theorem, or the proof of the Fundamental Theorem of Calculus, etc., they should become a mathematician.
  • <laugh> I wish I got to see it!

    I'd never heard of "Category Theory" [stanford.edu] or commuting diagrams [mq.edu.au] before; thank you for the reply!

  • Absolutely; but the general trend is to overfocus on the formalism, and then to skimp on (or even discredit the idea of) the pictures.

    We need both the formalism and the intuition for a healthy mathematical education.

  • > So it's kind of nice to see this here, but it's
    > not the face of mathematics I'd choose to
    > present.

    Isn't it wonderful that there are so many different learning styles, though?

    This is sort of what I was wishing for, all those years of putting up with math books that handed me the formulas but failed to explain why they worked. I had to reverse-engineer math all on my own through school.

    Of course, after that reverse engineering I usually had a better grasp on the concepts than the teacher.


    rark!
  • by Anonymous Coward
    I am the author of the story and the person responsible for the Metamath site. Here is my response to some of the comments.

    This is like saying the best way for newbies to understand a C++ program is for them to decompile it and look at the assembly!

    The choice of title for this story, "Learn The Language Of Math," was unfortunate and was the Slashdot editor's, not mine. Indeed I say "Metamath does not claim to teach you mathematics, just as reading the kernel source code does not teach you how to use Linux" which directly contradicts the title.

    Metamath is definitely not for everyone and is no substitute for the way one normally learns mathematics. It is meant to amuse you, amaze you, and possibly enlighten you in its own special way. I thought it would be of interest to Slashdot readers. Come on now, wasn't it?

    People interested in formalization and verification of math should also check out Mizar [mizar.org], which has vaguely similar goals but (IMO) requires a higher degree of mathematical maturity to be able to follow its proofs. It tries to mimic mathematical proofs the way they are normally published, whereas Metamath is more like a machine language, showing you every nitpicking detail. On the other hand, Mizar has many contributors and its scope is much broader. Mizar is a language intended for real mathematicians, not armchair ones.

    I see a lot of overlapping work with Symbolic Computation Group's Maple.

    There really is little overlap. Maple and Mathematica have no concept of mathematical proof. They are tools that help you generate mathematical results, but they do not verify them rigorously -- you are at the mercy of whoever wrote the specific modules being used, and there have been some glaring bugs in the past. A "bug" is impossible with the Metamath scheme, unless you state your starting axioms incorrectly. They are also limited; for example, they cannot derive existence results that depend on, say, the Axiom of Choice.
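
    To illustrate the compute-versus-prove distinction, here is a minimal sketch using the sympy library as a stand-in for a CAS: it happily produces an answer, but there is no derivation for you to audit.

      import sympy as sp

      x = sp.symbols('x')
      # The CAS *computes* the Gaussian integral...
      print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))  # sqrt(pi)
      # ...but offers no proof object; you trust whoever wrote the module.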

    Congrats... We've been spammed. Instead of mentioning money to get our attention, "Linux" was the repeated hook. Gee Norm, If you're going to pull the wool over the Slashdot community's eyes, the least you can do is post anonymously.

    I made no attempt to hide the fact it is (mostly) my project. I put my name as the author of the story and my name is on the web site. Is there a rule that says one can't discuss one's own work? It should be considered on its own merits, perhaps more suspiciously than usual since I am the author. I certainly derive no financial benefit from my site. I mean, is it really worse than the stories that tell you how to hack a ball-point pen?

    As for Linux being the hook, I think the kernel source code analogy is a good one, to suggest specifically that Metamath is not the best way to learn math (contrary to the story's unfortunate title). Instead it is like exposing what is under the hood, very analogous to seeing the source code for a program that otherwise is like a black box.

    MetaMath people, how about putting some smegging commentary and descriptions on the proofs? Plus, why not have each symbol hyperlink to its definition? It's in hypertext, so why not use it?

    I attempted to put a meaningful comment in each of the 3000+ theorems, with cross-references to textbook theorems and exercises whenever I could find them. The textbooks should be the real source if you really want to learn at a high level, with the Metamath proof as a supplement in case you get stuck. Metamath is not intended to be a stand-alone teaching tool, although a few people seem to have attempted to use it as such. As for commenting the proof steps: they are unimportant other than to demonstrate how the theorem was arrived at; by drilling down you can convince yourself to any degree of satisfaction. As for the symbols, there is a "syntax hints" section underneath each proof that explains them.

    The pleasant way to use mathematical symbols online is using Unicode.

    Thanks, I will consider this. A few years ago, when I started, Unicode was not supported by most browsers.
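
    (For what it's worth, the common logic and set-theory symbols all have plain Unicode code points; a quick illustrative snippet, in Python simply because it is compact:)

      # Print a few Unicode code points covering common math notation.
      for name, cp in [("FOR ALL", 0x2200), ("THERE EXISTS", 0x2203),
                       ("ELEMENT OF", 0x2208), ("RIGHTWARDS ARROW", 0x2192)]:
          print(f"U+{cp:04X} {chr(cp)}  {name}")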

    --Norm Megill

  • It would seem more likely, but there are many simple games you could devise where the second player can force a win (a quick sketch of one appears below). I had problems like that in "math challenges" back in high school.

    --
    "May the forces of evil become confused on the way to your house"

  • Yes, a reductionist does not normally make a good expositor and it is just the *wrong* way to think about mathematics, but different types of exposition are required for different people. There will be a few non-math people who will understand more via the reductionist approach.

    Example: Freshman year of undergraduate, I was forced to explain object-oriented programming to a friend by telling him what assembler would be produced by C++. Fortunately, he eventually "got better" and now understands things.

    Now, a copy of the Encyclopedic Dictionary of Mathematics (highly non-reductionist) with hyperlinks would make a great source of entertainment for math majors.
  • Reverse engineering math is one thing, but an extreme reductionist approach is another. Suppose that someone said "Physics is easy, just start with waves and particles and the following mathematical models of the four fundamental forces". You're not going to get very far, and what you'll learn is not going to help you synthesize new knowledge.

    I agree that different approaches are a good thing. But there's a reason _NOBODY_ studies axiomatic set theory (in sufficient detail to describe mainstream mathematics) before studying fractions. Do you really know what a Real number is? Yet it's very likely you've gotten along just fine without that knowledge.

    While I am familiar with one method of defining Real numbers, I don't think of real numbers in terms of their definition unless absolutely necessary (i.e., only for certain classes I took a while ago). Thinking of Real numbers as Dedekind cuts, or of Integers as recursively nested empty sets, rarely does anyone any good; a sketch of those constructions follows below. Furthermore, these definitions were created to formalize preconceived notions of what Real numbers and Integers were, for the rare cases that demanded such formality. Humans used Integers for a long time (multiple thousands of years) before creating a formal definition for them.
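
    For reference, here is a rough sketch (in LaTeX notation) of the standard constructions just mentioned: the von Neumann naturals, and a Dedekind cut pinning down one particular real number.

      % The von Neumann naturals: 0 is the empty set, and each
      % successor adds the set itself as a new element.
      \[ 0 = \varnothing, \quad 1 = \{\varnothing\}, \quad
         2 = \{\varnothing, \{\varnothing\}\}, \quad n+1 = n \cup \{n\} \]

      % A Dedekind cut presents a real as a downward-closed set of
      % rationals with no greatest element, for example:
      \[ \sqrt{2} = \{\, q \in \mathbb{Q} : q < 0 \ \text{or}\ q^2 < 2 \,\} \]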

    Oh, and did I mention that the formal definitions of Integers I'm familiar with require Peano arithmetic or similar, which in turn requires a knowledge of certain types of formal logical systems?

    Proofs are going to get extremely long if every occurrence of Modus Ponens is labeled...

    -Paul Komarek
  • Alright then, I assume you know the definitions of continuity, limit/cluster points, compactness, and the statement of the Heine-Borel theorem. Prove the Heine-Borel theorem for me without reference to texts. I'll give you a hint -- there's not a great deal of "lateral thinking" involved, just use the stuff I've already mentioned (unless you want to drop down to set theory first...).

    Maybe you're so super-smart that you'll get this "easy" proof straight away. It took me two weeks, and was anything but easy. It was gratifying, though. =-)
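
    For reference, the statement being asked for, in LaTeX notation (the standard version for R^n with the usual topology):

      % Heine-Borel: compactness coincides with being closed and bounded.
      \[ K \subseteq \mathbb{R}^n \text{ is compact} \iff
         K \text{ is closed and bounded} \]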

    -Paul Komarek
  • "After all, much of what mathematics is really about is building abstractions on top of each other."

    I will have to disagree with that point. Mathematics isn't about building abstraction on top of abstraction, but about finding the simplest abstraction possible for a given concept. Any single topic in mathematics is worth a semester of study, but if you spent a whole semester studying one proof, it must not have been that elegant.

    bash-2.04$
  • I think you're bordering on a simple, clear point. However, you didn't spend enough time simplifying your post to make it clear. Since I've spent a lot of time thinking about it, I'll put some words into your mouth and hope we're talking about the same thing.

    "Mathematics without effective communication of mathematics is useless."

    Finding a succinct statement like that requires a lot of work for me, but saves a lot of people a lot of time. Just like what you're complaining about in your post.

    Effective communication of mathematics is extremely hard. In fact, it may be reasonable to argue that mankind's primary progress in mathematics has been the codification of some basic mathematical tools like limits and sets. The communication of mathematics has been helped by things like the printing press, LaTeX, and now the ability to easily share PostScript via the internet. But I think it is reasonable to argue that finding useful codifications of mathematical ideas is a fine way to describe the work of mathematicians.

    If we start with Egyptian fractions in 3000 BC, humans spent 5000 years of work to arrive at the realization that extreme formality may have occasional uses. Deciding when formality is needed, and how much, depends upon your assessment of your target audience. After deciding this, you still have to convert your ideas into language for your audience. This can take a lot of time, and affects your productivity in other areas. It's a difficult trade-off to make.

    That said, my post is now long and rambling. I could go back and edit it, to help people read it. Or I could work on a device driver problem I'm thinking about. Since this is my fourth post on this article (sorry, everyone ;-), I think I better cut my time losses and not edit this post.

    -Paul Komarek
  • "Actually, often the best way to understand a language is to look at the assembly." I'm sure you're aware that this is only useful if you have knowledge of computer architcture at some level.

    Consider your electronics analogy. Are capacitors and resistors really the fundamental parts? Or is the drift speed of electrons through ceramic materials a better place to start? With respect to this article, the real question is where to draw the line for mathematics. I think most practicing mathematicians or mathematically oriented science professionals would suggest that Metamath draws the line at the "electrons in ceramics" level, rather than at the capacitor-and-resistor level.

    -Paul Komarek
  • Somewhat conversely, I suspect that calculus students would most often not be able to spot the mean value theorem if presented with a highly detailed proof of it.

    -Paul Komarek
  • "I'm no genius, but was there really any reason to try losing his students when the triangle wasn't the base of his lesson plan?"

    I think the professor in your story didn't have a lesson plan. You describe incompetence and laziness, not a problem of high-level abstraction!

    -Paul Komarek
  • I don't mean to speak for the creators of Metamath, but I think the coolness of their project has little or nothing to do with pedagogy.

    -Paul Komarek
  • There is no way to short-circut the "years of hard work" involved in understanding higher-level mathematics. I know it's the holy grail of CS geeks everywhere to streamline the hard work of understanding proofs, but mathematical decompilation is not the answer.

    Actually, I had to do exactly that for a friend of mine during freshman year of undergraduate, but he was an *exceptionally* weird case. Plus, he now goes about learning things the "right" way.

    Anyway, the reductionist approach is good for cracking one or two hard nuts, like my friend, but it is not mathematics, and I would claim that you do not really "understand" anything from the reductionist approach. Reductionism should only be used by expositors when they feel they need to bully/force someone to open their mind.
  • It is about time someone tried something along these lines. Assume that we put *all* known math proofs into a single database. Imagine what a power tool that would be for mathematicians! (well, once they get used to the idea :-)


    - It would allow you to really check your new earth-shattering proof. No more working on one for years, presenting it to the world to great acclaim, and withdrawing it a year later because somebody finds that tiny flaw (as happened with an early announced proof of Fermat's Last Theorem). Instead, just feed your result to the database, hit "verify", and presto! A clear yes/no answer (and a pointer to the problem step in the proof if it's a "no"). Keep in mind that *verifying* a proof is trivial work for a computer (see the toy sketch below) - it is coming up with one that takes genius.
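
    A minimal sketch of how mechanical such checking can be: a toy verifier for proofs in which every line must be a premise or follow from two earlier lines by modus ponens. (The formula encoding here is invented purely for illustration.)

      # Formulas are tuples; ('->', p, q) encodes "p implies q".
      def check_proof(premises, proof):
          derived = []
          for step in proof:
              justified = step in premises or any(
                  maj == ('->', mnr, step)
                  for maj in derived for mnr in derived)
              if not justified:
                  return False  # this step has no justification
              derived.append(step)
          return True

      # From p and p -> q, the three-line proof of q checks out:
      print(check_proof({'p', ('->', 'p', 'q')},
                        ['p', ('->', 'p', 'q'), 'q']))  # True
      print(check_proof({'p'}, ['q']))                  # False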


    - It would also make it easier for you to construct new proofs. The computer would be able to automate much of the "dirty work" involved. Yes, math is an art, but even in the most inspired proof you have to laboriously construct proof steps that are "obvious", boring, and vital (for ensuring the proof does indeed work). A computer would be able to fill these in for you, using a not-that-bright theorem prover. Everyone doubting the usefulness of this should ask grad math students about the assignments they get from their professors - physicists too, come to think of it :-)


    - You could also use spare CPU cycles globally to look for proofs of interesting theorems - something along the lines of the seti@home project. Brute force *does* work, if you have enough of it... and, of course, it would give the AI people a wonderful playground for trying out their stuff. This is a much tougher problem than chess - but a much more useful one to solve.


    Of course, it would be a whale of a project. It would take some serious commitment from math departments... So the best way is to start with some specific sub-field of math, which the Metamath people have done. Moaning that this isn't "all math" yet, or that it isn't "the most important part of math", is silly. You have to start somewhere. Their web site states they are already expanding the project to other fields.
