
Comment Embarrassingly amateur (Score 1) 630

It's as if neither Kuhn nor Peirce had ever written a word, this book. You could only have written it by wilfully ignoring every philosopher of science except Feyerabend. What about Lakatos, Musgrave, Hempel, Hanson, Popper, Latour, Laudan, Thagard? Just to name the A-list. And by being completely in the dark about the induction/deduction/abduction division pioneered by Peirce and followed through by many logicians since. And it leaves out the roles of Bayesian reasoning and modal logics (non-standard logics in general). Or the Feyerabend of induction, Taleb. It's cringeworthy in its bootstrapping approach to the history of science and reasoning.

Given that this is Slashdot there is no need to explain NSLs, BBNs, black swans etc., as they are frequent topics here, but most people don't care about philosophy of science, so here goes:

The big mystery in philosophy of science is how measurements and small theories continue to work through periods of high-level conceptual change. Since the pioneering work on conceptual modelling in science by Karl Pearson, there has been an understanding that scientists model a simplified version of the world (an experiment taking in everything is impossible, so the working scientist chooses what is important and ignores the rest). Pearson's Grammar of Science is at archive.org and still worth reading after a century. The formation of these conceptual models of the world plays a vital role in any scientific work, and the rules by which they are constructed, and verified, are paramount even in the "soft" sciences such as history or sociology. Information science works with a conceptual model of the world that is pretty shaky and periodically revised - google Design Science for the "problems at the core of IS" stuff.
The problem (completely missed in this Randroid text) is that these models are constructed within an epistemological worldview, and that there are periods when the experimental evidence goes against the groundwork of the model-forming done by participants in that worldview. That's what's going on in all of the historical moments this book brings up. And it is while collecting data (which is pretty much what they are talking about when they use the word induction) that these conflicts are noticed.

Thomas Kuhn brought the historical treatment of scientific method to the fore in philosophy of science, beginning with a detailed examination of what actually happened in periods of such change, in particular the Copernican Revolution. You can read the scholarly debate in Isis if you have a JSTOR subscription where you work. This caused quite a flurry, as he was seen to be implying that the role science took for itself as absolute arbiter of truth was rocky (he wasn't). People wrote rejoinders to his work one way or another, and Feyerabend's writing and teaching (which was more or less continued in a pragmatist fashion by Laudan) can be seen as an extension of that kind of questioning - e.g. if science is so progressive, why was the ether taught in physics books?

Kuhn's work has been carried forward in the direction he took by Paul Thagard, who realised that such conceptual revolutions are going on in everyone's life all the time. Studies of how children (and adults in some cases) learn about astronomy show that in many ways the history of science is recapitulated in the mind of the individual. His book Conceptual Revolutions revisits some of Kuhn's cases and looks at how the recreation of the conceptual framework is done in a way that permits the work of science to continue.

Given that Feyerabend stopped teaching in the late 1980s, that tells you something about the currency of this book's scholarship. And turning from the renowned anarchic epistemologist to the Randian philosophical little-leaguer Peikoff beggars belief.

Comment the idea of an album--shellac (Score 1) 502

The album concept derives from an album of 78s, with as many as twenty discs per set. So the album structure originally consisted either of the art of anthologizing pieces that would fit on a 12" piece of shellac, or of finding the right point to break a 20-minute piece of music into chunks that would keep the tension going while you flipped the disc.

I know it's kind of the ultimate in dead media for people dropping the CD for music files, but there was an amazing art in finding the right break point. Now you can get the great performances of the past digitally spliced together so that you don't even notice it happened. People who like rarer classical pieces will understand the skill it took to chop up Warlock's The Curlew into such pieces, but if you listen to the contemporary National Gramophonic Society recordings it somehow works. For an example of a more commonly known piece, the original release of Gershwin's Rhapsody in Blue with the Whiteman orchestra (featuring the soloists the jazz cadenzas were written for) has the worst fadeout of all time, utterly anticlimactic.

Comment OT Emendation Re:Audio/Videophiles Beware (Score 1) 397

"Mother of" is a bit of a hack phrase, but if she is a "mother of" anything, she has as much right to be called the mother of FORTRAN, since she was so heavily involved in getting the higher-ups at UNIVAC to accept symbolic compilers, and pushed so hard for this kind of formula-translation approach to coding.

She was a force behind both the A and B compilers at UNIVAC starting in 1951. The A-series compilers were mathematical and led to ARITH-MATIC and MATH-MATIC, and were a tributary to FORTRAN, while the B series led to FLOW-MATIC, which was one of the ancestors of COBOL. (Both were based ultimately on Mauchly's ideas for the ENIAC, filtered through Schmidt et al. in SHORTCODE.)

The reason the DoD put her in charge of the COBOL project was that she had already garnered an impressive reputation in compiler development. Since COBOL is a constant object of ridicule for a lot of slashdotters, it's important to remember how wide her accomplishments were.

Comment In context of history of languages (Score 3, Interesting) 187

CLU drew on the lessons learned with both Alphard and VERS. Alphard was from CMU, written by Wulf and Shaw, Wulf also being the author of the famous BLISS. VERS was made by Jay Earley, whose parser was hugely important in all the compilers of the time. The language itself (and its V-graphs) was heavily influenced by the mem-theory of Anatol Holt, who was on the Algol Committee and was a principal in designing the astonishing GP and GPX systems for the UNIVAC - the first languages to explicitly feature ADTs per se. That became ACT, the adaptable programming system for the Army's Fieldata portable computers (portable in a completely different sense to the modern usage). He also hated Unicode - but that was a rival programming system back then, so reading the reports of the time can be misleading: "don't use Unicode on Portable Computers!" Holt's ideas permeate computing; the notion of making any system of data representation as abstract as possible goes back to him.

CLU was written using MDL (pronounced "muddle"), a proto-replacement for LISP which featured ADTs. MDL was co-written by Sussman of LISP fame as a basis for PLANNER, which became Scheme, and perhaps more geekily interesting is that it was also used for writing ZIL (and if you don't know about ZIL, you shouldn't be reading Slashdot).

CLU evolved into Argus, but the ideas were also used in the Theta programming system for the Thor OO database, and in PolyJ, which was (as the name suggests) a polymorphic Java.

Another fascinating development of the CLU ideas is the SPIL system that Liskov co-wrote at the USAF-sponsored MITRE corp, which was in turn used for writing the VENUS operating system.

Liskov has pioneered the notion of abstraction per se in language design for 40 years, and this generics-based approach is now taken for granted. She fully deserved the award, for her insights as well as for her determination in fighting the reductionism represented by some previous recipients (e.g. Dijkstra), though opposed by others (e.g. Iverson).
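To see why that abstraction work is now taken for granted: CLU's "clusters" bundled a hidden representation with the only operations allowed to touch it, parameterized by type - essentially the generics every mainstream language now has. A minimal sketch of the idea in modern Python (illustrative only, not CLU syntax; the Stack name and its operations are my own choice of example):

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    """A CLU-style parameterized ADT: the representation (a list)
    is hidden, and clients only use the declared operations."""

    def __init__(self) -> None:
        self._items: List[T] = []  # hidden representation

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def is_empty(self) -> bool:
        return not self._items

# The same cluster works for any element type:
s: Stack[int] = Stack()
s.push(1)
s.push(2)
print(s.pop())  # 2
```

The point of the CLU lineage is that clients can be checked against Stack[int] versus Stack[str] without ever seeing the list inside - the separation of specification from representation that the comment above credits to Liskov (and, further back, to Holt's push for abstract data representation).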

I have extracts from reports for all of these languages on the HOPL website HOPL.murdoch.edu.au (too many URLs to paste in individually). Find CLU at http://hopl.murdoch.edu.au/showlanguage.prx?exp=637&name=CLU and follow the genealogy links. And if you haven't yet seen my 4000-strong programming-language family tree, it's worth printing out as wallpaper if you have an A1 plotter.

Comment Why no JOSS? JOSS was ten years earlier (Score 1) 188

The origin of the first personal interaction system is surely in the mind of Cliff Shaw, a JOHNNIAC programmer at RAND in 1954, as put forward in the Intelligent Assistant program with Newell et al., which resulted in JOSS - the JOHNNIAC Open Shop System. This was the first hackish system, with users describing it as compelling and addictive. Ted Nelson and Alan Kay both credit JOSS as a major inspiration. JOSS was the first system distributed to ordinary users, the first system with online help, the first with a graphical interface (GRAIL), and the first with hand input (the RAND Tablet), ten years before Engelbart.
Interestingly, it was also the first recycling of a computer, taking a Princeton-class computer and giving it a workable life into the mid-60s.
JOSS was an amazing system, influencing all the major interactive systems and inspiring the Lincoln Lab work of Sutherland, the time-sharing system at Project MAC, the BASIC project at Dartmouth, the AMTRAN maths systems, LCC - and many, many more.
