NASA

Journal: The end of the Space Age

Journal by JetScootr
Today, I saw that the space age is really over. I was driving through Nassau Bay, on the opposite side of NASA Road One from the Johnson Space Center, and saw they were gone: the dozen or so office buildings that had housed the space program contractors since the very beginning. Although NASA gets almost exclusive credit, much of the space age happened here: Martin Marietta, Lockheed, Rockwell's Rocketdyne, and a score or more other contractors who were the backbone of America's push into space. Humans got to the moon because of what happened here, just as much as because of what happened across NASA Road One. My father worked in one of those buildings, and virtually all of Mercury, Gemini, Apollo and Skylab were made possible by the scientists and engineers and managers who worked there. Now it's all empty fields being prepped for new construction. A few miles down NASA Road One is a historical marker commemorating the Japanese farmers who lived here a century ago. But no marker stands where humankind planned the conquest of space.
There will be new contracts, new programs, of course. Eventually, we may get back to the moon. My experience suggests it won't be in the next half century, not meaningfully, any more than our reaching the moon had real meaning a half century ago. And it won't be the same culture that does it. The space age is over, although humanity's use of space has just begun.
It's also the end of my own space age. I have left NASA after 29 1/2 years, driven out by contract changes and my own personal growth and progressiveness. I no longer have the heart for government work, to try to "fix the system from within", etc. I don't know what's ahead specifically, but I'm sure I'll have more energy and enthusiasm chasing dreams that I had almost forgotten rather than those the nation has forgotten.
User Journal

Journal: My latest conspiracy theory

Journal by JetScootr
This one'll never get the attention it deserves. What's it got? It's got: gubmint involvement; the Loch Ness monster; the oppression inherent in the system; etc.
What brought this revelation to me was the Bloop. To digress a little: the Bloop was a noise recorded in 1997 that seemed biological in origin, yet was heard over a range of 3,000 miles. No one knows what caused it.
Fads and fashions come and go. When I was a kid, ESP was a fading fad, and pyramid power, the Bermuda Triangle and the Philadelphia Experiment were rising stars. They faded too, replaced by crop circles and I don't know what all other stuff. When I read about the Bloop for the first time this week (how did I miss it for 10 years?), and saw that it was freely and completely described on a gubmint website, it dawned on me that the gubmint has a vested interest in stuff like this.
Area 51 is a red herring. If you were running the Pentagon's black ops, would you develop flying sharks wid friggin laser beams in the one place on Earth that is most watched by the most whackos who are the most likely to publish their findings? No, you'd act as if you were, and then develop the really cool stuff somewhere else.
It's all about distraction, like a stage magician. The gubmint scours all of science and tries to find inexplicable, but clearly documented phenomena to feed the fringe market. It keeps the most annoying propeller-beanied/hyper-mouthed/camera-toting twitch-eyes out of the way. While they're chasing fake ufos, aliens and secret aircraft, the real ufos, aliens and secret aircraft are worked on unnoticed.
Anyone who realizes the gubmint is oppressing lunatics in this way won't be listened to; the lunatics won't admit they've been so thoroughly snookered again, and reasonable people will think, "Here we go again with another conspiracy theory."
That's why no one will comment on this here, and I can post this without any fear of
Security

Journal: Quantum entanglement and encryption.

Journal by JetScootr
Something has occurred to me while reading this Ars article on the "quantum crack that wasn't".
Everyone is gaga over the 'fundamental security' of quantum encryption systems, due to insurmountable quantum effects.
My thought is based on 'scientific philosophy', that is, unproven conjecture that sometimes leads to enlightening insight or new principles.
Larry Niven's Ringworld science fiction series (and the shorts that led up to it) includes a perfect force field as a plot device. It is perfectly reflective and perfectly immune to all effects of this universe; hence, not even time passes inside it. In one story, live aliens from a billion years in the past come out of two such deactivated fields (see "World of Ptavvs"). Here's the thought Larry expresses: everything in this universe affects everything else, however minutely, so the only perfect force field is one that puts the protected object(s) outside the universe.
The universe can be viewed as a massively entangled set of particles whose basic 'knowability' results in the flow of time. Boy, I hashed that thought something awful. That is, it is impossible to quantumly entangle just two photons - the whole universe is involved, and can't be excluded. My conjecture: eventually, a way to 'add' an entangled particle to a quantum security system will be discovered, enabling nearly undetectable eavesdropping on a quantum data stream. "Nearly" undetectable, because the eavesdropper will be vulnerable to the exact same effect.
And so the crypto-security arms race will continue forever...
User Journal

Journal: tref naut loff

Journal by JetScootr

lspr wevd kksi laoz plrt ddkv kmwe llad crwt mdr4 lsdw
7
sld kkd rem ddq idx bir xoo soq pas zmx dkw riu jnk
pho dkq mnd awk mdn wic kdr mny lse oib uyq xdp lkd

Debian

Journal: The Idiot Speaks

Journal by JetScootr
Here's a larger scope view on my rants here:
This is a journal of my switch to Linux, and it is much more toxic than I really feel (the journal, not Linux). I am deliberately NOT filtering my rants, to save time and to get real usability data in usable form, with the facts of my experiences. I'm also collecting hardcopies of the website help I get, lists of sites used, etc. I will stick to facts and my own opinions, but will also collect hard data for anyone interested enough to fix whatever it is I gripe about. Hopefully, at least, I can collect it into a useful form afterward.
My personal user history with MS:
  • 1979-1981 Apple ][ computer, "Applesoft" basic, didn't like it. LOVED the "sweet-16 interpreter" and Integer Basic.
  • 1981-1991 Ms-Dos 2.11 thru wudevver. Win 3.1 thru wudevver. Wudevver.
  • Mid-1980s I was getting to dislike Windows not doing things my way. I also distrusted the MS software contract, but didn't really get the bigger freedom picture yet.
  • 1991: Linux, and the freshening breeze of hope. I still hate GUIs.
  • 1996: Win95, and I almost liked Windows. Visual Basic, and I did. But after this, I increasingly disliked & distrusted MS and Windows.
  • 1997: I considered eliminating MS from my home computing setup.
  • 1998: I committed to doing so.
  • 2008: I swiftly put in motion plans for the switchover.

These rants are my stream-of-consciousness diary of doing so.
I'll be setting up mythTV and the rest of a home entertainment system by year's end. I'll be switching entirely to *some sort* of HDTV for zombiemode entertainment. I'll be finally completely setting up a home network to MY specs, not MS's. This twin-Dell ubuntu process is the first step. What I learn here will be used for the rest of my home.
That's why I'm doing this.

Debian

Journal: Well, yes I am an idiot. Sometimes.

Journal by JetScootr
Reviewing what I wrote yesterday, I realized I was way overharsh towards the creators of ubuntuguide.org and ubuntugeek.com.
My rant was aimed at them, but really, my piss-offedness has to do with the overall new-guy-to-Linux situation. Their efforts are not to be despised, even if less than what I needed. To those I hit with this, I apologize.
A new guy to Linux shouldn't have to dig as hard as I've dug, just to get the compiler to work or to find out how to add stuff to the box.
richie2000 - thanx. In my next post, you'll see I discovered the info you posted. However, I absolutely dispute your statement that "Universe/Multiverse" are "defined" on the repository add/edit screen in Synaptic. They ain't, at least not in 7.10, which is the only version I've got. The fact that they weren't is why I went on the info safari in the first place. I saw all this cool stuff in 'universe' but... well, I'll explain in "Idiot part II".
The reason I'm ticked is cuz, like the title says, I'm really NOT dumb. I'm very good at figuring things out. Read what I said I've done, and add this to it: I've never finished even one year of college (background: I grew up in a 3rd-world state [Texas], and had to deal with poverty and severe ADHD / HFA).
If I'm having trouble with this, how's "Joe Sixpack" gonna be able to make the jump?
I'm hoping by my rants here to aid myself (and others) in nailing down what went wrong, systemically, so when I join the F/OSS team for real, I'll have a worthwhile target to aim at.
Realistically, even though you mean to help, richie2000, I'll ask that your comments NOT include guidance on what to do next - I want to find ALL the beartraps the hard way, the way a person not like me will. So far, my list includes: a shopping-mall 'you are here' kind of map that shows *any* newbie, of any skill level or goal set, HOW to get where they want to go. This information should cross all company, distribution and application boundaries. For example, now I'm gonna have to find out the scope of the term 'universe': just Ubuntu? Or all Debian distros? Or all Linux distros? (Just an example - don't tell me!)
The list also includes: setting up a developer's system with the normal libraries. I ultimately did get gcc to work, but there are no X headers that I can find anywhere on the box. Also, the Nautilus 'search' tool has never found any file, even when the file is in the current directory and I use every fricken letter and no wildcards - what's up with that? Searching for "hw.c" should be able to find a file called "hw.c" in $cwd. ('hw.c' is my 'hello world'.)
Debian

Journal: I'm Not An Idiot

Journal by JetScootr
I'm a programmer. I've written websites, databases, device drivers, network package distribution managers, compiler build systems, process automation systems, AI apps for analyzing discrete realtime systems, real time kernels and debuggers, and more. Languages used in the last 10 years alone include C, C++, HTML, SQL, CSS (three entirely different languages with the same moniker), ECL, JCL, FORTRAN, Radial, 3 languages I created myself (oops, add them to the first list), assembler, Pascal, Visual Basic, JavaScript, csh, sh, and probably others I don't recall. I've worked on various flavors of Unix, Univac's Exec, MVS/JES3, Concurrent's OS/32, DOS 2.11 thru the latest Windows, Gould, Chronos (remember that?) and others. No false modesty: after 30 years, I'm more experienced and skilled than most of the people using Linux. And I'm not so old that I'm afraid of new tech.
I'm in the process of retooling my personal net to Linux. Microsoft will have no home in any device I own when I am done. And here's why I'm frustrated and pissed.
The first of the new hardware: Two Dell desktops with Ubuntu 7.10.
Day 1:
Get'em out of the box and plug'em in. The hardware works. I expect nothing less from Dell, having used their products before.
The only docs I get with the boxen are legal disclaimers and 'plug A into socket B' assembly instructions. Also expected: Dell has been dragged kicking and screaming into the Linux market, and is still pouting like a two-year-old about providing non-Microsoft solutions.
Dell's attitude towards Linux has been approximately the same as the RIAA's attitude towards P2P software, although Dell tries to conceal its disdain with marketing double-speak. Note to any Dell droid reading this: you ain't fooling NObody.
The *ONLY* reason I bought Dell was cuz I wanted the hardware to work without me having to do any thinking or research. Right now, I want to focus my efforts on Linux itself. When I build mythTV and the rest of a home entertainment network, I'll be more discerning about getting open products and vendors who are willing to support me.
I poke a stick at my new jellyfish and discover 'synaptic' is how software packages are installed. "Package" is not defined; I'm assuming it means the binaries (all of them) needed to run an application. Shared libraries are a question mark: does the package for 'frozen bubble' include the graphics libraries as well as the game executable itself? I also see a menu item 'Software Sources'. Not clicked, cuz I'm not interested in source code yet (more on this later). Ultimately, on this first day, I am unable to get anything installed, even from the DVDs that came with the boxen. All network devices seem to be turned off or not installed. The USB thumb drive doesn't work, but Ubuntu says something about it being defective. Maybe so; I only have one to try with. End of day one: not ticked off yet.
Day two: trying to get gcc to work. The various Linux help guides online are good for educational purposes, but useless for setting up the compiler/linker/header files/libraries, etc.
gcc can't find the headers in /usr/include cuz they ain't there (even tho the Dell person I talked to on the phone said they were). Are they in a different place than on Unix systems? Do I need to set environment variables or paths, or use gcc command-line args to get'em? So far as I can find, there ARE NO DOCS OF ANY KIND, online or in the Ubuntu distro, that tell me how to get these headers.
OK, skip it for now. Find Wesnoth.org for me and the scootrpup. Which binaries? The site says it varies; check with your distro's package. I have Ubuntu 7.10, and the Wesnoth download page says Wesnoth version 1.2.6 is in the 'Gutsy (7.10) universe repository'. This is the first time I've heard 7.10 called 'Gutsy Gibbon'; lucky for me, the Wesnoth page equated the two terms. Wesnoth.org links me to ubuntuguide.org, the "Ubuntu Starter Guide".
What's the 'universe repository'? I've seen it mentioned in msgs on my box as I poked around trying to install stuff without blowing everything up. ubuntuguide.org uses 'universe' without ever defining it, and in fact, most places I encounter the term also leave it undefined. BTW, ubuntuguide.org seems to be a single page only. It's not very helpful and soon I leave it behind. I found a link to ubuntugeek.com; searching for 'universe', the top link is 'how to add universe and multiverse...' - still no definition. It says 'edit file X and uncomment everything you need'. THAT'S A HOWTO? This was obviously written by an idiot and proofread by a prawn-brained goob. What's a comment and what's not? How do I know what I 'need'? What are the things in file 'X'? What can I put there that isn't already there? How does this website KNOW that what I need is already in the file but commented out? I think you can see what other kinds of info are missing from this 'how to' guide.

Somewhere on ubuntugeek, I find a link to an Ubuntu wiki which actually defines the term (somewhat) as one of four parts of the "Ubuntu software repository", but again, it leaves more unsaid than it says.
Something of a side note: again and again, I see notes like "This describes how to X in versions Y thru Z with application A. To do the same damn thing in M, versions N thru Q, or with B, go here: *link*". Peepl, get yer s..t together.
Day two will be continued in next journal entry.
Encryption

Journal: Can Alife fix NIST predictable randoms?

Journal by JetScootr
As I understand it, a pseudo-random number generator, starting from any one given state, will generate exactly one sequence of randoms - the same sequence every time that state is used to seed it. Thus, if one knows the initial state for an entire computer system's random number generation, one can calculate every random, in series, that the computer will come up with.
For example, if all new PCs used the exact same initial state on first boot, then all PCs would generate the same series of randoms using the same generator.
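The seed-determines-sequence point can be seen with C's stock generator (a sketch only; rand() is not cryptographic, it just stands in here for whatever generator the OS uses):

```c
#include <stdlib.h>

/* Fill buf with n values from the stock generator after seeding with `seed`.
   One seed yields exactly one sequence, every time. */
void draw(unsigned seed, int *buf, int n) {
    srand(seed);
    for (int i = 0; i < n; i++)
        buf[i] = rand();
}
```

Seed with 42 twice and you get identical streams, which is exactly why an unpredictable initial state matters so much.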
A recent random-number standard produced by the US govt is said to be predictable by anyone holding a second set of state data. I don't know if that's true; I tend to believe the experts more than the US gov.
I don't know if it is even theoretically possible to defeat such a backdoor, but it may be possible to make it less useful.
One idea is to use artificial life to generate seeds. The alife exists in a virtual world with varieties of plants and animals. All evolve and reproduce genetically. Part of each creature's DNA is a seed for use by the OS's random generator.
Combining DNA from two alifes generates a third seed distinct from both parents'. Variable timing, events and circumstances in the virtual world - like eating or encountering other alifes - cause an alife to spawn new seeds.
The virtual world is constantly producing randoms as alifes live, breed, evolve and die. Many thousands are generated between requests from other applications on the same computer.
If the world system is robust enough, the activities (and the seeds) of the alifes become difficult to predict within a very short time.
Of course, this idea could be simplified to just use large sets of data and genetic algorithms to modify them, but what's the fun in that?
The virtual world provides an API for randoms. The API allows an external application to 'check out' one of the alifes. In the client app, all the alife does is generate randoms, and if requested, give the client its most current. When the client has all the randoms it needs, the alife is checked in or discarded.
The seeds provided to the generator become almost as random as an effective generator itself would be, making backdoor-predictions difficult.
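A minimal sketch of the DNA-as-seed idea. Everything here - the names, the genome layout, the use of a splitmix64-style mixer as each creature's private generator - is my own illustration of the conjecture, not an existing API:

```c
#include <stdint.h>

/* A creature whose genome doubles as RNG seed material. */
typedef struct {
    uint64_t dna;    /* fixed at birth; seeds the generator */
    uint64_t state;  /* evolving per-creature generator state */
} alife;

/* splitmix64-style step: the creature's private stream of randoms. */
uint64_t alife_next(alife *a) {
    uint64_t z = (a->state += 0x9e3779b97f4a7c15ULL);
    z = (z ^ (z >> 30)) * 0xbf58476d1ce4e5b9ULL;
    z = (z ^ (z >> 27)) * 0x94d049bb133111ebULL;
    return z ^ (z >> 31);
}

/* Breeding mixes both parents' DNA (plus a mutation term) into a third,
   distinct seed, as described above. */
alife alife_breed(const alife *p, const alife *q, uint64_t mutation) {
    alife child;
    child.dna = (p->dna ^ (q->dna << 1)) + mutation;
    child.state = child.dna;
    return child;
}
```

The 'check out' API would hand a client one of these creatures and let it call alife_next until done; the hard-to-predict part is the world dynamics that decide which creature, with which DNA, gets handed out.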
I don't have sufficient math to prove this would effectively defeat a backdoor in the random generation tool, but I suspect it would be helpful.
Is the backdoor sophisticated enough that this method is pointless?

Any math wizzes out there care to comment on this idea?
User Journal

Journal: I'm...so...*sniff*...honored....

Journal by JetScootr
Recently, someone(s) in the /. crowd moderated some comments of mine on a thread "-1, Troll"... three times. I wasn't trolling, so I take it by this action that I hit a nerve with a telling point.
My comments were to the effect that when prosecuting a teenager for 'hacking' on decades-old security flaws, the company that didn't fix the flaws should ALSO be held responsible. (Sorry for the shout, I wanted to be sure that you understood I wasn't letting the script kiddie off)
Looking back at it, I realize this actually touched on multiple fallacies:
a> the false dichotomy of who's to blame;
b> the strawman that computers are hard to secure, and therefore security problems should be prosecuted on the basis that the intruder is solely to blame; and
c> the notion that companies can be held responsible by lawsuits.
The truth is, phone companies allow phreaking to occur cuz it is easier to prosecute than fix. Phreaking used to be as easy as playing sounds into the mouthpiece of a pay phone. Allowing the flaw to exist for decades shows the company would rather use criminal records to ruin people than use a little more engineering to prevent nuisance losses.
Companies require motivation beyond just loss versus cost to fix. The human cost outside the company's balance sheet requires greater consideration than company profits. Lawsuits don't fix this.
The story of David and Goliath is in the bible because David took such a risk, and winning was (literally) such a long shot. So it is with lawsuits by individuals against big bizniz. It is unreasonable to expect every individual harmed by a company's (in)actions to have David's courage.
I'm putting this here cuz I really don't want any more unreasonable damage to my karma - the troll moderations were unwarranted.
I've never been down-trolled on /. before; I thot I was just being ignored.
It's funny.  Laugh.

Journal: I've been stupid

Journal by JetScootr
Not sure why I'm writing this, but here goes.
When I was ten or so, I had an electric train set. I also had 4 older sisters who were (are) smarter than me. They loved to play tricks on the mugwump (as I was tagged). One sister (unnamed cuz they're all guilty) managed to convince me that:
a> electricity always flows in a circuit, like battery pole to pole, thru the flashlite bulb, etc;
b> The electric cord for the train's transformer had TWO prongs, and required a circuit also (I think you can see where this is going)...
c> Therefore, if I touched only one prong while plugging it in, I wouldn't get shocked.
Just so you know, in case you're ever drawing a comic or something, it feels like "BIZZB BIZZB BIZZZB BIZZB ...", not "ZAPPPPPPP".
On a trip to visit relatives, sis was at her finest. Aunt Marie had gotten a big porch swing, big enuf for all 6 kids. Of course, when grownups weren't around, it was only big enuf for one at a time (or two, in the case of the twins). I got up early in the AM (I was 12 or so), and was swinging away, chewing a big piece of grass like I'd seen my uncle do the day before (I thot it looked cool).
Sis wanted swing. Rules of the house: Whoever gets it first, wins, and others MUST wait without complaint (or suffer Mom's wrath). She loitered for a few minutes, obviously trying to figger out how to evict me from the swing. Finally, she said "If you keep eating that grass, you're gonna turn green."
OK, I was 12ish, but not totally stupid, and said so, she wasn't gonna fool me. She added "If you spit, you'll see you're already turning green on the inside!"
I did, and sure enuf, my spit was green, I was turning greeeen...."MAAWWWWMMMMM!!!!!!" I went running inside, and Sis got the swing.
Programming

Journal: Thuds and curses

Journal by JetScootr
I've recently been writing unit test code at work, for work, based on preliminary experience with thuds. It's going a lot smoother at work than with thuds, and it took me a while to realize why:
I don't control requirements of the code I write at work, so I do almost no "rethinking" ("inline design refactoring") while writing the unit test code for the product at work.
I should note that I'm writing test code for a product that is already 90% complete. Due to the overall complexity of the product, writing test code was simpler than writing unit test plans and scripts and then executing and tracking them. I know the unit test code should be written first in agile development, but I don't control the software development lifecycle process at work.
The unit test design thinking/rethinking process with work code goes like this:
  • The unit test code is hard to write, so is the test that I'm writing too complex or is it the product code I'm trying to test?
  • If the test is too complex, simplify test, rethinking done.
  • else look at the product code: is the product code in too big a chunk? (nearly always, "yes")
    • then refactor product code into smaller chunks
  • else is the product code too complex? (only sometimes "yes")
    • then refactor product code into simpler chunks
  • else am I vague on the requirements of the product code? Surprisingly, since I wrote this code myself, the answer is sometimes "yes".
    • then relook at the assembler that I am rewriting to gain greater insight into requirements, then refactor product design and code accordingly.
  • Refactor/redesign unit test and write code for unit test

With thuds, when I run into a difficulty with the test code, I rethink too much. That is, I rethink these topics:

  • The unit test code is hard to write, (so see above list up to but not including "am I vague on the requirements....")
  • Am I vague on the requirements of the thud code? If so, then
    • rethink the requirements I am coding to
    • CHANGE product requirements
    • and "refactor" (actually, "change") product design and code.
  • Since thud's unit test requirements have changed, redesign and recode the unit tests, including some that are done already.

Now that I realize that I was doing this, I understand somewhat what was happening. I stopped liking the thuds project fairly quickly, because of this and the display problem (see next journal post). I'll change my ways. I will be re-starting the thud project soon, and will begin updating this log of my explorations into XP.

User Journal

Journal: Thuds delays

Journal by JetScootr

Thuds has been delayed, but not discarded. I'll update when I can. Complicated and boring work and home situations are the cause - I don't have as much free time. If anyone cared, sorry, but I will get back to it - probably in the next two weeks or so. Here's where it is:
> I've simplified/sped up process a bit, by conglomming more work into each iteration.
> I've failed to successfully develop the habit of "test first, always", but am still "testing first, mostly". Mainly, it's hard to figure out where to draw the line on what to write tests for. I've come up with a sort of fuzzy definition, that I need to work into a more formal spec. Basically, if it takes longer to write the test code than to bench check for a bug condition, then I skip the test code. Formalizing this will be: Any skipped test condition will be documented in the design spec as a manual test step. Importantly, this is to be used only to skip stuff that is too simple to test for, not to skip complicated tests cuz I just wanna get on with it. The complicated tests are where the money is - starting to write a complex test forces me to rethink the function, and usually, to break it up or simplify it.
> Thuds have a world now, but all they do is stand there in it and grow old and die. Sad.

Programming

Journal: Thuds, new iteration 1 - "I Thud, therefore I am."

Journal by JetScootr

Design
Thuds are a standard linked list, and this will use only the routine methods for calloc, free, adding and removing things from linked lists.
Unit Test Plan
  • Add, get_first, remove a thud. Possible errors:
    • Create doesn't
    • get_first() returns null or invalid. (Invalid will likely crash the program).
    • Destroy doesn't, or it doesn't null the pointer provided.
    • Count of thuds doesn't go to 1 then to 0.
  • Add a bunch of thuds, then remove all of them in an arbitrary order. Possible errors are:
    • All of the above, plus:
    • relink in forward or reverse order fails or disagrees as to the order or content.
    • First and/or last links incorrect.
    • Destroyed thud still accessible.
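Since the author chose not to post the project's code, here is my own minimal sketch of a thud list that the plan above could exercise; names like thud_create and the doubly linked layout are my assumptions, not the real project's:

```c
#include <stdlib.h>

typedef struct thud {
    struct thud *next, *prev;
    int id;
} thud;

typedef struct {
    thud *first, *last;
    int count;
} thud_list;

/* Append a new thud; returns NULL if calloc fails ("Create doesn't"). */
thud *thud_create(thud_list *l, int id) {
    thud *t = calloc(1, sizeof *t);
    if (!t) return NULL;
    t->id = id;
    t->prev = l->last;
    if (l->last) l->last->next = t;
    else l->first = t;
    l->last = t;
    l->count++;
    return t;
}

/* Unlink, free, and null the caller's pointer, per the plan. */
void thud_destroy(thud_list *l, thud **tp) {
    thud *t = *tp;
    if (t->prev) t->prev->next = t->next; else l->first = t->next;
    if (t->next) t->next->prev = t->prev; else l->last = t->prev;
    l->count--;
    free(t);
    *tp = NULL;
}
```

A test for "remove all of them in an arbitrary order" would unlink from the middle first, then check that the forward and reverse chains still agree.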

Review:
Aug/05/2006 - 4 hours. This is more like it. I ran thru a normal (for me) coding session, had the usual number of whoopsies (bugs I'm familiar with that are little more than typos). It took longer to code the tests than the product; I'm sure it's cuz I'm well familiar with linked lists, and less with tests for linked lists that I intend to keep around in working order forever.

Programming

Journal: Thuds Iteration 4 and reset

Journal by JetScootr
I'll save you from reading a buncha lame development journal entries.
I figured out I'm going too slowly, with designs that are too simple. Each attempted iteration so far has resulted in almost the complete deletion of the *extremely* limited code from the previous iteration.
I'm actually an advanced programmer, but I was over-compensating for this by doing my XP "baby steps" in the code instead of in the process.
So, I'm gonna delete almost all but the make file, and start over with a realistic design concept, and each iteration will implement a narrower, but more complete portion.
So, for the next first iteration: Create a thud, using a linked list. Destroy it. Unit test will do its own printing of the Thud's existence.
Programming

Journal: Thuds

Journal by JetScootr
Here's the deal: I love to program, been doing it professionally for many years. I'm up to learn some new stuff: artificial intelligence (ai), specifically, artificial life, genetic programming, etc. I also want to learn extreme programming, and maybe convince my coworkers/company, etc to implement at least some of the practices. I know, as a solo programmer, I can't do most XP stuff; but I will try to do what I can. I'm going to post excerpts of my project notebook here, then ask other developers if this matches their experiences in implementing XP at their workplace.
This is done on my employer's computers, but on my own time, after hours so I don't interfere with any other users by loading down the system. The system is an SGI 12000, 8 cpus, running IRIX 6.5. I'm writing this in C. Yes, my employer's policies do allow me to do this. I won't post the code I produce, not cuz it's that special, but because I don't know how the copyright thing works out. I know the code should be mine, but being on the company's computers could result in misunderstandings, so it won't get posted. Sorry.
'Thuds' is the name of the project, and of the alife I'll be creating. They'll have a 2.1d world to roam around in, based on a rectangular coordinate grid. Each spot on the grid can hold zero or one thud and a variety of zero or more types of plants; a spot will have 'texture', in that ease of travel over spots will vary based on 'dirt', 'water', 'hill', 'rock', etc.
Thuds will be able to sense their surroundings and will have a fitness function that determines life, health, death, etc periodically. Thuds senses will be localized and will not be task specific.
Thuds will have a list of functions they can perform. In the first major version, thuds will not be able to learn, but the goal is to produce a version where the thuds do learn. Thuds will have dna that describes their initial conditions and potentials, and they will reproduce sexually, with some mutation.
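The world described above might look something like this in C; the terrain names come from the journal, while the struct layout and the world_at helper are my own assumptions:

```c
#include <stdlib.h>

struct thud;  /* the creature type, defined elsewhere in the project */

typedef enum { DIRT, WATER, HILL, ROCK } terrain;

/* One grid spot: zero or one thud, zero or more plant types, a texture. */
typedef struct {
    terrain ground;        /* affects ease of travel over the spot */
    unsigned plant_mask;   /* one bit per plant type present */
    struct thud *occupant; /* NULL, or the single thud standing here */
} spot;

typedef struct {
    int w, h;
    spot *grid;            /* w*h spots, row-major */
} world;

/* Make an empty w-by-h world; caller frees wd.grid when done. */
world world_make(int w, int h) {
    world wd = { w, h, calloc((size_t)w * (size_t)h, sizeof(spot)) };
    return wd;
}

spot *world_at(world *wd, int x, int y) {
    return &wd->grid[y * wd->w + x];
}
```

Keeping the occupant pointer in the spot (rather than coordinates in the thud) makes the "zero or one thud per spot" rule trivial to enforce at move time.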
Project Plan:
XP methods will be used, insofar as one person can do them alone. The Values I will try to apply are Communication (with myself, forward and backward in time), Simplicity, Feedback, and Courage. Principles: Traceability, Education, Demonstration, Elegance. Practices: RCS will store code versions; unit tests will be coded first, automated, and will all pass before each iteration is complete.
The code will be kept neat; I will also document all aspects of this project as I develop it. Energy - I will work on code only when my mind is properly attentive and I can focus and enjoy the work. Iterations will be fairly short by XP standards - I'm going for one per programming session. A session for me will be 1-4 hours in length.
For each iteration, I will document: The story, design, unit test plan, review of performance, list of code, tests and data produced.
