F-22 Avionics Require Inflight Reboot 587

Posted by chrisd
from the ctrl-alt-boom dept.
An anonymous reader writes "The Atlanta Journal & Constitution is fronting a lengthy piece on the USAF's new F-22 and its upcoming shootout with the existing fleet of F-15s and F-16s. One line in the article really jumped out at me: 'When avionics problems crop up now, pilots must restart the entire system as if rebooting a personal computer.' I did some googling, and this is about as much as I could find: The hardware backbone for the system is the Hughes Common Integrated Processor, which, in turn, appears to be built around the Intel i960 CPU. I couldn't find a name for the operating system, but it appears to be written in about one and a half million lines of Ada code; more on the Ada hardware integration and Ada i960 compilers is here. Any Slashdotters working on this project? If so, why do you need the inflight reboot? PS: Gamers will be interested to learn that nVidia's Quadro2 Go GPU and Wind River's VxWorks operating system are melded in the F-22's Multi-Function Display."
This discussion has been archived. No new comments can be posted.

  • by putaro (235078) on Monday July 22, 2002 @03:14AM (#3928805) Journal
    Long ago a friend of mine was working on an add-in, computer-driven compass for the F-16 for a big defense contractor. She called me up looking for graphics algorithms (she was the junior engineer on the project). She was fighting with her boss, who wanted to install an FPU to speed up their circle-drawing routine (this drew the compass rose onto the screen), while she thought they could speed it up by switching algorithms. Why did her boss want an FPU? Well, because software sine and cosine routines were too slow. (BTW, the circle was always the same size and just the tick marks actually moved.)
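The algorithm switch she likely had in mind is the classic midpoint (Bresenham) circle algorithm, which draws a fixed-size circle with integer additions only, so software sin/cos and the FPU are both unnecessary. A minimal sketch (the class and method names are mine, not from the project):

```java
import java.util.ArrayList;
import java.util.List;

public class CompassRose {
    // Midpoint circle algorithm: plots a circle of radius r centered at the
    // origin using only integer adds and compares -- no trigonometry at all.
    public static List<int[]> circle(int r) {
        List<int[]> pts = new ArrayList<>();
        int x = r, y = 0, err = 1 - r; // err tracks the midpoint decision
        while (x >= y) {
            // each computed point in one octant mirrors into the other seven
            int[][] oct = {{x, y}, {y, x}, {-y, x}, {-x, y},
                           {-x, -y}, {-y, -x}, {y, -x}, {x, -y}};
            for (int[] p : oct) pts.add(p);
            y++;
            if (err < 0) err += 2 * y + 1;        // midpoint inside circle
            else { x--; err += 2 * (y - x) + 1; } // midpoint outside: step in
        }
        return pts;
    }

    public static void main(String[] args) {
        System.out.println(circle(8).size() + " points plotted");
    }
}
```

Since the rose never changes size, the point list could even be computed once and cached, with only the tick marks redrawn per frame.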
  • F-22 "avionics" (Score:4, Interesting)

    by sluggie (85265) on Monday July 22, 2002 @03:21AM (#3928832)
    Sorry, but if you have to reboot the ENTIRE avionics system of an F-22, you're fucked, to put it mildly.

    This plane is always in a controlled stall; the rudder movements that keep it from crashing are recalculated constantly while this bird flies. The pilot just decides which direction the plane goes, but the task of keeping it up is left to the CPU.

    So if you just "rebooted" this sucker for a second, the plane would plummet like a stone, no matter how hard the engine pushes it forward or what the pilot does.

    What I can imagine is that the pilot has to restart some non-vital components of the main computer, such as the timing of the green/red flashing lights or his seat heating. ;)
    Even restarting the RADAR/targeting unit would be OK, BUT DO NOT SWITCH OFF THE AVIONICS ON THIS BIRDY!

  • by Deton8 (522248) on Monday July 22, 2002 @03:22AM (#3928837)
    In 1997 the Mars Pathfinder probe had a problem with VxWorks and priority inversion. Perhaps the F-22 is having something similar -- whenever you have an RTOS, the designer must try to anticipate when it's safe to block real-time interrupts and when it isn't. I don't know anything about the F-22, but it's easy to imagine that it has hundreds of input sources with all sorts of latency requirements. AFAIK, it all comes down to some humans trying to balance these conflicting needs. Clearly they don't always get it right.
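The Pathfinder failure mode is easy to show with a toy, deterministic model (all names are mine): a high-priority task H blocks on a lock held by low-priority L, while medium-priority M, which needs no lock, keeps preempting L. H effectively runs at L's priority until M is done.

```java
import java.util.ArrayList;
import java.util.List;

public class PriorityInversionSim {
    // A tiny hand-rolled scheduler trace, not real threads: the scheduler
    // always runs the highest-priority task that is runnable.
    static List<String> run() {
        List<String> trace = new ArrayList<>();
        boolean lockHeldByL = true; // L grabbed the shared lock first
        int lWork = 2, mWork = 3;   // time slices each task still needs
        boolean hDone = false;
        while (!hDone) {
            if (!lockHeldByL) {            // lock free: H (highest) finally runs
                trace.add("H"); hDone = true;
            } else if (mWork > 0) {        // M outranks L, so M preempts L
                trace.add("M"); mWork--;
            } else {                       // only now can L finish and unlock
                trace.add("L"); lWork--;
                if (lWork == 0) lockHeldByL = false;
            }
        }
        return trace;
    }

    public static void main(String[] args) {
        // H runs dead last even though it outranks everything
        System.out.println(run());
    }
}
```

With priority inheritance (the fix JPL enabled on Pathfinder's mutex), L would temporarily run at H's priority while holding the lock, so L and H would finish before M ever ran.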
  • Redundant (Score:4, Interesting)

    by Perdo (151843) on Monday July 22, 2002 @03:27AM (#3928850) Homepage Journal
    The flight control computers are 7x redundant and distributed throughout the airframe. It's the new radar and v3.0 combat avionics that need "rebooting."
  • by Louis Savain (65843) on Monday July 22, 2002 @03:45AM (#3928895) Homepage
    Why do you need the inflight reboot?

    Because that is the nature of complex algorithmic systems. An algorithmic system is temporally inconsistent and unstable by nature. Using the algorithm as the basis of software construction is an ancient practice pioneered by Lady Ada Lovelace and Charles Babbage. It is the fundamental reason why dependable software systems are so hard to produce.

    There is something rotten at the core of software engineering. Software functionality should not be fundamentally different from hardware functionality. Software should emulate hardware and serve as an extension to it. It should only provide the two things that are lacking in hardware: flexibility and ease of modification. The only way to solve the reliability crisis is to abandon the bad practice of using algorithms as the basis of software construction and to adopt a pure signal-based paradigm. More details can be found at the links below:

    Project COSA []
  • i960 in PC's (Score:2, Interesting)

    by Ratso Baggins (516757) on Monday July 22, 2002 @03:46AM (#3928897) Homepage
    More than 10 years ago I first saw an i960 dev board, and I thought, "YUM! I can't wait for PCs to use them..." But they haven't. Anyone have any valid conjecture as to why?
  • by Anonvmous Coward (589068) on Monday July 22, 2002 @03:57AM (#3928921)
    "Now this is more of an MS bash... people have come to expect system failures, and I've read admissions that 5-9's uptime is just too difficult and expensive a goal, and so-on, and of course this mostly points to MS desktop and server software"

    That's an interesting read. My company chose Windows 2000 for stability on desktop machines, and we're doing fine: 19 desktops and laptops, all running 2K. My job is to maintain them, and I find way too much time to post on Slashdot. ;)

    We've also got an NT4 webserver running IIS, and it's been up for 3 months. It would have been up longer except I had to shut the box down to move it.

    I'll tell you something: it was a huge relief to go to 2000 from 98. Nobody bugs me about anything anymore. We have computers running all weekend processing video data, and we haven't had an "over the weekend" crash. We'll have 4 video files going at once, two per processor, and they'll all be done by Monday. As you can see, we beat our machines pretty hard sometimes.

    *Thought it'd be nice for you to hear from somebody who's had good experiences with MS for a change.*

    I've drifted off topic a bit. Sorry. The point I'm basically making is that Windows 2000 is a fine OS and would probably be up to the job, at least run-time wise. I know that comment's going to draw criticism, but oh well. I've worked around a ton of these machines for the last two years and you're not going to change my mind about it. Heck, I have a computer in my bedroom right now capturing TV shows as a home-brew Tivo. Hasn't been rebooted in over a month. Not bad given how buggy the TV drivers are. Heh.
  • Re:Ada ? (Score:5, Interesting)

    by Kysh (16242) on Monday July 22, 2002 @06:16AM (#3929165) Homepage Journal
    > This means the developers were forced to use Ada, but why? To me, it seems some suits think it's especially "safe" for some reason; does anyone know more about that?

    Ada is especially safe. It is, in fact, one of the VERY few safety-critical environments you will find. It's very simple: a safety-critical program must never exit and give up control functionality entirely, no matter what happens. There are many things you can do with C/C++/Java that will cause a crash unrecoverable by the system.

    Ada is designed to inherently prevent a programmer who follows the appropriate standards from writing a program that can just crash and exit. As long as every possible exception has a handler, an Ada program can be written that will not crash.

    > But I think you can try to make a programming language as "safe" as you want, it won't prevent you from implementing bugs, it just causes a false sense of safety instead which can be even more dangerous, IMHO.

    Bugs are universal. But bugs in a C program can cause the controlling system to shut it down with prejudice (sig 11 and others), and C doesn't offer the automatic safety nets Ada does. Can you write safety-critical software in C/C++/Java? Certainly. It's all a matter of methodology. Ada enforces the methodology, which is why people hate it: they can't do cute, horrible hacks like they can in C/C++, and Ada requires explicit specification. Ada has specific standards of implementation for software and a good inherent design. It is designed, from the ground up, as a "safety critical" language, and for the most part succeeds on its own merit.

    I do understand the widespread animosity toward Ada. People don't like the verbose, very specific code. Programmers often want to bend the language over their knees and perform horrid hacks that make reasonable people blanch in fear, but Ada doesn't really allow that. Programmers are often forced to learn Ada in structured courses and forced to read the Ada RM. They end up hating it because of the language and terminology used, because of the verbosity of the language, because of some of Ada's difficult concepts, etc.

    But it really is a fine language. (I'm sure many people will disagree with me without really having an objective or informed viewpoint, but that's just how it goes.)

  • Re:Ada ? (Score:4, Interesting)

    by Jamie Zawinski (775) on Monday July 22, 2002 @08:03AM (#3929334) Homepage

    Ada is especially safe. It is, in fact, one of the VERY few safety critical environments you will find. It's very simple: a safety critical program must never exit and give up control functionality entirely, no matter what happens. There are many things that you can do with C/C++/Java that will cause a crash unrecoverable by the system.

    Ada is designed to inherently prevent a programmer who follows the appropriate standards from writing a program that can just crash and exit. As long as every possible exception has a handler, an Ada program can be written that will not crash.

    In what way is Ada better than Java in this respect? I only know a little about Ada, so this is a serious question. My understanding is that Ada and Java have very similar safety goals (especially with respect to exceptions) so I'm curious about what you think Ada gets right and Java gets wrong.

    It should be the case that the only way for a Java program to "crash" is if there is a bug in the runtime library or hardware interface: the same kinds of problems can of course affect Ada.

    (I've got a lot of problems [] with Java, mind you, but I'd never say it was "too lenient"...)

  • Re:F-22 "avionics" (Score:5, Interesting)

    by Zathrus (232140) on Monday July 22, 2002 @08:20AM (#3929376) Homepage
    I for one don't care for fly-by-wire. Perhaps I'm old fashioned

    Well, sure... except that for modern fighter aircraft that's simply not viable. What the original poster was trying to say is that the F-22 is not inherently stable in flight (the AEs out there will now point out how minutely incorrect that statement is). If the flight control software goes wacky, you will be unable to fly the plane, even if it were good ol' hydraulics and pneumatics.

    The F-22, like a lot of newer jets, has totally integrated flight systems. The ailerons do not work separately from other control surfaces, particularly the directed-thrust system. A human trying to control all of this at once would be overwhelmed and have considerably lower flight capabilities than a fly-by-wire system.

    Another poster described a pilot intentionally doing bad things to the aircraft -- shifting all the fuel to one side, opening the weapon bay doors on that side, etc. -- which threw the jet into cartwheels at 45k feet. Once the pilot released the controls, the jet self-stabilized. That's pretty damn impressive. OK, sure, with fly-by-wire you're pretty well hosed if it doesn't do this, because you don't have a "real" sense of what the plane is doing and how it's reacting.

    Fly-by-wire is becoming standard on large commercial jets too. I suspect it'll be a long time before it's commonplace on your small, private plane though -- especially since I can't imagine a single-engine prop ever being designed to be "inherently unstable" in the air :)

    One of the most impressive things I've seen a Raptor do so far (on Discovery Wings, of course, heh) is fly backwards... the jet is flying straight and level, the pilot pulls the throttle all the way up, and the jet goes into a "controlled stall" and moves backwards (or so it appears visually) for a short distance. Hell if I know whether it's useful in combat -- but it's nifty to the layperson.
  • by shaldannon (752) on Monday July 22, 2002 @08:49AM (#3929479) Homepage
    I'm not sure Ada is small and clean either, and I had 4 years of it at Auburn University, since that was the required language for data structures and algorithms classes. They alleged that Ada was bulletproof as a language... that it never crashed, dumped core, etc., so it made perfect sense for use in avionics. Mind, we were getting government contracts, so it's entirely possible that they were spouting the party line.

    At any rate, my observations are as follows: First, the Ada syntax was based on the Pascal syntax (they state this in the textbooks). Second, it is almost as anal as Java. Third, you may write a program in Ada, but if you use Gnat to generate your code, it's getting translated to C anyway, so theoretically your bulletproof code just developed some vulnerabilities.

    I guess Ada has its uses, but I heard recently that even the DoD has stopped requiring its use.
  • Re:Ada ? (Score:3, Interesting)

    by shaldannon (752) on Monday July 22, 2002 @09:03AM (#3929529) Homepage
    I had the (dis?)pleasure of learning Ada as the required language over 4 years in Auburn University's Computer Science department. While what you say is quite true (from my observation), my two biggest objections to it were verbosity and strong typing. It's really, really annoying to have to convert, say, an int to a float through a function call. I'm not even asking for a Perl-style eval{} here... I just want the ability to declare something as an int with value 3, divide it in half, and reassign the value back so it is now a float 1.5.... I also want the ability to get a newline without having to type 'New_Line;' or 'Ada.Text_IO.New_Line;'. For what it's worth.

    Until that time...I write Perl code all day long in web apps (nobody dies if your web app goes kaput).
  • Re:Why C? (Score:4, Interesting)

    by Erich (151) on Monday July 22, 2002 @10:13AM (#3929887) Homepage Journal
    People act as though C and C++ are the top of the language evolutionary scale--sheer nonsense. C and all of the "C-alikes" have held back decent software engineering now for decades...we get to enjoy buffer and stack overflows, pointers wandering through address space and other idiocies.

    You act as though C is responsible for a stack overflow or pointer pointing problems.

    You wanna know something: IT'S THE PROGRAMMER.

    You can write huge applications in plain ANSI C. They can run flawlessly, as long as you use good programming practices and have good programmers.

    Excepting buggy compilers or libraries (very rare in my experience), when you write something in C and it doesn't work, it's your fault. C is very simple, elegant, and deterministic. For examples of C programs that work very well, see UNIX OS kernels, most of the system tools on UNIX, and especially TeX.

    You can write perfect programs in some "more modern" languages ("safe" languages like Java) that will crash, because the environment is so complex that many environments are buggy. This is unacceptable. Not only that, most of these languages aren't any better than C as far as memory management (That's why all the Java programs I see crash with "NullPointerException").

    These new languages, however, do increase the overhead of a program running, to make things slower. As a computer engineer, I do like that feature, as it means that people will go out to buy more complex hardware.

    There are some programming languages that really do have features that help write very, very stable, unbuggy code. I would say Ada, ML, and LISP fall into these categories. But even in these languages, the language can only do so much. In the end, your program will only be as good as your programmer.

    We STILL live in the dungeon of ancient functional thought. And sadly, attitudes like yours ("why not just use C?") help promote this.

    Actually, we have gotten out of the use of functional languages like LISP and replaced them with procedural languages like C. Which is good! That's what your computer does anyway. Though most functional languages do a very good job of implementing themselves in a procedural system... stacks are pretty simple things.

    But I bet you're one of those OO people. You think that OO is the greatest thing and that if everyone used it to write their programs, the world would be a fantastic place.

    There's a place for OO languages. They do some things well. Some things they do very badly. And in the end, OO languages are still only as good as the programmer. And they have enough problems and complexities that for things like flight control, they aren't always appropriate.

    Let me tell you a little story. There was once a class that was trying to make a robot arm play ping-pong. There was a camera that could see the ball, and then the software computed where the paddle should be, then was supposed to move there so that the ball would return to the other side. The software was written in a "safe" language.

    When they went to test the robot arm, the ball flew straight past it. The arm didn't budge. They looked at each other, wondering what the bug was, until a few seconds later the arm moved to where it should have gone.

    The problem was the environment. After doing the complex computations, the garbage collector decided it needed to clean up all the memory used for the calculations. Once the garbage collector had finished, the arm was allowed to move, but by that time it was too late.

    And to finish off, let me tell you the one thing that bugs me about most languages: THEY DON'T HAVE BUILTIN MACRO PROCESSORS. Macros in C are the most useful thing about the language, in my opinion. Not having them is a horrible travesty.

  • Re:Ada ? (Score:5, Interesting)

    by foobar104 (206452) on Monday July 22, 2002 @10:34AM (#3929998) Journal
    First, read Kysh's comment. It's better than mine.

    But the short answer is that it's possible to compile a Java program that will exit due to an uncaught exception. For many exceptions, Java forces you to have an exception handler, otherwise the code won't compile. But not for all. Runtime exceptions can send your code straight out the window.

    The idea behind Ada-- I've never done much Ada programming myself-- is that it's not supposed to be possible to compile code that can throw an uncaught exception. The compiler is supposed to prevent you from doing such a thing.

    This doesn't mean that Ada code is always perfect, but it does give you a degree of freedom that you don't get with other languages.

    I did some work about four years ago on a flight simulator project for the DoD. The first stage in the project was to build an unclassified demonstration version of the new sim. Some code related to weapons-- in this case, the AIM-120 missile-- is classified, and can't be demonstrated in an unclassified environment. So what did we do? We just didn't link in that code. (I may have my terminology wrong; I was doing HSI, not code, so I'm just going by what my friend on the other side of the hall told me.)

    With any other environment, C or Java or whatever, that would have resulted in a fatal runtime error. But Ada doesn't let you have runtime error situations without exception handlers, so when it encountered the missing chunk of AIM-120 code, the sim just dropped into the exception handler -- which basically said, ``never mind, everything's fine'' -- and kept right on going. The sim dropped a couple of frames every time you fired a missile, but other than that, no problem.

    I've gotta say that I found that pretty cool. I mean, the sim just kept on going, after it found that a huge chunk of important code was simply missing! Neato!
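The "never mind, everything's fine" handler has a direct Java analogue for unchecked exceptions, which is exactly the class the compiler never forces you to handle. A minimal sketch (class and method names are mine):

```java
public class CrashOrCatch {
    // Integer.parseInt throws NumberFormatException, an unchecked
    // RuntimeException: code that calls it compiles fine with no handler,
    // and a bad input would then terminate the program.
    static int safeParse(String s) {
        try {
            return Integer.parseInt(s);
        } catch (RuntimeException e) {
            // the "never mind, everything's fine" pattern: degrade gracefully
            // to a default value and keep running
            return 0;
        }
    }

    public static void main(String[] args) {
        // Without the try/catch, this input would kill the whole process.
        System.out.println("value = " + safeParse("not a number"));
    }
}
```

The difference being argued in this thread is that Java makes this handler optional for RuntimeExceptions, whereas Ada's discipline (and coding standards like a mandatory `when others =>` handler) aims to make a missing handler impossible to ship.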
  • Re:Ada ? (Score:1, Interesting)

    by Anonymous Coward on Monday July 22, 2002 @10:58AM (#3930156)
    One HUGE advantage that ada has over java is that it is mostly hardware-independent (although of course there are always system-dependent packages and stuff) without having the overhead of that silly bytecode interpreter. Most code written for the military in ada is going to be for realtime embedded systems (either hard or soft). If you went the java route, that would mean having to choose a CPU and OS that runs the java interpreter, which would probably be some huge bloated OS that has way more functionality than is desirable in an embedded system and would have very high CPU requirements. In addition to that, the operating system would have to be one that handled real-time priority applications RIGHT. No pausing the system for 3 seconds while writing something to swap. Even THEN, you'd still probably have to write some hardware interface code in C or ada or a similar language for the displays and stuff. Ada, on the other hand, is designed from the bottom up to be used in real-time embedded systems. It compiles to native code. There are ada compilers for just about any processor you care to name. Ada is very type-safe. Ada95 (the most recent iteration of the language) is even somewhat object-oriented. Java is a good language, it just isn't good for systems with hard real-time requirements like aircraft avionics control systems.
  • Re:Ada ? (Score:2, Interesting)

    by arrow (9545) <mike@damm.PERIODcom minus punct> on Monday July 22, 2002 @11:11AM (#3930249) Homepage Journal
    Doesn't Java's license agreement specifically forbid its use in nuclear power plants, dams, weapon control systems, etc.?

    I can't find good linkage to the discussion I remember, but here's something close: the Java2 Plugin license states, "You acknowledge that Software is not designed, licensed or intended for use in the design, construction, operation or maintenance of any nuclear facility."

  • by rebelcool (247749) on Monday July 22, 2002 @11:46AM (#3930500)
    2 words: Garbage collector.

    The garbage collector in Java is an asynchronous type. This means that while it is running its collection procedures (which can begin at any time; there is no way for the programmer to control this), processing of the program code halts.

    I had a professor who demonstrated the problem with a simple example. Suppose you are designing a robot that can climb and descend stairs. It must monitor sensors and adjust the angles of its joints appropriately to go down (quite difficult, really). Now suppose the GC runs halfway through a step: all processing stops, gravity takes over, robot falls down.

    Same goes for avionics systems: if you're landing a plane, you don't want your HUD to suddenly freeze while you're descending at several meters per second. You'll descend straight into the ground.

    Hence the clause in Java's license about no use in safety-critical applications.
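The stop-the-world cost is easy to make visible. The sketch below (names are mine) churns through short-lived allocations, then times an explicit collection request; in a real system the collector runs whenever it likes, and `System.gc()` is only a hint, which is precisely the unpredictability that rules it out between sensor readings:

```java
public class GcPauseSketch {
    // Allocate a burst of garbage, then measure how long one explicit
    // collection request takes. This is a demonstration of the pause,
    // not a benchmark: real pause times depend on heap size and collector.
    static long measurePauseMillis() {
        for (int i = 0; i < 10_000; i++) {
            byte[] junk = new byte[10_000]; // becomes garbage immediately
            junk[0] = 1;                    // keep the allocation live briefly
        }
        long before = System.nanoTime();
        System.gc(); // a hint; most JVMs honor it with a full collection
        return (System.nanoTime() - before) / 1_000_000;
    }

    public static void main(String[] args) {
        System.out.println("gc pause ~" + measurePauseMillis() + " ms");
    }
}
```

A control loop that must react every few milliseconds cannot tolerate a pause of unbounded length landing at an arbitrary point, which is the robot-on-the-stairs problem in one line of arithmetic.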

  • Re:Ada ? (Score:3, Interesting)

    by _xeno_ (155264) on Monday July 22, 2002 @11:47AM (#3930508) Homepage Journal
    It is possible to let an exception cause a Java application to "crash", although it usually exits fairly cleanly -- or, if the exception occurred as the result of an event in a GUI, it just continues on its merry way, with the given function having been aborted.

    The Java compiler forces people to catch any Throwable that does not extend either Error or RuntimeException -- assuming that the given exception is noted in the throws clause of the method it's looking at. However, as far as the Java runtime is concerned, any exception can be ignored. (So if you managed to compile against classes that claimed not to throw a given exception and link at runtime against code that does, an uncaught exception can wind up "crashing" the program.) An ignored exception just propagates up the stack (well, the stack of called methods), until eventually it gets caught by the root exception handler, which simply dumps the stack trace and then destroys the current thread -- in essence causing the application to "crash", although it's really just an uncaught exception.

    To prevent that, just

    try {
        // your code
    } catch (Exception e) {
        // Either fix it, or restart, or something
    }

    Generally speaking, Errors should not be caught, because they're basically signs of the underlying system getting ready to go out the window. (Except for StackOverflowError, which is usually a sign of unchecked recursion...)

    Oh, and you should add something to your list of problems: the completely inconsistent and confusing version numbers that Sun uses.

    Since you do complain about the fact that Sun uses "Java" to mean the language, the virtual machine, and the class library all at once, the Java version number is just plain confusing, since it applies to all three.

    As an example, when Java went from version 1.0 to 1.1, there were several changes to the language (the addition of inner classes), several changes to the API (a new AWT event model), and changes to the JIT technology backing the virtual machine. This pales in comparison to the absolutely stunning Java 2 release.

    See, when Java 1.2 was released, half the documentation called it "Java 2" -- which is understandable, since there were many additions to the default class library: Graphics2D, Collections, and Arrays (which adds the qsort that was missing, BTW; it's java.util.Arrays.sort(java.lang.Object[]), and because Object[] isn't the same as int[] etc., they have special copies of the method for byte[], char[], double[], float[], int[], long[], short[], and of course, Object[]).

    Java 2 - or Java 1.2 - also saw the default JIT be changed to the HotSpot JIT. I think Java 1.3 changed the compiler, as well as adding new classes, and 1.4 changed the language to add an assert feature - involving another change to the compiler...

    Anyway, I still do write Java as my day job, and it's nice to get that off my chest... ahhhhh...
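The overload zoo mentioned above is easy to see in two lines: sorting a primitive array and a boxed array resolve to entirely different `java.util.Arrays.sort` methods, since `int[]` is not an `Object[]`:

```java
import java.util.Arrays;

public class SortOverloads {
    public static void main(String[] args) {
        int[] primitives = {3, 1, 2};   // resolves to Arrays.sort(int[])
        Integer[] boxed = {3, 1, 2};    // resolves to Arrays.sort(Object[])
        Arrays.sort(primitives);
        Arrays.sort(boxed);
        System.out.println(Arrays.toString(primitives) + " "
                + Arrays.toString(boxed));
    }
}
```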

  • by TheStruuus (263229) on Monday July 22, 2002 @12:35PM (#3930894) Homepage
    Not bozos -- it's the government guidelines. For instance, the fuel systems have redundant processor units. When started, both are online, with the slave electronically disconnected. FAA guidelines dictate that "one strike and you're out" is enforced: at the first sign of CPU trouble (crash, freeze, any electronic part failing within the system), all inputs and outputs on the unit are set to high-Z and the other unit takes over. Now, the reboot part: the first unit will sit in a frozen state indefinitely until it is manually reset with a POR or full HR, but the plane will fly just fine on the redundant system. In an emergency the pilot can manually reboot the halted system, and it will either start up again (if the initial failure was some glitch) or immediately halt again if it was a critical failure.
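That one-strike failover policy can be modeled as a tiny state machine. This is only a sketch of the behavior described above; the class, enum, and method names are all mine:

```java
public class FuelComputerPair {
    enum State { ACTIVE, STANDBY, HALTED }

    State master = State.ACTIVE;
    State slave = State.STANDBY; // online but electronically disconnected

    // First sign of trouble on the active unit: its I/O goes high-Z
    // (modeled here as HALTED) and the standby takes over. One strike.
    void fault() {
        if (master == State.ACTIVE) {
            master = State.HALTED;
            slave = State.ACTIVE;
        } else if (slave == State.ACTIVE) {
            slave = State.HALTED;
        }
    }

    // Manual pilot reboot (POR / hard reset) of the halted unit: it comes
    // back as standby. A repeat of a real critical failure (not modeled
    // here) would immediately halt it again via fault().
    void rebootHalted() {
        if (master == State.HALTED) master = State.STANDBY;
        else if (slave == State.HALTED) slave = State.STANDBY;
    }

    public static void main(String[] args) {
        FuelComputerPair p = new FuelComputerPair();
        p.fault(); // CPU trouble on the master
        System.out.println("master=" + p.master + " slave=" + p.slave);
    }
}
```

The key property is that a fault never triggers an automatic restart of the failed unit; recovery requires a deliberate, manual reset, matching the "frozen until manually reset" behavior described.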

"Comparing information and knowledge is like asking whether the fatness of a pig is more or less green than the designated hitter rule." -- David Guaspari