Communications

The Future of Optical Fibre 139

An anonymous reader writes "An Australian researcher has come up with a novel way of developing optical fibres. Steven Manos, a researcher at the Optical Fibre Technology Centre in Sydney, Australia, has developed a method of using genetic algorithms to discover optimal designs for optical fibres. An article on his work had this to say: "The problem with designing optical fibres is starting with a specific set of criteria and then coming up with a design to fit this. The computer program developed by Manos, which is run on supercomputers, does this by mimicking the process of evolution. The computer program combines two patterns to create a third fibre 'offspring', which Manos described as "similar but a bit different". This process is repeated thousands of times, with the 10 designs best suited for the particular application chosen to 'breed' again." Another case of "When in doubt, use brute force"?"
  • by haluness ( 219661 ) on Tuesday June 22, 2004 @09:08AM (#9493919)
    I'd rather not think of the method as brute force. OK, it's not like a design from first principles, but it's still a way to search the parameter space without having to test all combinations of parameters.
    • by neilmoore67 ( 682829 ) on Tuesday June 22, 2004 @09:19AM (#9494014)

      I'd rather not think of the method as brute force.

      Well said. Brute force would be enumerating every possible optical fibre and then testing them.

      This method is more subtle and converges to a close-to-optimal solution with less computer power having to be applied.

      • by Anonymous Coward
        Wrong wrong wrong.

        GA is not guaranteed to converge to a "close-to-optimal solution". With results from a GA you do NOT know the solution is optimal. The **hope** is that by wiggling around somewhat in parallel with your genetic inputs you have a better chance of finding a global optimum. That is just a hope.
        • Yes, you're quite right, I probably shouldn't have put it in such broad terms. Although I didn't claim that it would be optimal, or that you would know it was optimal.

          I suppose though that there might be a theoretical upper bound on the performance of a design (think about a spaceship: once you're close to c you're doing demonstrably well), so you at least know how close you are to the very best design

          • That's actually a very accurate description of how many genetic algorithms find their stopping conditions--either run for X generations, or stop when within a certain threshold of the theoretical maximum (or estimated maximum).
            What can be tough about such stopping conditions is making sure you have a good scoring function, so that the optimal solution indeed scores best and you know what that score should be (though, of course, not how to get there, or the algorithm is pointless).
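            (Purely as an illustration of those two stopping rules, here is a rough, self-contained sketch; the fitness function, population size, threshold and "estimated maximum" are all invented for the example and have nothing to do with Manos's actual program:)

            import random

            MAX_GENERATIONS = 200   # hard cap: stop after this many generations
            ESTIMATED_BEST = 1.0    # assumed theoretical/estimated maximum score
            THRESHOLD = 0.99        # or stop when within 1% of that estimate

            def score(x):
                # toy fitness function: highest at x = 0.5
                return 1.0 - abs(x - 0.5)

            population = [random.random() for _ in range(10)]
            for generation in range(MAX_GENERATIONS):
                best = max(population, key=score)
                if score(best) >= THRESHOLD * ESTIMATED_BEST:
                    break           # close enough to the estimated maximum
                # crude "breeding": jittered copies of the current best individual
                population = [max(0.0, min(1.0, best + random.gauss(0, 0.05)))
                              for _ in range(10)]

            print(generation, best, score(best))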
      • But it does place the burden on the test criteria to remain static. If it changes for some reason, I would suspect that you'd have to start with your original set of candidates to avoid having "evolved" into a narrow subset which doesn't contain the theoretically ideal fibre for the new tests.

        In other words, consider the test criteria equivalent to Nature and the various fibers as types of animals. If Nature changes a few times, an animal ideal for the latest natural conditions might have been "breeded" out of existence by a previous change in Nature.
        • "In other words, consider the test criteria equivalent to Nature and the various fibers as types of animals. If Nature changes a few times, an animal ideal for the latest natural conditions might have been "breeded" out of existence by a previous change in Nature."

          That's why GA systems often randomly introduce brand new genomes to a population. It supplies some entropy to "tunnel" out of local minima.
    • OK, it's not like a design from first principles, but it's still a way to search the parameter space without having to test all combinations of parameters

      So-called first principles are explanations, not design tools. In other words - guess what? - nature is still surprising even if it can be "explained" by what we already know. We can explain stuff. It's the construction that we don't understand so well.

    • by N Monkey ( 313423 ) on Tuesday June 22, 2004 @09:23AM (#9494054)
      I'd rather not think of the method as brute force.
      I'll agree with that. Brute force searching would go through all the parameters, a la:

      for (param1 = min1; param1 <= max1; param1 += step1)
        for (param2 = min2; param2 <= max2; param2 += step2)
          for (param3 = min3; param3 <= max3; param3 += step3)
            /* ... and so on for every parameter ... */
            Evaluate(param1, param2, param3, ...);

      Genetic algorithms try to limit the search space by starting with "probably good" sets of parameter values and trying to generate other "probably good but hopefully better" parameter combinations.

      It won't necessarily find the absolute best set of parameters but it might find some reasonable ones.
    • Not so. This method is considerably slower than brute force because it relies on a randomization seed at each iteration.

      Genetic algorithms can only be shown to find optimal paths quickly when the path is already known.

      Genetic algorithms are re-discovery algorithms. They are never well applied to situations with an unknown search space.

    • I think as this method becomes more popular it will displace the older method of finding the most mathematically perfect solution and designing from that.

      In other words, instead of fudging designs while you wait ten years for a mathematician to find the equations for the perfect wing, you just get a computer program to 'evolve' one for you. I'll bet this is what Boeing and Airbus already do.

      Sadly this will leave most applied physics mathematicians out of a job. Damn computers!! Taking our jobs and our women!!
      • Sadly this will leave most applied physics mathematicians out of a job. Damn computers!! Taking our jobs and our women!!

        Computers don't need women, so... First they take the jobs. Then they take the power. Then WE get the women.

        Sounds pretty sweet to me!

      • This comment resonates with me.

        There are many things for which we can create a black-box algorithm that represents the first principles as well as or better than algorithms based on a full understanding of those first principles.

        Example:
        At a division of Honeywell, we worked on "flattening the response curve" of the CRT.

        To do this the engineers developed complicated models based on energy output, gamma curves, crystal variations, etc., and used these models to sense the point-to-point deviations across the CRT and create a
    • I agree. Brute force on its own is inefficient and predominantly useless.
      Even at tasks it can be applied to, like key cracking, it is still practically useless without a bit of intelligence built in.
      Even the best prime generators don't do a brute iteration through all integers > 0 to infinity; that would be pointless. You need to know where to look as well, or you're wasting your time.
      I hope as computers continue to advance we don't forget this and simply rely on computing power. Because no matter ho
    • The problem with GA (and real life genetics) is that they get stuck in "saddle points" where further deviation seems to "get worse" in either direction. This can be mitigated somewhat by an intelligent choice of starting point, but you can never trust that the solution is even close to optimal. Even worse, GA with a human-selected starting point pretty much rules out finding a really novel, counter-intuitive solution.

      A true brute force, exhaustive search of all possible parameter values may take longer, bu

  • Does his, for lack of better words, "breading" of fiber networks (did I understand this right?) take into account some immovable obstacles?

    Or is this process used to design the cable itself?
    • Re:Question? (Score:2, Informative)

      by PingKing ( 758573 )
      It is specifically referring to the fabrication of fibre itself.

      Optical Fibre Technology Centre:
      http://www.oftc.usyd.edu.au/?section=fibre [usyd.edu.au]
    • Can I get breaded fiber at Long John Silvers now, or is that breading only available at select locations? I wonder what kind of batter they'd use...

      Sorry to not answer the question (it's designing the cable *fibers* themselves - hence "design of optical fibers" in the article description and the linked article all about optical fibers) - but the incorrectly spelled "breeding" makes me laugh. It's even more amusing when you *read the article* and see the word "breeding" used in the caption under the large ph
  • But he's already started off with intelligent design!
    • A few months back, there was a Slashdot post about an evolved circuit board designed to perform some algorithm in the best possible and most efficient way.

      One problem, though. The circuit board only worked *exactly* where the board was, because it actually made part of one of the wires behave like an antenna and it relied on the specific electromagnetic field in that location to run the circuit. As a result, moving the circuit board made it no longer function.

      I wonder whether the same result m
  • PDFs from Manos (Score:5, Informative)

    by antic ( 29198 ) on Tuesday June 22, 2004 @09:14AM (#9493970)

    There are some interesting PDFs of papers co-written by Steven Manos available including these two:

    I'm not going to pretend that I know exactly what's going on, but the first of those two is worth looking at if you have even a passing interest. The second looks to be a little more towards the "deep end".

  • Much Better (Score:5, Informative)

    by irokie ( 697424 ) on Tuesday June 22, 2004 @09:15AM (#9493987) Homepage
    This is a much better example of the application of Genetic Algorithms [wikipedia.org] than the story that was on slashdot the other day (can't find a link, the one about Formula One racing).

    In this case they have a very specific set of criteria.

    It didn't, however, mention in the article how they're testing the designs (did it?)...
    And are they actually manufacturing any of the designs that have come from this yet?
  • by Mateito ( 746185 ) on Tuesday June 22, 2004 @09:16AM (#9493995) Homepage
    .. would breeding be regarded as "brute forcing"? :)
  • ...wireless!! ;)
  • What the... (Score:2, Insightful)

    by eddy ( 18759 )

    Another case of "When in doubt, use brute force"?

    Evolutionary search isn't "brute force", you id... At least not for meaningful definitions of 'brute force'

    Brute force would be starting at one end of design space and evaluating each design in turn.

    • Oh, evolutionary selection relies very heavily on brute force - just ask any species that's fallen behind.
      Oh wait, you can't. All the other species have brute forced them out of the environment...
  • by jdrugo ( 449803 ) on Tuesday June 22, 2004 @09:18AM (#9494004)
    ..as they don't search the state space exhaustively. Going through all possible combinations of parameters would be brute force, but in this case, as the parameters are real-valued, that is not even possible (ignoring the possibility of quantisation).

    Evolutionary Algorithms provide informed search, as they perform competition among the individuals (each representing one possible solution) in the population. Their performance is way above exhaustive search techniques (which _are_ brute force) but below classical search techniques. In this case, however, such classical techniques cannot be applied, as the problem space is not well-defined.
  • No, Taco, No (Score:5, Insightful)

    by neoshroom ( 324937 ) on Tuesday June 22, 2004 @09:18AM (#9494006)
    Another case of "When in doubt, use brute force"?

    No, Taco, No.

    From the 'brute force' [wikipedia.org] entry in Wikipedia:

    In computer science, Brute Force, sometimes called the Naive Method, is a term used to refer to the simplest, most intuitive, most spontaneous, and usually most inefficient methods of accomplishing a task.

    This is exactly what a genetic algorithm is not. If you have a million numbers, brute force would be to go from the first to the last in order. Using a genetic algorithm provides a shortcut through design space wherein you need to try far fewer combinations in order to come to a successful result.

    C'mon Taco, of all people, you should know this!
  • by Timesprout ( 579035 ) on Tuesday June 22, 2004 @09:19AM (#9494012)
    With all the weirdo animals the Australian continent has produced, I guess this program will produce some highly interesting results. I can't wait for the announcement that a pattern resembling a Duck-Billed Platypus is ideal for streaming digital TV.
  • Certainly not.

    Genetic algorithms are simply another form of optimization algorithm, just like simulated annealing or ant-colony optimization, to name a few. Each variety has its strengths and weaknesses for different search spaces, and genetic algorithms have their place. These often have nature-related names because nature is an excellent optimizer from which we draw inspiration.

    If you want to talk brute force, try an exhaustive search of a complex, high-dimensional, continuous, real-valued parameter sp

  • Repeat after me: There is no general solution to the global optimization problem.
  • Here is the research paper [usyd.edu.au] published by Manos on the topic.
  • The computer program combines two patterns to create a third fibre 'offspring'

    We must kill it before it develops language skills!!

    • I imagine: what if these 'offspring' started growing for real?
      It's just a matter of time and technology. So a device will generate fiber optic cables and they'll spread all over the world, plugging into every device they meet on their way .. :D .. yeah, right...
      [sounds like a scenario for some crappy movie or something]
  • I am... Torgo..; I .. polish the. .. fiber while the Master... is away... There is no way... out of.. here, the fiber will.. go dark ... soon, there is no way... out of .. here... etc, etc.
  • Seems yet again there is a better way to deliver data over fiber, which doesn't surprise me. Does anyone remember when Tyco Corp. (http://www.tyco.com/ - the same guys responsible for a nasty embezzlement scandal) used to string fiber across the oceans like mad? And then someone figured out that you could send 100x the amount of data across the same cables...
  • Brute force? No way? (Score:5, Informative)

    by carldot67 ( 678632 ) on Tuesday June 22, 2004 @09:29AM (#9494098)
    Genetic algorithms are computational shortcuts that are used to very quickly find minima in complex multiparameter functions.

    Suppose you wanted to find the lowest value of f(x)=sin(x) where x is from 0 to 360 degrees. (OK, we all know it's at x=270, but hear me out.) You can do it a couple of ways:

    1. calculate sin(x) for all 360 possible values of "x" or
    2. calculate sin(x) for (say) 20 values of "x".

    Statistics says approach 2 will give you a couple of promising results, for only 1/18th of the effort. Now "breed" another 20 from the 6 values of x for which sin(x) was lowest, say 190, 210, 212, 260, 278, 290. This "next generation" gives sin(x) values which are closer to -1. Take the best 6 again. After three generations you are *close* to finding the value of "x" that gives you sin(x)=-1. (A rough code sketch of this procedure follows at the end of this comment.)

    So systematic examination takes 360 tries and the genetic shortcut takes 60 tries - about 17% of the computational effort.

    Now imagine a function a bit more complex; some mad multivariate affair like the wave equation. Each variable becomes a "gene" in the above "breeding program". All the time we are looking for parents and offspring that *tend* towards the answer we are looking for. (We also chuck in some unrelated parents too, since inbreeding can be bad - a tip stolen from Monte Carlo techniques [which see]).
    The computational savings from GA, GP and MC techniques are potentially huge (as in orders of magnitude) so long as you don't care that:

    a) The answer is not 100% exact
    b) Some alternative minima are missed
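    (The sketch promised above, purely illustrative: the population size, the "keep the best 6" rule and the mutation width are taken from or invented for this toy example, and none of it is Manos's actual program:)

      import math
      import random

      POP_SIZE = 20       # 20 values of x per generation, as in the example above
      KEEP = 6            # keep the 6 values with the lowest sin(x)
      GENERATIONS = 3

      def fitness(x):
          return math.sin(math.radians(x))   # we want this as low as possible

      # generation 0: 20 random values of x in [0, 360)
      population = [random.uniform(0.0, 360.0) for _ in range(POP_SIZE)]

      for _ in range(GENERATIONS):
          parents = sorted(population, key=fitness)[:KEEP]
          # "breed" a new generation: average two random parents plus a little noise
          population = []
          for _ in range(POP_SIZE):
              a, b = random.sample(parents, 2)
              child = (a + b) / 2.0 + random.gauss(0.0, 10.0)
              population.append(child % 360.0)

      best = min(population, key=fitness)
      print("best x ~ %.1f, sin(x) ~ %.3f" % (best, fitness(best)))   # roughly 270 and -1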

    • The only problem, just as the parent said, is that

      b) Some alternative minima are missed

      A friend of mine got a job working on genetic algorithms at an academic institute. Being an engineer, he asked to be moved to another position three months later.

      His explanation was very short: in GA you look for problems and try to prove that they could be solved by this method.

      Actually, all methods used in engineering were invented to solve some problem, not vice versa. OK, maybe not all of them, I canno
    • Your comment was the best summary of the technique, Carldot67. This is off-topic, but please see my previous Slashdot post [slashdot.org], because someone should do this.
      • The approach described by Darth_Cider has technical merit. There already exist screening technologies based on arrays of proteins, ligands, antibodies, biomimetic polymers and microfluidics platforms and it is easy to see how this might be extended into the field. Production of the raw materials on a grand scale is even quite cheap.

        The problems arise with tooling (complicated), instrumentation (expensive) and reagents (twitchy) needed to cleanly detect signals. Oh, and a degree in biochemistry for the user
        • Very well said, Carldot67. Thank you for the laser beam exposition of the issues. I read just today in a popular science article that 99% of Earth's micro-organisms have not been catalogued. You really get this idea, and I'm thankful for the resonance.

          Finding a single useful organism could have tremendous impact. Consider the cost of a lottery ticket and the odds of payoff. A microarray would look like a lottery ticket in size and shape and ought to cost slightly less than a dollar to manufacture. QED.

          It
    • ...so long as you don't care that... b) Some alternative minima are missed

      The problem of local minima is often significant. A good analogy is real genetics: each species has evolved into a "local minimum" for likelihood of extinction. If the wings on a given type of butterfly become slightly larger or smaller there will typically be a survivability penalty of some kind, and wing size has stabilized at the optimum for that species. But look at the difference in possible local minima: in one case it res

      • ortholattice of course has a point, although I confess I got a bit lost with the worked example. GA and its ilk are there to produce reasonable answers to otherwise computationally intractable problems. The approach is merely dangerous if the results are treated as gospel and not subjected to the same intellectual rigour as any other testable hypothesis.
  • What is the purpose of this? Specialized interconnects? As far as I know, there is already massive overcapacity for telecom fiber networks.
    • While I usually subscribe to the "more is better" school of thought, I'm also wondering what they're optimizing for. It seems like single-mode fiber should be good enough for just about anything - it has a theoretical capacity of many terabits per second, with a usable range of about 60 miles. The price of cable is usually dominated by the protective covering they put around it, which is in turn dominated by the price of the backhoe to install it, so I don't think cost is much of an issue.

      The article v

  • by G4from128k ( 686170 ) on Tuesday June 22, 2004 @09:36AM (#9494148)
    For exploring real-valued phase spaces, one solution is to combine a GA with a classical hill-climber. A hill-climber evaluates the local gradient (the partial derivatives of fitness with respect to the independent variables) and then makes a directed adjustment of the solution in the direction of better performance. Hill-climbers can reach optima in floating-point spaces very quickly, but tend to get stuck on local solutions.

    GAs are great for jumping out of local optima to find new realms of the solution space, but don't converge as quickly on the optimum within a neighborhood. So the combination of a GA with more classical optimization can work well; a rough sketch of the idea follows.
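    (This sketch is only an illustration of that hybrid strategy, under invented assumptions: a toy one-dimensional fitness function, a finite-difference gradient, and hand-picked step sizes. It is not any particular optical-fibre design tool:)

      import math
      import random

      def fitness(x):
          # toy multimodal function: bumps from the sine term plus a Gaussian bump centred at x = 2
          return math.sin(3.0 * x) + math.exp(-(x - 2.0) ** 2)

      def hill_climb(x, steps=50, h=1e-4, lr=0.05):
          # crude gradient ascent using a finite-difference derivative
          for _ in range(steps):
              grad = (fitness(x + h) - fitness(x - h)) / (2.0 * h)
              x += lr * grad
          return x

      # GA part: jump around the space to find promising regions
      population = [random.uniform(-5.0, 5.0) for _ in range(20)]
      for _ in range(30):
          parents = sorted(population, key=fitness, reverse=True)[:5]
          population = [random.choice(parents) + random.gauss(0.0, 0.5) for _ in range(20)]

      # hill-climber part: polish the best GA candidate within its neighborhood
      best = max(population, key=fitness)
      polished = hill_climb(best)
      print(best, fitness(best), "->", polished, fitness(polished))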
    • I would argue that though GAs are better than simple hill climbing, they are far from being "great" at getting out of local optima. In fact, a lot of the theory surrounding GAs has to do with how to avoid exactly that, since the basic GA of some mutation combined with splicing for reproduction tends to get stuck extremely easily, being limited almost entirely to the values represented in the original population. Evolutionary algorithms might be a better choice to pair with hill climbers. Evolutionary algorithm
  • I would not be surprised if this is the way our own brain works when figuring out problems. Raw ideas form and mutate and are tested - sometimes we are conscious of it, many times not. It's the way species evolve - it only makes sense that the same logic is built into our own brain.

    Of course, I am also a big proponent of the idea that evolution gave humans the greatest gift of all: the ability to evolve ourselves.

  • by kjba ( 679108 ) on Tuesday June 22, 2004 @09:54AM (#9494327)
    No other algorithm can come up with a design for optical fibres that are cheap to make and transmit data at a high rate, Manos said.

    How can anyone make a claim like this? Just the fact that one can't think of any other algorithm doesn't mean no such algorithm exists. For many problems that can be solved by genetic algorithms, other (problem-specific) algorithms exist (or may exist) that are way more efficient. The nice thing about genetic algorithms is that they are a standard tool that often works, not that they are an exceptionally smart way of doing things.

    • by geeber ( 520231 ) on Tuesday June 22, 2004 @10:10AM (#9494469)
      Well, actually lots of algorithms exist for designing optical fiber, and they do it efficiently and very accurately. I use a number of in-house proprietary programs for designing optical fibers all the time, and I can tell you we don't waste time messing around with GAs.

      So why don't you hear a great deal about such algorithms? Well, for one, they don't have cool names like "Genetic Algorithms". Also, they are highly prized and considered extremely valuable intellectual property by the companies that actually make optical fiber. We are not going to publicise all the details of the most fundamental design tools of our business.

      GAs are not the future of optical fiber. They are, however, excellent for generating academic papers, which in turn are highly useful for getting tenure.
  • by szquirrel ( 140575 ) on Tuesday June 22, 2004 @09:55AM (#9494345) Homepage
    This process is repeated thousands of times with the 10 designs best suited for the particular application chosen to 'breed' again." Another case of "When in doubt, use brute force"?

    More like another case of computer science being fascinated by meat.

    Remember when neural networks were the next big thing? Everyone was applying them to everything, whether or not it made sense to solve the problem that way. It's neural! Just like our brains! Our brains are smart, they will make our computers smart!

    I'm sure genetic algorithms will eke out a useful place in the computer science toolkit; I just doubt it will be as broad as the current fashion of applying them to everything from optical fiber to race cars [slashdot.org] to compilers [slashdot.org].
    • Actually, Genetic Algorithms (or at least evolutionary computing) and Neural Networks make a really powerful combination.
    • ..so maybe plain ol' meat isn't so bad.

      While you're demonstrating ignorance, there is a lot of very promising work going into applications of neural networks to control systems and the broader field of AI in general. The problem with neural networks is that you need large numbers of processors to do some of the more complicated nets in anything approaching real time. Your brain has several billion little processors massively interconnected.

      Up until very recently with the advent of large scale FPGAs, this
  • Structured Wild Ass Guessing.
    At least they are using a computer to do it.
  • by Anonymous Coward
    Manos... the hands of bitrate.
  • by shaka999 ( 335100 ) on Tuesday June 22, 2004 @10:30AM (#9494659)
    I've written my own GA for doing circuit optimizations and it works very well. The thing I love most is that they are so simple to write. There are things you can do to speed up convergence, but the basic algorithm is very straightforward.

    The difficult thing is how to score individual trials. I don't know how many times I've checked things after an overnight run and found that my results aren't what I expected. Pretty much every time this comes down to how I've scored a trial. Just remember: you get what you ask for.

    For a circuit example, suppose I ask for a certain power consumption and speed, but I overstate the speed goal. Because I'm so far off the speed goal, the power will largely be ignored. There are easy ways to tweak this, but the point is, again, you get what you ask for.
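    (A toy illustration of that scoring pitfall and one common tweak; the targets, weights and normalisation below are invented for the sketch, not taken from the poster's tool:)

      # Hypothetical numbers for one circuit trial.
      SPEED_TARGET = 1.0e9     # want at least 1 GHz (the deliberately overstated goal)
      POWER_TARGET = 1.0e-3    # want at most 1 mW

      def naive_score(speed, power):
          # raw penalty: the huge speed shortfall (in Hz) swamps the tiny power excess (in W)
          return max(0.0, SPEED_TARGET - speed) + max(0.0, power - POWER_TARGET)

      def normalised_score(speed, power, w_speed=0.5, w_power=0.5):
          # scale each penalty by its target so both goals contribute on comparable terms
          return (w_speed * max(0.0, SPEED_TARGET - speed) / SPEED_TARGET
                  + w_power * max(0.0, power - POWER_TARGET) / POWER_TARGET)

      trial_speed, trial_power = 4.0e8, 2.0e-3   # 400 MHz, 2 mW
      print(naive_score(trial_speed, trial_power))       # ~6e8: the power term is invisible
      print(normalised_score(trial_speed, trial_power))  # 0.8: both goals matter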
  • on the next generation of fiber so I can upgrade my Christmas tree.

  • Steven Manos... I guess the "fate" of fibre is in his "hands." ;)

  • There are a few posts here to the effect that "optimization is limited completely by the original machines." When I was doing GAs, we would select the top few performers and:

    1. "Breed" them with each other.
    2. "Breed" them with totally random data.

    No matter how well you select your original machines, there's practically always room for improvement (otherwise, why use a GA in the first place?). Unless you are REALLY good at selecting your first few machines, the random data really is powerful. Case
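    (A rough sketch of that selection scheme, with everything invented for illustration: a bit-string "machine" genome, a ones-max fitness function, and a 50/50 split between breeding with another top performer and breeding with random data:)

      import random

      GENOME_LEN = 16
      POP_SIZE = 30
      TOP_N = 4

      def random_genome():
          return [random.randint(0, 1) for _ in range(GENOME_LEN)]

      def fitness(genome):
          # toy objective: count of 1-bits ("ones-max")
          return sum(genome)

      def crossover(a, b):
          # single-point crossover between two parent genomes
          cut = random.randrange(1, GENOME_LEN)
          return a[:cut] + b[cut:]

      population = [random_genome() for _ in range(POP_SIZE)]
      for _ in range(50):
          top = sorted(population, key=fitness, reverse=True)[:TOP_N]
          children = []
          for _ in range(POP_SIZE):
              parent = random.choice(top)
              if random.random() < 0.5:
                  mate = random.choice(top)    # 1. "breed" top performers with each other
              else:
                  mate = random_genome()       # 2. "breed" them with totally random data
              children.append(crossover(parent, mate))
          population = children

      print(max(fitness(g) for g in population), "out of a possible", GENOME_LEN)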
  • The computer program combines two patterns to create a third fibre 'offspring', which Manos described as "similar but a bit different"

    ...as compared to the "similar but identical" results normally achieved?

  • Another case of "When in doubt, use brute force"?

    No. Brute force would be making a list of all possible designs, removing the ones which did not fit the requirements, and sorting by price. This method explores only a small subset of all possible designs - while it won't find the theoretical best possible design, it'll find one good enough, and it'll do it in a timespan shorter than the age of the universe.
  • I like mine round, long, thin, and to the curb.
