Visualizing Ethernet Speed

anthemaniac writes "In the blink of an eye, you can transfer files from one computer to another using Ethernet. And in the same amount of time, your eye sends signals to the brain. A study finds that images transferred to the brain and files across an Ethernet network take about the same amount of time." From the article: "The researchers calculate that the 100,000 ganglion cells in a guinea pig retina transmit roughly 875,000 bits of information per second. The human retina contains about 10 times more ganglion cells than that of guinea pigs, so it would transmit data at roughly 10 million bits per second, the researchers estimate. This is comparable to an Ethernet connection, which transmits information between computers at speeds of 10 million to 100 million bits per second."
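
A quick sanity check of that arithmetic, using only the figures quoted above (the "roughly 10 million" comes from rounding up 8.75 Mbit/s):

    # Figures quoted from the article (guinea pig retina).
    ganglion_cells = 100_000
    retina_bits_per_sec = 875_000

    # Implied per-cell rate.
    per_cell = retina_bits_per_sec / ganglion_cells   # 8.75 bits/s per cell

    # Human retina: ~10x the ganglion cells, so ~10x the aggregate rate.
    human_bits_per_sec = retina_bits_per_sec * 10     # 8,750,000 bits/s, ~10 Mbit/s
    print(per_cell, human_bits_per_sec)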
This discussion has been archived. No new comments can be posted.

  • by Siguy ( 634325 ) on Friday July 28, 2006 @09:31PM (#15803385)
    ...I can use my guinea pig as a router?
    • by evanbd ( 210358 ) on Friday July 28, 2006 @09:39PM (#15803417)
      Well, apparently you have to plug the cables into its eye sockets, and cramming more than one into each is probably hard. So more of a bridge than a router, I think...
      • Actually, if you wrap some duct tape around it you can jam a lot of stuff into guinea pigs...
      • The comparison doesn't exactly work for output, though. The eye can probably take in bits just as fast as Ethernet, but since when do eyeballs emit bits that fast?
      • I wonder what it will look like under a DoS attack.

    • Use BNC; it doesn't have the bandwidth for Cat 5.

      Besides, cats eat guinea pigs.

      LK
  • Nice comparison (Score:4, Interesting)

    by Anonymous Coward on Friday July 28, 2006 @09:49PM (#15803447)
    This is comparable to an Ethernet connection, which transmits information between computers at speeds of 10 million to 100 million bits per second.

    Yes, but we have better encoding.
    • Re:Nice comparison (Score:1, Interesting)

      by Anonymous Coward
      A 1024x768 image at 24 bpp is 18,874,368 bits, so if this article is correct our brain is obviously doing compression. Say the max resolution of the human eye is 576 megapixels [clarkvision.com] and the max color depth is 48 bpp. The largest raw image would then be about 27.6 Gbit, and pushing that through a 10 Mbit/s channel every second implies a compression ratio of about 2765:1.
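      Spelling that arithmetic out (a sketch in Python, taking the 576-megapixel and 48 bpp figures at face value and using the article's 10 Mbit/s channel):

          # A 1024x768 frame at 24 bits per pixel.
          frame_bits = 1024 * 768 * 24          # 18,874,368 bits

          # Hypothetical "maximum" human image: 576 Mpixel at 48 bpp.
          max_image_bits = 576_000_000 * 48     # ~27.6 Gbit

          # Pushing that through a 10 Mbit/s channel every second
          # implies a compression ratio of roughly:
          ratio = max_image_bits / 10_000_000   # ~2765:1
          print(frame_bits, max_image_bits, ratio)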
         
      • Re:Nice comparison (Score:3, Informative)

        by SnowZero ( 92219 )
        It's not so much compression as having variable resolution. The center of the retina has a much higher resolution than the periphery. Look about 20 degrees away from some words and try to read them (without looking back). It takes some practice to focus your attention somewhere other than where you are looking, but it's not too bad once you get the hang of it. You'll notice that there is very little detail indeed beyond about 5 degrees from the center of focus. The problem with TV and monitor displa
        • .....Overall the human visual system is absolutely amazing, and we have a long way to go to catch up with it......

          Even so, these most smart, intelligent scientists try to tell us that this amazing system came about without the application of a superb designer of great intelligence, but by random statistical processes.
      • .......The largest raw image would then be about 27.6 Gbit, and pushing that through a 10 Mbit/s channel every second implies a compression ratio of about 2765:1.......

        The cells of the retina do a large amount of very complex signal processing before they pass the optical information onto the brain where it is interpreted. Thus the amount of data the optic nerve must carry is much less than what the visual acuity of the eye receives. Each retinal cell has a sophisticated biological CPU built into it. The compression is nowhere near the same as any digital compressi
      • Hmm... We don't just use variable resolution. We also do the first stages of processing in the retina, so the comparison could be likened to transfer of vector versus bitmap images.
    • Yes, but we have better encoding.
      And modulation too. :)
  • by KerberosKing ( 801657 ) on Friday July 28, 2006 @09:54PM (#15803460)
    I am not sure that thinking of signals from the eye to the brain as working the same way as a computer network is very helpful. I don't think that there is the same sort of contention in a nervous system as there is in Ethernet. Synapses as we understand them today do not appear to have any sort of collision detection. A neuron may be connected to tens of thousands of other neurons in a many-to-one configuration, and the whole process is analog, which is very different from Ethernet frames. Also a single ganglion cell may send "10 million bits" of information, but the optic nerve is made of many such cells in parallel. I would not be surprised if our current estimates are wrong by at least an order of magnitude.
    • I would not be surprised if our current estimates are wrong by at least an order of magnitude.
      An order of magnitude which way?

      I'm just trying to figure out if I should have dodged that station wagon full of porn DVDs or not.

    • Synapses as we understand them today do not appear to have any sort of collision detection.

      You don't need collision detection if the connection is end-to-end and one-way. The reason wired and wireless Ethernet have collision detection is that multiple interfaces are accessing the same channel. If you had multiple eyes on the same optic nerve, you would need collision detection.

      • by Anonymous Coward
        Just to be pedantic here, "wireless Ethernet" does not use collision detection (CSMA/CD). It uses collision avoidance (CSMA/CA, i.e. 802.11), collision mitigation (CDMA, i.e. Navini, etc.), collision prevention (TDMA, polling, and their scheduling kin, i.e. Canopy, etc.), or it's simply FDD (modern expensive point-to-point, or old-school EoAMPS).

        Even current wired Ethernet versions (1G, 10G) have dropped collision detection, opting to go full-duplex exclusively. Also shared cables can now carry multiple
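
        For the curious, the collision-recovery part of classic wired Ethernet is easy to sketch. Here's a toy model of CSMA/CD's truncated binary exponential backoff (just the standard's waiting rule, not a full MAC simulation):

            import random

            def backoff_slots(attempt: int) -> int:
                # After the nth collision, wait a random number of slot
                # times in [0, 2^min(n, 10) - 1]; give up after 16 tries.
                if attempt > 16:
                    raise RuntimeError("excessive collisions, frame dropped")
                return random.randrange(2 ** min(attempt, 10))

            # Example: delays chosen after the first five collisions of a frame.
            print([backoff_slots(n) for n in range(1, 6)])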
    • Also a single ganglion cell may send "10 million bits" of information,

      Re-read what the researchers said, and consider that maybe they've looked into this.
      If we take the article at face value, and divide, it would indicate that a single ganglion cell carries about 8.75 bits per second. So you're suggesting that they are off by several orders of magnitude.

      Despite the differences in how brains and computers relay information, it is perfectly valid to estimate any kind of transmission capacity in bits. How else w
    • The eyes have it....

      Well, I was all eyes for the article I partly read yesterday or early this am.

      Well, even IF the eyes transmit like a network, the eye study is not apples and apples. More like oranges and mangoes.

      How many libraries of eye sockets worth of information is that?

      I mean, look at the size of the Guinea pig's eye. Of COURSE it transmits less energy. I mean more data to the brain. It's not as if it enlarges to accommodate more data. Hell, the human eye is probably 10 times LARGER without expandi
      • EYE was of the impression that it would have been funny, but EYE see interesting as an enlEYEtening and eyepropriate substitute....

        (man, slash's word generator is too funny: image word: "knifed".. I'm thinking corrective surgery now... considering this topic...)
    • One obviously forgets about spirituality. Science has made some steps toward setting limits on whereabouts "the spirit" is within the body, and with the newer theories of quantum physics they don't believe it is just the brain anymore. There is the duality of quantum particles that appear to exist in two physical locations at once but are still considered one object. Quantum theorists have also suggested that "the spirit" actually is like that, in which a part is in every cell like a node. On that note, the br
  • Inaccurate blurb. (Score:3, Interesting)

    by pikine ( 771084 ) on Friday July 28, 2006 @09:56PM (#15803472) Journal

    anthemaniac writes:

    "In the blink of an eye, you can transfer files from one computer to another using Ethernet. And in the same amount of time, your eye sends signals to the brain. A study finds that images transferred to the brain and files across an Ethernet network take about the same amount of time."

    The amount of time it takes to transmit data over a network depends on the bandwidth-delay product (round-trip time times bandwidth), which determines the TCP window size that optimizes the send/ack of data packets. You also need to take collisions into account.

    The ganglion cells are probably more analogous to a link transmitter. The measurement is of the amount of information generated by these cells per second. The proper conclusion is that you could probably use Ethernet to connect your eyes to your brain, since the required bandwidth is supported.
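
    To make the bandwidth-delay point concrete, here's a quick calculation with hypothetical numbers (100 Mbit/s path, 50 ms round trip; neither figure is from TFA):

        # Bandwidth-delay product: the data that must be "in flight" to keep
        # the link busy, which is what the TCP window has to cover.
        bandwidth_bps = 100_000_000   # assumed 100 Mbit/s path
        rtt_s = 0.05                  # assumed 50 ms round-trip time

        bdp_bits = bandwidth_bps * rtt_s    # 5,000,000 bits in flight
        window_bytes = bdp_bits / 8         # 625,000 bytes, ~610 KiB
        print(bdp_bits, window_bytes)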

    • It's worse than that! "In the blink of an eye, you can transfer"? I can't see when I am blinking!
    • The article also doesn't mention the other crucial time factor, beyond transmission speed: the lag between unconscious perception (when the signal from the eye has reached the brain) and conscious perception (when you are aware that you see something). This lag of roughly half a second was first measured in the 1970s by psychologist Benjamin Libet. We don't sense any lag, though, because we automatically antedate the experience of our sensory inputs, pushing it all a half-second into the past, and thus expe

  • but, my eyes aren't encoding with divx ;)

    So, if what I see has roughly the same throughput as Ethernet...

    Maybe I should rub my eye against a Cat 5 cable connected to a computer!
    • Actually, "encoding with divx" is surprisingly close to what your eyes are doing with every scene and every image they look at. There are neural mechanisms in the eye that perform edge detection and motion detection, not bothering to transmit information about areas of a single colour, so that the information that is actually transmitted to the brain is a heavily simplified, stylised, and encoded version of the image projected onto the retina. Indeed, one of many remarkable aspects of the human visual syste
  • Oh.... (Score:3, Funny)

    by ElephanTS ( 624421 ) on Friday July 28, 2006 @10:22PM (#15803564)
    I was wondering what that RJ45 socket on my head was for. My kids will probably be wireless.
  • by cryptonix ( 163498 ) on Friday July 28, 2006 @10:24PM (#15803575)
    Arr! They only get 10/half.
  • by BilZ0r ( 990457 ) on Friday July 28, 2006 @10:32PM (#15803609) Homepage
    The OP doesn't say that a single retinal cell transmits 10 million bits a second, but that the whole eye does. On top of that, while discussion of collision detection is pointless, thinking about the information a neuronal population can encode does have some merit. Although it's relatively pointless (at least for now) to compare the eye to an Ethernet, it has uses in comparing different neural populations.

    The problem is that getting bitrates for neuronal populations is more of an art than a science. The sum total of information passed on by a neuron cannot be computed simply from its spiking rate. Large numbers of parameters alter the actual chemical I/O relationship of a neuron: resting membrane potential before spiking, whether it shows short-term facilitation/depression, etc.
  • Did anyone else read "100,000 gajillion cells"?
  • My Ethernet is the same speed as my eyes. My eyes can see my Ethernet. My eyes can see a duck. Therefore, if my Ethernet weighs the same as a duck, it's a witch!
  • it looks like it's running slow.
  • So how many guinea pigs would it take to see all the data in the Library of Congress in one second?

    Google doesn't seem to have this conversion (yet).

  • These eyes run Gigabit... What now?
  • AT&T (Score:3, Funny)

    by richdun ( 672214 ) on Friday July 28, 2006 @11:18PM (#15803750)
    Oh great, now AT&T is going to charge me more to see certain things than others. Stupid eye neutrality.

    (let's see how many pick up on the joke here...)
    • Ya, I get the joke :> But of course, that's assuming AT&T owns your eyeballs. That would be quite the pact with the devil, eh?
  • ...and all this time I've visualized a series of tubes.
  • Um yeah, I dunno... (Score:3, Interesting)

    by DavidD_CA ( 750156 ) on Friday July 28, 2006 @11:37PM (#15803813) Homepage
    Here's how I look at it... the human eye has a "resolution" far greater than that which any monitor supports, and certainly greater than any streaming video I have ever seen.

    Add to that the color depth of the human eye. Granted, not 16 M colors, but still pretty high.

    The frame rate of the average human eye is somewhere around 40 fps, I believe. Again, faster than what most streaming videos offer.

    Then double all that, 'cause we got two eyes.

    I'm pretty sure the "bandwidth" between my eyes and brain is a little faster than even the best ethernet connection... at least anything that I've seen demonstrated so far.
    • The frame rate of an eye isn't governed by the eye itself - given that the transfer is analog, the frame rate is governed by what the visual cortex can handle. Which, if you have done any animation, is about 25 frames per second - a speed at which the human visual cortex cannot perceive the individual frames making up an animation.

      I doubt the data transfer rate of any nerve is anywhere near the transfer speed of ethernet - I read an article once stating that human nerve tissue could transmit information at abou
      • It's not that simple. Neurons in the eye aren't read synchronously, as in monitors. Every neuron fires about 4 times per second, but at any given time there are many neurons firing. So this bandwidth is calculated by measuring how often neurons fire and how many of them are working. And you DO NOT have the resolution of your monitor. If you did, you could see all of its pixels at once. You have greater pixel density only in the central spot.
        • And you DO NOT have the resolution of your monitor.

          Actually you do. However, you don't normally press your eyeball against the glass. At a distance of about .5m, which is normal, your eyes don't have sufficient resolution any more. If you move your head up close to the glass, you should be able to perceive the individual pixels. It's important to remember that there are three dimensions here. The expected viewing distance determines the necessary dpi of the viewing device.
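
          For a rough feel for the numbers, here's a sketch of the pixel density needed for pixels to blur together at a given distance, assuming an idealised ~1 arcminute of visual acuity (the textbook 20/20 figure; real acuity varies):

              import math

              def required_ppi(distance_m, acuity_arcmin=1.0):
                  # Pixel pitch that subtends one arcminute at this distance.
                  pitch_m = distance_m * math.tan(math.radians(acuity_arcmin / 60))
                  return 0.0254 / pitch_m   # convert metres to pixels per inch

              print(round(required_ppi(0.5)))   # ~175 ppi at half a metre
              print(round(required_ppi(0.1)))   # ~873 ppi with your nose on the glass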

    • ....I'm pretty sure the "bandwidth" between my eyes and brain is a little faster than even the best ethernet connection......

      The data sent to the brain by the eye is not nearly as large as you would assume from the amount of light that actually enters the eye. The light-sensitive cells of the retina are coupled to an underlying system of cells, richly supplied with blood. Each cell has what amounts to a CPU and program which processes data locally. This concentrated data is collected and further processed by the connector ce
    • The frame rate of the average human eye is somewhere around 40 fps

      Human eyes operate asynchronously; they don't have a fixed flicker rate like that. As such, they can spot things that happen much faster than 40 fps, but only if those things are large enough - it's all dependent on the number of receptors that can detect the change.

      For example: if you had a 50fps video camera, and you flashed an image at it for 1/50th of a second, then the camera probably wouldn't detect anything at all - the image most likel


    • Add to that the color depth of the human eye. Granted, not 16 M colors, but still pretty high.


      I would argue that the color depth of the human eye is much better than 16 M colors. Particularly when one color "fades" into another, I routinely see "color lines" where two very similar colors meet. 16 M colors is sufficient where there are a lot of contrasting colors and/or complex patterns. Heck, make the picture sufficiently contrasted/complex and 256 colors can do a decent job!

      What's hard is the reali
  • GPE = Guinea Pig Equivalent
  • by sbaker ( 47485 ) * on Friday July 28, 2006 @11:42PM (#15803832) Homepage
    The numbers presented here are very misleading. You get the impression that your eyes are transferring video images as a bunch of pixels at the relatively slow speed of an Ethernet connection. But that's not true. Video processing starts right there in the retina and steadily changes the data from pixel-like data to edges, lines, and shapes, to recognised objects, to high-level concepts that are conveniently tagged with memories, emotions and other relevant data.

    At what point are we measuring the data? If the data that's actually being measured is something like "My Mom standing next to a table with a vase full of flowers on it" - then having 10 Mbits/sec is a heck of a lot of data. If it's raw video - then it's pathetically little.

    We can estimate the bandwidth your eyes could theoretically produce if they were transmitting "raw video". We know that the retina has a resolution of around 5k x 5k "pixels", that we can see motion at around 60Hz, and that we have more dynamic range than we can display, at 12 bits each for Red, Green and Blue. So at the 'most raw', two eyes would require 5k x 5k x 60Hz x 2 x 12 x 3 bits per second. That's 108 Gbits/sec - which is vastly more than the 10Mbits to 100Mbits this article suggests. You can argue about the details of the numbers I used here - but we're looking at four orders of magnitude - so I'd have to be a LOT wrong!
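
    In code form, with the same assumed numbers:

        pixels = 5000 * 5000    # assumed retinal "resolution"
        fps = 60                # assumed motion-perception rate
        eyes = 2
        bits_per_channel = 12   # assumed dynamic range per R/G/B channel
        channels = 3

        raw_bps = pixels * fps * eyes * bits_per_channel * channels
        print(raw_bps)                 # 108,000,000,000 bits/s = 108 Gbit/s
        print(raw_bps // 10_000_000)   # 10,800x the article's 10 Mbit/s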

    So it's pretty certain that what they are measuring in TFA is some kind of condensed or summarized version of the visual data.

    That being the case, it's pretty silly to be comparing "My Mom standing next to a table with a vase full of flowers on it" to a 640x480 JPEG file. It's simply not an 'apples and apples' comparison.
    • They measured raw video output directly from the light-sensitive cells. And it's not that every pixel is refreshed at 60Hz. It's rather (if you RTFA) 4Hz for each "pixel". But during each millisecond there are many pixels transferring, so we perceive it as a continuous transmission.
  • ... that my brain is made up of a series of tubes!
  • So that's how they get those digital eyes in GITS (Kokaku Kidotai) [manga.com]
  • Clearly we're not even comparing apples and oranges here, we're comparing apples and pianos. But it is amusing nonetheless.

    Reminds me of a few years back when Apple advertised their latest computers as being "Faster than light". This tagline was withdrawn a few weeks later under attack from the nerd community, but not before some mac fanboys created the amusing argument that the computer completes a floating point calculation in less time than it takes the light from the monitor to reach your eyes. By th
  • I've got gigabit ethernet on my computer, so I'm obviously going to have to upgrade to gigabit retinal nerves. Do they sell those on NewEgg?
  • They failed to mention that a computer sees a one-inch red square as about 158,544 bits, whereas the human mind sees it as just one small red square.....
  • Computers and the biological world handle data differently. Computers use digital data. Humans perceive the world in an analog fashion. So we can view a picture at roughly the same speed as you can transfer it across a network. Now change that to text, and I'll bet your calculations are way off; Ethernet will transmit text many times faster than the human eye can read it. There's no point in figuring out how many "bits" the human eye can read in a second, because the human eye doesn't read bits. This is j
  • by ltwally ( 313043 ) on Saturday July 29, 2006 @02:34AM (#15804443) Homepage Journal
    Who cares about the transfer speed? What I want to know is what kind of ping I'm getting.
  • How long before I can upgrade my internet tubes to guinea pigs?
  • And I was looking forward to getting a gigabit interface installed in my skull once cyberjacks became available too :p
  • Pr0n is downloaded twice as much as previously calculated.
  • Comment removed based on user account deletion
    • Ever tried pouring beer over your router/server/computer/etc.?

      And that experiment might be even more fun and exciting when all our routers and ethernet networks are replaced with guinea pigs!

      Makes me wonder when there will be a new KDE network app coming out called Kuinea Pig or something...
  • Yes, but this is how many LoCpf (Library of Congress per fortnight)?
  • The cluster, not the corporation necessarily. Although that would be an interesting question too.
  • A bit in computer science has a well-defined meaning - a binary digit with a value of 1 or 0. What's a "bit" of information in the brain? I don't think we store data digitally...
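
    To be fair, the researchers presumably mean bits in the Shannon sense: a unit of information, not a stored binary digit. Any signal with measurable statistics has an entropy in bits, analog or not. A minimal sketch (the firing-pattern probabilities below are made up for illustration, not from the paper):

        import math

        def entropy_bits(probabilities):
            # Shannon entropy: average information per symbol, in bits.
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        # Hypothetical distribution over four firing patterns in a time window.
        print(entropy_bits([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits per window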
  • Did anyone else read "100,000 ganglion" as "100,000 gajillion" as in THE LARGEST NUMBER EVER CREATED BY MAN?

    I mean .. uh .. my brain's ethernet wasn't fast enough to stop me from making a stupid comment! Where's my firewall!
  • Putting a transfer rate on the eye-brain interlink seems tenuous at best. Our brains don't transfer "bits"; they use a compression technology that is more about conceptual patterns and less about adjacent colors, and I'm fairly certain (at least in my case) that our brains lose more information than the average internet link. But why make the comparison? Is it just to continue a long line of "we're just computers"?

    We're not; we're much more complex, but this isn't going to stop the likes of "futurists"
  • Nonsense, of course ---- the amount of bits of information is completely irrelevant in this comparison - it's like comparing kilograms and pounds on a one-to-one scale. The reason is that to see that Wonderful World around them, those pigs' visual systems have to process those bits of *information* over multiple layers of neural networks, some of them recursive; then it goes into the mind and... over the hills and far far away we have our Real World. Knowledge about its existence and properties. The gig. The feelin
  • To get a better idea of what the eye really "sees", you need to look at what it would take to generate a realistic video replacement, like IMAX.

    IMAX resolution is ~10,000 x 7,000 pixels at 24-bit color @ 24 fps = 40,320,000,000 bits/sec ≈ 40 gigabit/sec

    And even IMAX resolution falls far short of reality, perhaps by as much as a factor of 10. So while the eye may not see at a rate of 40 gigabit/sec, the brain can "see" and process at that rate.
