Visualizing Ethernet Speed 140
anthemaniac writes "In the blink of an eye, you can transfer files from one computer to another using Ethernet. And in the same amount of time, your eye sends signals to the brain. A study finds that an image traveling from the eye to the brain and a file traveling across an Ethernet network take about the same amount of time." From the article: "The researchers calculate that the 100,000 ganglion cells in a guinea pig retina transmit roughly 875,000 bits of information per second. The human retina contains about 10 times more ganglion cells than that of guinea pigs, so it would transmit data at roughly 10 million bits per second, the researchers estimate. This is comparable to an Ethernet connection, which transmits information between computers at speeds of 10 million to 100 million bits per second."
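A back-of-envelope check of the article's arithmetic, in Python. The constants come straight from the summary above; the Ethernet line rates are the classic 10BASE-T and 100BASE-TX speeds:

```python
GUINEA_PIG_GANGLION_CELLS = 100_000
GUINEA_PIG_RETINA_BPS = 875_000      # bits per second, per the study
HUMAN_SCALE_FACTOR = 10              # ~10x more ganglion cells in humans

per_cell_bps = GUINEA_PIG_RETINA_BPS / GUINEA_PIG_GANGLION_CELLS
human_retina_bps = GUINEA_PIG_RETINA_BPS * HUMAN_SCALE_FACTOR

print(f"Per ganglion cell: {per_cell_bps:.2f} bits/s")        # 8.75 bits/s
print(f"Human retina: ~{human_retina_bps / 1e6:.2f} Mbit/s")  # 8.75 Mbit/s
for name, bps in [("10BASE-T", 10e6), ("100BASE-TX", 100e6)]:
    print(f"  {name}: {human_retina_bps / bps:.1%} of line rate")
```

Strictly, 875,000 x 10 is 8.75 Mbit/s; "roughly 10 million bits per second" is the article's rounding.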
So if I plug enough CAT5 cables into it... (Score:5, Funny)
Re:So if I plug enough CAT5 cables into it... (Score:5, Funny)
Re:So if I plug enough CAT5 cables into it... (Score:2, Funny)
Re:So if I plug enough CAT5 cables into it... (Score:2)
Re:So if I plug enough CAT5 cables into it... (Score:2)
Re:So if I plug enough CAT5 cables into it... (Score:2, Insightful)
720 * 576 = 414,720 lu
Re:So if I plug enough CAT5 cables into it... (Score:2)
Re:So if I plug enough CAT5 cables into it... (Score:5, Insightful)
Remember that humans don't see a pixel-per-pixel representation of the world. We see a tight spot of color and detail in the center of our retina (the fovea? Bio-types please correct me) surrounded by blurry shapes and lines. Around the edges, in peripheral vision, we don't even see color, just luminance.
Proof? Take a bright LED lamp and move it into your peripheral vision. What color is it, not from memory, but just from looking at it?
The foveal area of the retina is more densely packed with cells and blood vessels. It has more cones -- the chroma-type cells -- than rods.
This indistinct image is inverted and assembled into a whole by the brain, which carefully processes different shapes, lines, movement, flickering, and what-not to produce what you THINK you see. The brain fills in pieces of the image that don't have enough detail, frequently from memory.
This is why optical illusions work. You deceive the biological mechanisms that process the image into producing bad data by giving them a skewed sample of the image.
Also, neural mechanisms are asynchronous and really can't be measured as a kb/s rate. You perceive a flicker of motion one second and then a spot of color the next. This is assembled into a ball that you turn to face -- to get a better image -- and then catch. Your brain has a lot of built-in firmware to do image manipulation, but you have to 'learn' the software necessary to do pattern matching and response over your lifetime.
You only get a few bits' worth of information in the first few milliseconds of recognizing the ball, and then many megabytes' worth over the last few seconds.
Another thing... as sensitive and immersive as vision is, your ears probably take in much, much more data. They have vastly more dynamic range. Most people don't even notice themselves filling in visual information with audio information, but it does happen.
For example, you hear a person's voice, and you *think* you see their face.
Close your eyes when talking to someone, especially when in a group. Note how easy it is to visualize faces just from hearing voices.
I'm not denying that the brain has massive throughput from the senses, but you really shouldn't try to measure it in digital terms. It's all analogue.
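A toy model of the foveated sampling the parent describes, in Python. The falloff law and every number here (peak acuity ~30 cycles/degree, E2 ~2 degrees, ~90 degree field radius) are my own illustrative assumptions, not anything from the study:

```python
import math

# Toy foveated-sampling model: linear acuity falls off as 1/(1 + e/E2)
# with eccentricity e. All parameters are illustrative assumptions.
E2_DEG = 2.0             # eccentricity at which acuity has halved
PEAK_ACUITY_CPD = 30.0   # cycles/degree at the fovea
FIELD_RADIUS_DEG = 90.0

def samples_in_ring(ecc_deg: float, width_deg: float) -> float:
    """Samples needed in a thin ring at the given eccentricity (Nyquist)."""
    acuity = PEAK_ACUITY_CPD / (1.0 + ecc_deg / E2_DEG)   # cycles/degree
    density = (2.0 * acuity) ** 2                          # samples/deg^2
    return density * 2.0 * math.pi * ecc_deg * width_deg

foveated = sum(samples_in_ring(i * 0.1 + 0.05, 0.1) for i in range(900))
uniform = (2.0 * PEAK_ACUITY_CPD) ** 2 * math.pi * FIELD_RADIUS_DEG ** 2

print(f"Foveated: {foveated / 1e6:.2f} M samples")   # ~0.26 M
print(f"Uniform:  {uniform / 1e6:.1f} M samples")    # ~91.6 M
```

Under these assumptions, foveation cuts the sample budget by more than two orders of magnitude, which is the parent's point: the brain never handles a full-resolution frame.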
Re:So if I plug enough CAT5 cables into it... (Score:1)
Re:So if I plug enough CAT5 cables into it... (Score:2)
Re:So if I plug enough CAT5 cables into it... (Score:2)
The bottleneck in the transmission is the optic nerve, not the retina. Because of this, the retina does an immense amount of pre-processing and filtering before the information is even dispatched to your brain.
What this article describes is akin to gauging the speed of transmission by assuming a bitmap file is sent through the tubes, when really it's a JPEG-compressed version of the bitmap.
These computer analogies to brain function are interesting, and ins
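A rough way to see that bitmap-vs-JPEG point, using generic lossless compression as a stand-in (zlib is only an analogy here; the real retinal code is lossy and far more sophisticated, and the gradient image is a made-up stand-in for redundant natural input):

```python
import zlib

# A highly redundant synthetic "image": measuring the compressed stream
# tells you little about how much scene information actually gets through.
width, height = 640, 480
raw = bytes(((x + y) // 8) % 256 for y in range(height) for x in range(width))

compressed = zlib.compress(raw, 9)
print(f"Raw bitmap:  {len(raw):,} bytes")
print(f"Compressed:  {len(compressed):,} bytes "
      f"({len(compressed) / len(raw):.1%} of raw)")
```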
Re:So if I plug enough CAT5 cables into it... (Score:2)
Blah, so all you're saying is we have VBR. No big deal :P
All kidding aside, a very insightful post.
Re:So if I plug enough CAT5 cables into it... (Score:2)
Re:So if I plug enough CAT5 cables into it... (Score:3, Interesting)
This came up in another discussion a while back, but I suspect that even an average digital camera with a good, wide-angle lens probably captures in a single frame more raw information than the human eye does from the same vantage point in a single glance.
You only think that your eye is a really good camera. In reality, it might be pretty bad -- I su
Re:So if I plug enough CAT5 cables into it... (Score:1)
Re:So if I plug enough CAT5 cables into it... (Score:1)
It's pretty fasc
Re:So if I plug enough CAT5 cables into it... (Score:1)
Moderation -1
100% Overrated
I add an idea and I'm Overrated? What is wrong with the moderators?
Re: (Score:1)
Re:So if I plug enough CAT5 cables into it... (Score:1)
Thanks. Now I have a sig.
Re:So if I plug enough CAT5 cables into it... (Score:1)
Besides, cats eat guinea pigs.
LK
Re:Ahh, the joys of Sysadmin's Day... (Score:2)
Nice comparison (Score:4, Interesting)
Yes, but we have better encoding.
Re:Nice comparison (Score:1, Interesting)
Re:Nice comparison (Score:3, Informative)
Re:Nice comparison (Score:2)
Even so, most smart, intelligent scientists try to tell us that this amazing system came about without the application of a superb designer of great intelligence, but by some random statistical processes.
Re:Nice comparison (Score:2)
The cells of the retina do a large amount of very complex signal processing before they pass the optical information on to the brain, where it is interpreted. Thus the amount of data the optic nerve must carry is much less than what the eye's visual acuity takes in. Each retinal cell has a sophisticated biological CPU built into it. The compression is nowhere near the same as any digital compressi
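For the curious, one standard textbook model of that per-cell processing is the center-surround receptive field, often approximated as a difference of Gaussians. A minimal sketch (the sigmas are illustrative, not measured values):

```python
import math

def gauss(d: float, sigma: float) -> float:
    """Normalized 1-D Gaussian."""
    return math.exp(-d * d / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def dog_weight(d: float, sigma_center: float = 1.0, sigma_surround: float = 3.0) -> float:
    """Excitatory narrow center minus inhibitory broad surround."""
    return gauss(d, sigma_center) - gauss(d, sigma_surround)

# Uniform illumination nearly cancels (almost nothing to transmit), while
# an edge or spot drives a strong response -- compression at the sensor.
flat = sum(dog_weight(d) for d in range(-12, 13))
print(f"Flat-field response:    {flat:.4f} (~0)")
print(f"Centered-spot response: {dog_weight(0):.4f}")
```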
Re:Nice comparison (Score:2)
Re:Nice comparison (Score:1)
Re:Nice comparison (Score:2)
Neuroscience != Computer Science (Score:5, Informative)
Re:Neuroscience != Computer Science (Score:2)
I'm just trying to figure out if I should have dodged that station wagon full of porn DVDs or not.
collision detection (Score:2)
You don't need collision detection if the connection is end-to-end and one-way. The reason wired Ethernet uses collision detection (and wireless uses collision avoidance) is that multiple interfaces access the same channel. If you had multiple eyes on the same optic nerve, you would need collision detection.
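For reference, a minimal sketch of the collision handling classic shared-channel Ethernet actually uses (CSMA/CD with truncated binary exponential backoff; the 51.2 µs slot time is the 10 Mbit/s figure):

```python
import random

SLOT_TIME_US = 51.2   # 512 bit times at 10 Mbit/s

def backoff_delay_us(attempt: int) -> float:
    """After the nth collision, wait a random number of slot times."""
    k = min(attempt, 10)                 # exponent is capped at 10
    return random.randint(0, 2 ** k - 1) * SLOT_TIME_US

for attempt in range(1, 5):              # senders give up after 16 attempts
    print(f"collision #{attempt}: wait {backoff_delay_us(attempt):.1f} us")
```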
Re:collision detection (Score:2, Informative)
Even current wired Ethernet versions (1G, 10G) have dropped collision detection, opting to go full-duplex exclusively. Also shared cables can now carry multiple
Re:Neuroscience != Computer Science (Score:1)
Re-read what the researchers said, and consider that maybe they've looked into this.
If we take the article at face value, and divide, it would indicate that a single ganglion cell carries about 8.75 bits per second. So you're suggesting that they are off by several orders of magnitude.
Despite the differences in how brains and computers relay information, it is perfectly valid to estimate any kind of transmission capacity in bits. How else w
Re:Neuroscience != Computer Science (Score:3, Interesting)
Well, I was all eyes for the article I partly read yesterday or early this am.
Well, even IF the eyes transmit like a network, the eye study is not apples to apples. More like oranges and mangoes.
How many libraries of eye sockets worth of information is that?
I mean, look at the size of the guinea pig's eye. Of COURSE it transmits less energy. I mean, less data to the brain. It's not as if it enlarges to accommodate more data. Hell, the human eye is probably 10 times LARGER without expandi
Re:Neuroscience != Computer Science Well (Score:2)
(man, slash's word generator is too funny: image word: "knifed".. I'm thinking corrective surgery now... considering this topic...)
Re:( Neuroscience + Spirituality) !~ CS (Score:1)
Inaccurate blurb. (Score:3, Interesting)
anthemaniac writes:
The amount of time it takes to transmit data over a network depends on the round-trip time and the bandwidth-delay product, which determines the TCP window size that optimizes the send/ack of data packets. You also need to take collisions into account.
The ganglion cells are probably more analogous to a link transmitter. The measurement is of the amount of information generated by these cells per second. The proper conclusion is that you could probably use Ethernet to connect your eyes to your brain, and the required bandwidth would be supported.
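The round-trip arithmetic mentioned above, as a quick sketch (the link speed and RTT are illustrative numbers, not anything from the article):

```python
def optimal_window_bytes(bandwidth_bps: float, rtt_s: float) -> float:
    """Bandwidth-delay product: bytes 'in flight' needed to keep the pipe full."""
    return bandwidth_bps * rtt_s / 8

bdp = optimal_window_bytes(100e6, 0.020)   # 100 Mbit/s link, 20 ms RTT
print(f"Bandwidth-delay product: {bdp / 1024:.0f} KiB")   # ~244 KiB
# A TCP window smaller than this caps throughput at window/RTT,
# no matter how fast the raw link is.
```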
Re:Inaccurate blurb. (Score:1)
The consciousness time lag (Score:2)
The article also doesn't mention the other crucial time factor, beyond transmission speed: the lag between unconscious perception (when the signal from the eye has reached the brain) and conscious perception (when you are aware that you see something). This lag of roughly half a second was first measured in the 1970s by psychologist Benjamin Libet. We don't sense any lag, though, because we automatically antedate the experience of our sensory inputs, pushing it all a half-second into the past, and thus expe
well well well (Score:1)
So, if what I see has roughly the same throughput as Ethernet...
Maybe I should rub my eye against a Cat 5 cable connected to a computer!
Re:well well well (Score:1)
Oh.... (Score:3, Funny)
but what about pirates with eye patches?! (Score:3, Funny)
Re:but what about pirates with eye patches?! (Score:1)
Re:but what about pirates with eye patches?! (Score:2)
Re:but what about pirates with eye patches?! (Score:4, Funny)
With parroty?
Neurons make my head hurt (Score:3, Informative)
The problem is that getting bitrates for neuronal populations is more of an art than a science. The sum total of information passed on by a neuron cannot be computed simply from its spiking rate. Large numbers of parameters alter the actual chemical I/O relationship of a neuron: resting membrane potential before spiking, whether it shows short-term facilitation/depression, etc...
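To illustrate how slippery this is, here is the crudest possible upper bound -- treating a spike train as independent binary time bins -- which swings wildly with the assumed bin size and firing rate (both illustrative guesses here):

```python
import math

def spike_train_entropy_bps(rate_hz: float, bin_ms: float) -> float:
    """Entropy rate of a Bernoulli spike train: an upper bound, nothing more."""
    dt = bin_ms / 1000.0
    p = rate_hz * dt                                    # spike probability per bin
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # bits per bin
    return h / dt

for rate in (5, 20, 100):                               # plausible firing rates, Hz
    print(f"{rate:3d} Hz, 1 ms bins: {spike_train_entropy_bps(rate, 1.0):6.1f} bits/s")
# All of these sit far above the ~8.75 bits/s per cell implied by the study,
# suggesting the measured figure reflects redundancy and correlations,
# not raw channel capacity.
```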
My Eyes... (Score:1)
Re:My Eyes... (Score:2)
So... Logically (Score:2, Funny)
Re:So... Logically (Score:2)
I got better.
Re:So... Logically (Score:2)
"More eye of newt."
"If it were up to you it would be nothing but newt eyes."
It loops back to the eye analogy in TFA.
Tried to read TFA, but (Score:1)
Conversion factor - Guinea Pigs to LOC (Score:2)
Google doesn't seem to have this conversion (yet).
Re: (Score:2)
Gigabit (Score:1)
AT&T (Score:3, Funny)
(let's see how many pick up on the joke here...)
Re:AT&T (Score:2)
Darn it... (Score:1)
Um yeah, I dunno... (Score:3, Interesting)
Add to that the color depth of the human eye. Granted, not 16 M colors, but still pretty high.
The frame rate of the average human eye is somewhere around 40 fps, I believe. Again, faster than what most streaming videos offer.
Then double all that, 'cause we got two eyes.
I'm pretty sure the "bandwidth" between my eyes and brain is a little faster than even the best Ethernet connection... at least anything that I've seen demonstrated so far.
Re:Um yeah, I dunno... (Score:2, Informative)
I doubt the data transfer rate of any nerve is anywhere near the transfer speed of ethernet - I read an article once stating that human nerve tissue could transmit information at abou
Re:Um yeah, I dunno... (Score:1)
Re:Um yeah, I dunno... (Score:3, Informative)
Actually, you do. However, you don't normally press your eyeball against the glass. At a distance of about 0.5 m, which is normal, your eyes no longer have sufficient resolution. If you move your head up close to the glass, you should be able to perceive the individual pixels. It's important to remember that there are three dimensions here. The expected viewing distance determines the necessary dpi of the viewing device.
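That distance-to-dpi relationship is easy to quantify if you assume the standard ~1 arcminute limit of human acuity (individual eyes vary):

```python
import math

def required_dpi(viewing_distance_m: float, acuity_arcmin: float = 1.0) -> float:
    """Pixel density at which adjacent pixels become indistinguishable."""
    theta = math.radians(acuity_arcmin / 60.0)     # visual angle per pixel
    pixel_size_m = viewing_distance_m * math.tan(theta)
    return 0.0254 / pixel_size_m                   # one inch / pixel size

for d in (0.1, 0.5, 3.0):
    print(f"at {d} m: ~{required_dpi(d):.0f} dpi")
# ~873 dpi with your nose at the glass, ~175 dpi at half a meter,
# ~29 dpi across the room.
```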
Re:Um yeah, I dunno... (Score:2)
The amount of data the eye sends to the brain is not nearly as large as you'd assume from the amount of light that actually enters the eye. The light-sensitive cells of the retina are coupled to an underlying system of cells, richly provided with blood. Each cell has what amounts to a CPU and a program which processes data locally. This concentrated data is collected and further processed by the connector ce
Re:Um yeah, I dunno... (Score:2)
Human eyes operate asynchronously; they don't have a fixed flicker rate like that. As such, they can spot things that happen much faster than 40 fps, but only if those things are large enough - it's all dependent on the number of receptors that can detect the change.
For example: if you had a 50fps video camera, and you flashed an image at it for 1/50th of a second, then the camera probably wouldn't detect anything at all - the image most likel
Re:Um yeah, I dunno... (Score:2)
Add to that the color depth of the human eye. Granted, not 16 M colors, but still pretty high.
I would argue that the color depth of the human eye is much better than 16 M colors. Particularly when one color "fades" into another, I routinely see "color lines" where two very similar colors match up. 16 M colors is sufficient where there are a lot of contrasting colors and/or complex patterns. Heck, make the picture sufficiently contrasted/complex and 256 colors can do a decent job!
What's hard is the reali
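The banding the parent describes is easy to put numbers on; a back-of-envelope sketch (the screen width and gradient span are made-up but typical values):

```python
# A subtle gradient that spans only 32 of a channel's 256 levels,
# stretched across a full 1920-pixel-wide screen:
image_width_px = 1920
levels_spanned = 32

band_width = image_width_px / levels_spanned
print(f"Each quantization band is {band_width:.0f} px wide")   # 60 px

# A 60 px stripe of constant color next to a stripe one level off is
# exactly the visible "color line"; at 10 bits/channel the bands shrink
# to ~15 px and largely vanish into dithering.
```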
New router throughput metric (Score:1)
But the data isn't "pixels" or anything.... (Score:5, Insightful)
At what point are we measuring the data? If the data that's actually being measured is something like "My Mom standing next to a table with a vase full of flowers on it" - then having 10 Mbits/sec is a heck of a lot of data. If it's raw video - then it's pathetically little.
We can estimate the bandwidth your eyes could theoretically produce if they were transmitting "raw video". We know that the retina has a resolution of around 5k x 5k "pixels", we can see motion at around 60 Hz, and we have more dynamic range than we can display, with 12 bits each for Red, Green, and Blue. So at the 'most raw', two eyes would require 5k x 5k x 60 Hz x 2 x 12 x 3 bits per second. That's 108 Gbit/s - which is vastly more than the 10 Mbit to 100 Mbit this article suggests. You can argue about the details of the numbers I used here - but we're looking at four orders of magnitude - so I'd have to be a LOT wrong!
So it's pretty certain that what they are measuring in TFA is some kind of condensed or summarized version of the visual data.
That being the case, it's pretty silly to compare "My Mom standing next to a table with a vase full of flowers on it" to a 640x480 JPEG file. It's simply not an apples-to-apples comparison.
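Reproducing the parent's raw-video arithmetic, using the same assumed figures (5k x 5k retina, 60 Hz, two eyes, 12 bits per color channel):

```python
h_px = v_px = 5_000        # assumed retinal "resolution"
refresh_hz = 60
eyes = 2
bits_per_channel = 12
channels = 3               # R, G, B

raw_bps = h_px * v_px * refresh_hz * eyes * bits_per_channel * channels
print(f"Raw video: {raw_bps / 1e9:.0f} Gbit/s")        # 108 Gbit/s
print(f"vs 10 Mbit/s: {raw_bps / 10e6:,.0f}x")         # ~10,800x -- four orders
```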
Re:But the data isn't "pixels" or anything.... (Score:1)
This must mean ... (Score:2)
Hardware puzzle (Score:1)
Fun but irrelevant (Score:1)
Reminds me of a few years back when Apple advertised their latest computers as being "Faster than light". This tagline was withdrawn a few weeks later under attack from the nerd community, but not before some mac fanboys created the amusing argument that the computer completes a floating point calculation in less time than it takes the light from the monitor to reach your eyes. By th
Well then, time for an upgrade. (Score:2)
Umm, A little more info would be nice.. (Score:1)
Apples and oranges (Score:2)
who cares about transfer speed? (Score:3, Funny)
What about the tubes? (Score:2)
I guess no Gigabit Jack. (Score:1)
In that case, (Score:1)
Re: (Score:2)
Too much alcohol can harm computer hardware too (Score:1)
And that experiment might be even more fun and exciting when all our routers and ethernet networks are replaced with guinea pigs!
Makes me wonder when there will be a new KDE network app coming out called Kuinea Pig or something...
Confused I am (Score:1)
So, when does Google become self aware? (Score:1)
I don't think a "bit of data" is the same as a bit (Score:1)
Ganglion? (Score:1)
I mean
Re:Ganglion? (Score:1)
I was all like "O RLY???", slapped myself upside the head, and read it again.
Please don't do this to us.... (Score:2)
We're not; we're much more complex. But this isn't going to stop the likes of "futurists"
Yet Another Overslept Armageddon...SYNTAX ERROR (Score:1)
Compare to IMAX (Score:1)
IMAX resolution is ~10,000 x 7,000 pixels x 24-bit color @ 24 fps = 40,320,000,000 bits/s = ~40 gigabit/sec
And even IMAX resolution falls far short of reality, perhaps by as much as a factor of 10. So while the eye may not see at a rate of 40 gigabit/sec, the brain can "see" and process at that rate.
Re:Security Vulnerability (Score:1)
Re:Security Vulnerability (Score:2)
Re:Security Vulnerability (Score:1)
ps
stop looking at that.
Wrong article! (Score:2)
Re:Yeah, that will teach you to lick your boyfrien (Score:1)
slashdot problem? (ot reply) (Score:2)
Re:Break out the SI units, we have a new one (Score:2)
Re:10 to 100 million? gig and 10 gig (Score:1)
But the point of the article was to provide a way to visualize the (native, uncompressed) bandwidth of the eye, not to draw any comparisons between cutting edge technology and the human eye. Most desktop users are familiar with 10/100 ethernet rather than with gig and 10-gig ethernet, so 10/100 forms a better basis of comparison.
Re:WOO HOOO 8th post (Score:2, Funny)
Why do people do this? (Score:2)
Why, after we've bothered to create this 'community' does someone feel the need to crap all over it like this? What possesses peo