HD-SDI uncompressed video is 1.5Gb/s. That is the standard for moving uncompressed video around inside a TV truck, whether 720p or 1080i. It rises to 3Gb/s if you're doing multiple phases of video (3D video, super slo-mo, etc.). Within that 1.5Gb/s there is still more than enough headroom to embed multiple data streams and channels of audio (8 stereo pairs is the norm; some streams carry up to 16). So I fail to see why 100Gb/s is necessary to transmit uncompressed video.
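To put numbers on that headroom, here's a quick back-of-envelope sketch. The figures are my own assumptions (10-bit 4:2:2 sampling, 1080i at 30 frames/s, 24-bit audio samples), not quoted from any spec table:

```python
# Rough budget for an HD-SDI link: how much of the 1.485 Gb/s the active
# video and the embedded audio actually consume. All figures are
# back-of-envelope assumptions, not spec values.

SDI_RATE = 1_485_000_000          # HD-SDI line rate, bits/s

# Active video: 1920x1080, 4:2:2 at 10 bits = 20 bits/pixel, 30 frames/s
video = 1920 * 1080 * 20 * 30     # ~1.244 Gb/s

# Embedded audio: 16 channels (8 stereo pairs), 48 kHz, 24-bit samples
audio = 16 * 48_000 * 24          # ~18.4 Mb/s

headroom = SDI_RATE - video - audio
print(f"video:    {video / 1e9:.3f} Gb/s")
print(f"audio:    {audio / 1e6:.1f} Mb/s")
print(f"headroom: {headroom / 1e6:.0f} Mb/s")
```

Even with a full 16 channels of audio embedded, the audio is a rounding error next to the video, and there's still a couple hundred Mb/s left over for ancillary data.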
It's also a chicken-and-egg scenario. I'm a broadcast engineer and audio specialist. Ma Bell contacted me about 7 years ago asking how important uncompressed video transmission was, as they were trying to gauge a timeframe for a network rebuild that would allow for it. My answer hasn't changed much in 7 years. Moving uncompressed video from the site to (in the case of Fox) Houston and then back to your local affiliate would be nice, but it's completely unnecessary, because by the time it reaches your house your local cable or satellite operator has compressed that 1.5Gb/s signal down to somewhere between 4Mb/s and 10Mb/s, typically, making the quality gains negligible.
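To see just how hard that final squeeze is, here's the compression ratio between the truck's HD-SDI feed and what the set-top box actually receives (using the delivered bitrates I quoted above):

```python
# Compression ratio from the 1.485 Gb/s HD-SDI feed down to a typical
# delivered cable/satellite stream. Delivered rates are the 4-10 Mb/s
# range mentioned above.
SDI_RATE = 1_485_000_000

for delivered in (4_000_000, 10_000_000):
    ratio = SDI_RATE / delivered
    print(f"{delivered / 1e6:.0f} Mb/s delivered -> roughly {ratio:.0f}:1 compression")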
It will solve one problem, which is image degradation due to multiple passes of compression. Think about it... the 1.5Gb/s leaves our TV truck and gets ASI compressed into 270Mb/s (best case scenario, satellite transmission is significantly lower bandwidth, and most networks don't use an entire 270M circuit, they use less). It then arrives at the network hub, where it gets decompressed. If it's live it then goes through several switchers and graphics boxes, then gets re-compressed to ASI and sent either to another hub or to your local affiliate. (If not live, it gets put into a server which re-compresses the video even harder before playout.) Your local affiliate then decompresses it, it passes through more switchers and graphics boxes, then it gets either broadcast using 8VSB, or it gets re-compressed and passed on to your cable or satellite provider, who then un-compresses it, processes it into MPEG or some other flavor, and re-compresses it into its final 3-12Mb/s data stream for your receiver to decompress one final time.
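The path above is easier to see laid out as a list of hops. The stage labels here are my own shorthand, and I've used 8Mb/s as a stand-in for the final 3-12Mb/s stream; the point is just counting how many lossy compression generations the picture survives:

```python
# The broadcast chain sketched as (stage, operation, bitrate in bits/s).
# Hypothetical labels and representative bitrates, just to count the
# lossy compression generations between the truck and the living room.
chain = [
    ("truck -> uplink",        "compress",   270_000_000),    # ASI, best case
    ("network hub",            "decompress", 1_485_000_000),  # back to HD-SDI for switching
    ("hub -> affiliate",       "compress",   270_000_000),    # re-compressed to ASI
    ("local affiliate",        "decompress", 1_485_000_000),  # more switchers/graphics
    ("affiliate -> provider",  "compress",   270_000_000),    # handed to cable/satellite
    ("provider headend",       "decompress", 1_485_000_000),  # processed into MPEG
    ("headend -> receiver",    "compress",   8_000_000),      # final 3-12 Mb/s stream
]

generations = sum(1 for _stage, op, _rate in chain if op == "compress")
print(f"lossy compression generations before your TV: {generations}")
```

Each of those compress steps re-encodes the artifacts left behind by the previous one, which is exactly the degradation an end-to-end uncompressed path would avoid.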
This would eliminate several compression steps and yield better final image quality, because you're not recompressing compression artifacts over and over and over again. A real 1.5Gb/s video frame looks like staring out a window compared to the nastiness you see when you hit pause on your DVR during a football game (and that's also a best-case scenario: most cable/broadcast/satellite providers ramp the bitrate up to the max for live sports and then set it back down shortly thereafter).
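The window-versus-DVR comparison comes down to bits per pixel. A quick sketch, again with my assumed 1080i30 numbers and a mid-range 6Mb/s guess for a DVR recording:

```python
# Bits available per pixel: raw HD-SDI active video vs. a typical DVR
# recording. Assumes 1080i at 30 frames/s; 6 Mb/s is a mid-range guess
# for the delivered stream, not a measured figure.
pixels_per_sec = 1920 * 1080 * 30

raw_bpp = 1_244_160_000 / pixels_per_sec   # uncompressed 10-bit 4:2:2
dvr_bpp = 6_000_000 / pixels_per_sec       # ~6 Mb/s DVR stream

print(f"raw SDI: {raw_bpp:.1f} bits/pixel")
print(f"DVR:     {dvr_bpp:.3f} bits/pixel")
```

The codec has well under a tenth of a bit per pixel to work with, versus 20 bits per pixel uncompressed, which is why a paused frame falls apart the way it does.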
But the 100Gb/s makes no sense to me. Are you (crazily) overcompensating for latency? Are you sending 100% redundant data for error correction? Why in the world would you need that much overhead? I can't imagine it's to send multiple video feeds; the telco companies don't want you to do that, because then you order fewer circuits from them. Plus you'd want at least two circuits anyway, in case your primary circuit goes down for some reason.
(Side note: The one benefit to a TV truck using Ethernet as a transmission medium is that these circuits are bidirectional. Transmission circuits nowadays are all unidirectional, meaning you need to order more circuits if you need a return video feed, which means higher transmission costs. The ability to send return video, or even confidence return signals, back down the same line would be huge for us and a big money saver.)