The correct term for that is "chicken lips". See here: http://www.youtube.com/watch?v=U34m_XSJaLA
No, you are not. It's totally a rubbish bin.
Incorrect. A single x1 PCI Express 3.0 lane does 8Gb/s. Thunderbolt 1.0 has two 10Gb/s lanes, while 2.0 is exactly the same, except it can aggregate two lanes to deliver a single 20Gb/s connection, just like PCI Express has done all along.
Assuming each of those ports actually connects at full speed to the main system (I'm fairly certain it doesn't, because no one will use it that way, but it could), that's essentially 12 PCIe x1 lanes to play around with. That's not horrible, given you already have the two GPUs accounted for. Of course, those are your GPUs for life, but they'll be decent for a while yet.
And the GPUs are the reason there's not likely full 6-channel Thunderbolt to/from the main board. You've seen an HDMI port, but no others... that's because you're expected to hook your main monitors (and anyone using a system like this with only one monitor is a fool) to Thunderbolt ports. So, in a basic configuration, you'll probably eat two of those right away. And there's a pretty good chance the system has a cross point switch that allows the DisplayPort connection directly from each GPU to be routed directly to one of the Thunderbolt ports, and put it into DisplayPort mode. DisplayPort already supports full 60p 4K displays, no need to wait for a Thunderbolt monitor or live on HDMI alone (1.4 supports 4K at 24p).
I concur... the E5 V2 has already been announced as going to 12 cores. Not out yet... but Apple's certainly waiting on something. As well, you do not see a place in that box for a second processor. Nope, it ain't there.
At 50-100GB per hour for video, things can get really busy, really fast, in a video editing house without large local storage. Particularly if everything's on GigE, which starts getting really slow when you're not the only one schlepping video across it. Users of this for serious media work will need a local Thunderbolt drive in addition to everything else.
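To put rough numbers on that GigE bottleneck (my own illustrative figures, not from any benchmark):

```python
# Rough transfer-time estimate for the scenario above: one hour of
# footage over a single GigE link. The 70% usable figure is my own
# assumption for protocol overhead, and contention from other users
# would only make it worse.
HOUR_OF_VIDEO_GB = 75    # midpoint of the 50-100 GB/hour range
GIGE_GBPS = 1.0          # raw line rate, Gb/s
USABLE = 0.7             # assume ~70% effective after overhead

seconds = HOUR_OF_VIDEO_GB * 8 / (GIGE_GBPS * USABLE)
print(f"{seconds / 60:.0f} minutes to move one hour of footage")
# prints "14 minutes to move one hour of footage"
```

And that's with the wire to yourself. Split it three or four ways and you're waiting the better part of an hour per clip batch, which is exactly why local Thunderbolt storage stops being optional.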
Thunderbolt doesn't store a single bit. Maybe you meant "eSATA is dead"? You're still going to need a place for that Thunderbolt cable to find some storage, and you're probably going to want it to be redundant. Sure smells like a RAID to me. And there's so little storage in the system, external working drives will not be optional.
And for lots of media creation, it's not just the single-stream throughput of the drive, but the aggregate performance across dozens, maybe hundreds of files. RAID isn't always a better answer here than multiple, independent drives. Of course, you can configure most RAID controllers to just do JBOD if that's what you need, but either way, it's still one more box to deal with. Companies may have a SAN, but then you're bottlenecked at GigE.
While it depends on what you're doing, if Apple's still selling to the professional media content market, they'll need some big RAM in this thing. I actually used an 8GB system for some years, with all the RAM I ever needed, doing HD video, electronics CAD, all kinds of things. It was seemingly mundane photo editing that got me to upgrade to 16GB, and has me drooling a little at new systems that'll go to 32GB or more. That's not for a single photo, but when you start shooting with 18-20+Mpixel cameras, in RAW, in HDR, and doing 20-60+ shot panoramas, that memory goes fast. I have individual composited photos well over a gigabyte in size.
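A quick back-of-envelope on why those panoramas eat RAM (assuming 16-bit-per-channel processing; the shot counts are from my own workflow, so treat them as illustrative):

```python
# Rough working-set estimate for a stitched panorama, as described
# above. Assumes 3 channels at 16 bits per channel once the RAW files
# are decoded for editing; real editors also keep undo buffers,
# layers, and intermediate copies, so actual usage is a multiple of
# this.
MPIX_PER_SHOT = 20       # 18-20+ Mpixel cameras
SHOTS = 40               # mid-sized panorama from the 20-60 shot range
BYTES_PER_PIXEL = 6      # 3 channels x 16 bits

working_set_gb = MPIX_PER_SHOT * 1e6 * SHOTS * BYTES_PER_PIXEL / 1e9
print(f"~{working_set_gb:.1f} GB just to hold the source pixels")
# prints "~4.8 GB just to hold the source pixels"
```

Add HDR bracketing on top of that and you multiply by the number of exposures per position, so 16GB disappears faster than you'd think.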
And that's another issue here... PCI Express SSD may be wicked fast, but the sizes available will make it expensive and still totally unsuitable to media work. An external working HDD will be necessary, as well as the LAN, or RAID you're using for archival purposes.... or, oh, never mind, Apple discontinued their SAN years ago.
I was thinking iWastebasket, but either way...
Only the chips that let you occupy two 80MHz channels. That's optional in 802.11ac.
Depends on what you're looking for. Each Thunderbolt cable delivers two 10Gb/s links, each slightly faster than a PCI Express 3.0 x1 link, or the new 20Gb/s single aggregated link, which is, not surprisingly, slightly faster than a PCI Express 3.0 x2 aggregate. That's enough for many things, but not everything.
Yeah, Apple's been all about the "laptop for the desktop". True, they're not managing to sell you monitors along with this, but everything but. The display cards look replaceable, but they're a proprietary design, and might not easily be duplicated by a third party, should Apple ever sell enough of these to make that interesting. Not to mention how tightly controlled the thermal footprint on this must be, dealing with that chimney and single fan.
For Apple fans in general, though, this is good news. This is certainly a more capable computer than anything Apple sells today. There was a very legit concern that Apple under Jobs was throwing all of their pro users under the bus... I mean, the Mac Pro of today (which, presumably, no one will ever buy again) is 3.5 years old, and it's only ever been shipped with an AMD gamer's GPU card from 2009... though, given the drivers, one could always swap in a good one. And they did offer a dual-processor Mac Pro... though the upgrade to a current Xeon is certainly the better move, if you had to choose just one.
If the display connections transmute into actual DisplayPort connections when attached to a DisplayPort device, that's not going to be a problem. And I think they must, given there's only the one HDMI output, probably intended to drive a preview TV/monitor -- that's the usual need for that in a video workstation. DisplayPort links support quad-HD monitors (3840x2160, "4K class") at 60p; HDMI 1.4 only supports quad HD at 30p and full 4K at 24p.
Anyone using professional computer monitors left DVI and HDMI behind a while back. My system here (non-Mac... not a Mac fan for many reasons) runs two 2560x1440 monitors on DisplayPort links, one 1920x1200 on DVI, and an optional 32" HDTV on HDMI. That's not exactly a typical consumer configuration. Keep in mind, Apple left DVI behind long ago, except on the current Mac Pro, and that only because it's only ever been shipped with a consumer GPU from 2009. Thunderbolt was designed to mutate into DisplayPort (the Thunderbolt connector IS the mini-DisplayPort connector), so presumably, any monitors shipped from Apple or used with Apple PCs in recent times will plug right into this new PC.
I'm not really defending Apple here, just laying some facts on y'all.
Well, there are limits. Thunderbolt 2 isn't any faster than Thunderbolt 1, it just allows the two 10Gb/s links to be aggregated, as PCI Express has always allowed. So, assuming they're all independently connected Thunderbolt ports (eg, aggregate throughput of 20Gb/s x 6, each way), this is equivalent to about 15 PCI Express x1 links, in total. Well, that's enough for one external GPU, if you're not working it hard, and some external drives. And given that it's a virtual certainty some of these Thunderbolt ports will be used in DisplayPort mode, I'll bet you don't get anywhere near 120Gb/s to and from (it's a simultaneous two-way channel, just like PCI Express) main memory.
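The lane arithmetic above, spelled out (the assumption that all six ports are independently wired is speculation on my part, as noted):

```python
# Aggregate Thunderbolt throughput vs. PCI Express 3.0 x1 equivalents,
# per the estimate above. Figures in Gb/s, one direction; assumes six
# independently connected ports with two channels each.
TB_CHANNEL_GBPS = 10.0   # Thunderbolt 1/2 channel rate
PCIE3_X1_GBPS = 8.0      # PCIe 3.0 x1 after 128b/130b encoding

aggregate = 6 * 2 * TB_CHANNEL_GBPS        # six ports, two channels
x1_equivalent = aggregate / PCIE3_X1_GBPS

print(aggregate, x1_equivalent)   # prints "120.0 15.0"
```

Every port that flips into DisplayPort mode for a monitor comes straight off that budget, which is the point about never actually seeing 120Gb/s of usable I/O.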
The real question I would have is on the bus architecture. There's got to be some Thunderbolt/DisplayPort switch internally, to route the output of the GPUs to the Thunderbolt port(s). A typical AMD GPU will have at least two DisplayPort outputs, but of course that's not crossing a main bus, that's driven directly out. That lone HDMI 1.4 port is only for one monitor, and HDMI 1.4 can only drive quadHD at 30p or true 4K at 24p... not what you'd want for a workstation-class system. The rationale for Thunderbolt 2.0 was to allow faster graphics, since Thunderbolt 1.0 can't support 4K in Thunderbolt mode, only DisplayPort mode (DisplayPort 1.2 can hit 17.2Gb/s, ever-so-slightly less than Thunderbolt 2.0). So you're going to be using monitors on at least two of these ports. And again, that's going to want to switch directly from the GPU modules, not cross with anything else.
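A quick sanity check on those 4K figures (this ignores blanking intervals, so real link requirements run somewhat higher):

```python
# Payload bandwidth for quad-HD at 60 Hz, 24-bit color, as discussed
# above. Blanking overhead is ignored, so this understates what the
# link actually has to carry.
gbps_quadhd_60 = 3840 * 2160 * 60 * 24 / 1e9
gbps_quadhd_30 = 3840 * 2160 * 30 * 24 / 1e9

print(f"{gbps_quadhd_60:.1f} Gb/s at 60p, {gbps_quadhd_30:.1f} Gb/s at 30p")
# prints "11.9 Gb/s at 60p, 6.0 Gb/s at 30p"
```

So 60p quad-HD fits comfortably inside DisplayPort 1.2's 17.2Gb/s but blows well past what HDMI 1.4 or a single 10Gb/s Thunderbolt channel can carry, which is exactly why the 30p/24p caps fall where they do.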
So I bet they have a big old cross point switch connecting the outputs of the GPUs, the Thunderbolt ports, and a link to the main system (via PCI Express, most likely), to keep this thing going.
Different, arcane meaning of the word. Not relevant for decades. Why do all you dweebs keep bringing it up?
I programmed my Dad's HP9100A when I was 11.....