Don't worry, things will still be nice and confusing: it is valid to use a "Type C" connector in conjunction with a USB 2 chipset (at least on the peripheral end, and probably in practice on the computer end). Further, if the "Type C" connector is actually USB 3, there is the matter of "Alternate Mode".
"Alternate Mode" allows the Type C jack and cable to act as a conduit for an entirely different protocol (DisplayPort and MHL have previously been announced, and Intel's announcement presumably means that Thunderbolt is along for the ride); but only if the system has the hardware necessary to implement that other protocol, and that hardware is suitably connected to the Type C jack in question. It doesn't actually give a USB 3.1 (Gen 1 or Gen 2, yes, there's that difference as well) device the ability to natively handle the other protocol in the USB silicon; it merely carries it politely from one end to the other, if the upstream device can generate it and the downstream device can accept it.
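For the curious: the gatekeeping described above happens over USB Power Delivery "structured VDMs", where the host discovers which alternate modes the attached device advertises and only then asks to enter one. Here's a rough sketch of that both-ends-must-agree logic; the `PortPartner` class and method names are illustrative stand-ins, not a real driver API (the DisplayPort SVID value is VESA's, but treat it as an assumption):

```python
# Sketch of the discovery/entry dance that gates Alternate Mode.
# The port objects here are toy stand-ins, not a real USB-PD stack.

DISPLAYPORT_SVID = 0xFF01  # DisplayPort alt mode identifier (assumed here)

class PortPartner:
    """A downstream device advertising which alternate modes it implements."""
    def __init__(self, svids):
        self.svids = set(svids)

    def discover_svids(self):
        # Answers the host's "Discover SVIDs" query
        return self.svids

    def enter_mode(self, svid):
        # Answers "Enter Mode": ACK only if the mode is actually implemented
        return svid in self.svids

def try_enter_alt_mode(host_svids, partner):
    """Enter an alt mode only if BOTH ends implement it; else fall back to USB."""
    common = host_svids & partner.discover_svids()
    for svid in common:
        if partner.enter_mode(svid):
            return svid
    return None  # no common mode: the port politely stays plain USB

# A host with DisplayPort silicon wired to its Type C port, talking to a
# DisplayPort-capable monitor: mode entry succeeds.
monitor = PortPartner({DISPLAYPORT_SVID})
print(try_enter_alt_mode({DISPLAYPORT_SVID}, monitor))

# The same monitor on a host whose port has no DP wiring: plain USB only.
print(try_enter_alt_mode(set(), monitor))
```

The point of the sketch is the failure path: a perfectly good DisplayPort monitor on a perfectly good Type C port still gets nothing but USB if the host side never wired DP to that jack.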
So, when you combine this with the inevitable variations in how much power is available (the spec allows for up to 100 watts; but given that very few laptops, much less littler widgets, even have a hundred-watt brick for their own needs, it is clearly the case that most Type C ports will be good for substantially less), a Type C port can do almost anything, but is required to do effectively nothing beyond acting as a USB 2 slave device and not starting any fires when plugged in. It might have full USB 3 silicon, it might not. It might support 10Gb/s traffic, it might only handle half that; it might deliver 100 watts of power on request, it might be incapable of doing much besides browning out without a powered hub to protect it. It might have implemented one or more 'Alternate Mode' protocols; it might support none.
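If you want the "can do almost anything, guarantees almost nothing" spread as a capability sheet, here's a sketch. The only numbers taken as given are the ones above (100 W ceiling, 10 vs 5 Gb/s for Gen 2 vs Gen 1); the class, its defaults (USB 2's 480 Mb/s and nominal 2.5 W), and the mode names are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class TypeCPort:
    """Illustrative capability sheet for a Type C port.

    Only the defaults are guaranteed by the connector itself: USB 2 slave
    operation and not catching fire. Everything else is strictly optional.
    """
    usb3_silicon: bool = False        # may sit in front of a USB 2-only chipset
    gen2: bool = False                # 10 Gb/s (Gen 2) vs 5 Gb/s (Gen 1)
    max_power_watts: float = 2.5      # assumed USB 2 baseline; spec ceiling is 100 W
    alt_modes: set = field(default_factory=set)  # e.g. {"DisplayPort"}

    def data_rate_gbps(self):
        if not self.usb3_silicon:
            return 0.48               # USB 2 high speed, 480 Mb/s
        return 10.0 if self.gen2 else 5.0

# Two ports, both wearing the identical jack, both spec-compliant:
bare_minimum = TypeCPort()
kitchen_sink = TypeCPort(usb3_silicon=True, gen2=True,
                         max_power_watts=100.0, alt_modes={"DisplayPort"})

print(bare_minimum.data_rate_gbps())  # the jack alone tells you nothing
print(kitchen_sink.data_rate_gbps())
```

Note that nothing on the connector distinguishes `bare_minimum` from `kitchen_sink`; that's the whole complaint.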
It will certainly be exciting, at least...