Nope, not quite right (except the part about higher voltages allowing thinner wires). How thick a wire must be to handle a given load depends mainly on amperage. Since this is a heat dissipation issue, where you put the wire also matters: inside a wall dissipates less heat than out in the open air, so in-wall wiring needs a thicker gauge. Voltage means nothing for ampacity. The only thing voltage matters for is insulation: the higher the voltage, the more easily electricity can jump gaps in the circuit, so you need more insulation to prevent that.

To give you an example, in my brother's car, one of his amplifiers uses 6AWG wiring and runs 60A at 12V. An overhead transmission line that uses 6AWG aluminum wire will typically carry 69kV with a maximum capacity of something like 300A. It can run higher amps because of the cooling effect of having the wire exposed to open air away from anything else, and because we build in extra safety tolerances for wires in areas where humans are likely to be present (buildings, cars, etc.). The insulation difference is massive: the car wiring's insulation is maybe 1mm of plastic, whereas the transmission line uses literally meters of air between the wires plus ceramic insulator strings that are several feet long.
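If you want to see why voltage drops out of the picture, here's a quick sketch of the heating math. The resistance figure for 6AWG copper is an approximate published value I'm assuming here, not gospel:

```python
# Why ampacity tracks current, not voltage: the heat a wire
# generates is I^2 * R, and voltage never appears in that formula.
R_6AWG_COPPER = 0.0013  # ohm per meter, approximate figure for 6AWG copper

def heat_per_meter(amps: float, ohms_per_meter: float) -> float:
    """Watts of heat each meter of wire must shed (P = I^2 * R)."""
    return amps ** 2 * ohms_per_meter

# The car amplifier: 60 A at 12 V
car_heat = heat_per_meter(60, R_6AWG_COPPER)

# The same 60 A through the same wire at 69,000 V: identical heating.
hv_heat = heat_per_meter(60, R_6AWG_COPPER)

print(car_heat, hv_heat)  # same value either way, roughly 4.7 W per meter
```

Whether that ~4.7 W/m is fine or a fire hazard depends on how well the surroundings carry the heat away, which is exactly why in-wall, in-car, and open-air ratings differ for the same gauge.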
To give another example, let's take house wiring. If I have a 20A 120V circuit, I'll need to use 12AWG wire. If I have a 20A 208V circuit, I'll need to use 12AWG wire. If I have a 20A 240V circuit, again, I'll need to use 12AWG. Now you might be tempted to say, "But there are three wires in the 208V and 240V circuits." But then I'll remind you that all the current, no matter the configuration*, flows back through the neutral.
* Yes, I know that's simplistic and 3-phase is even weirder and 208V is the potential between the hots and we don't touch the neutral with 208V, but it doesn't really affect my point, so fuck it.
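The house-wiring example boils down to this: the breaker fixes the current at 20A, so the gauge is the same, and the only thing the higher voltages buy you is more power through the same copper. A quick sketch (single-phase P = V × I, ignoring the 3-phase weirdness from the footnote):

```python
# Same 20 A breaker, three common circuit voltages.
# The wire gauge is sized to the current, so 12AWG covers all three;
# the higher-voltage circuits simply deliver more power.
AMPS = 20

for volts in (120, 208, 240):
    watts = volts * AMPS  # P = V * I, single-phase simplification
    print(f"{volts:>3} V x {AMPS} A = {watts} W -- still 12AWG")
```

That's the whole trick behind high-voltage transmission, too: push the voltage up and you can move the same power at far less current, through thinner (cheaper) conductors.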