It takes billions (not millions) of years for hydrogen atoms to fuse in the sun - that is precisely why the sun has a billions-of-years lifetime. So in building a fusion reactor, we need reaction rates many orders of magnitude higher, achieved at densities many orders of magnitude lower. One way to get there is much higher temperatures. The solar core is about 15 million degrees; TFA reports 50 million degrees for this new result, and a European reactor has held 80 million degrees for half a second. That sounds unimpressive, but reaction rates are very sensitive to temperature - proportional to about T^8 as I recall, though I didn't quickly find an online reference for this. Since 75 million degrees is five times the solar core temperature, it would give a boost of about 5^8, which is roughly 400,000.
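As a sanity check on that arithmetic, here's a throwaway Python sketch - the T^8 exponent is the same unverified from-memory figure as above, and the names are just my own:

    # Rough rate-boost estimate, assuming reaction rate ~ T^8
    # (exponent is from memory, not a verified reference).
    T_SUN_CORE = 15e6   # solar core temperature, degrees
    EXPONENT = 8        # assumed temperature sensitivity

    def rate_boost(temp_degrees):
        """Reaction-rate multiplier relative to the solar core."""
        return (temp_degrees / T_SUN_CORE) ** EXPONENT

    for t in (50e6, 75e6, 80e6):
        print(f"{t/1e6:.0f} million degrees -> boost of ~{rate_boost(t):,.0f}")

Even the 50 million degree result comes out at a boost of about 15,000 under that assumption, and 80 million at about 650,000.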
In the sun, the first reaction in the chain (proton+proton->deuterium) is the rate-limiting step. In a reactor, we can provide deuterium-enriched fuel and bypass this step entirely. I don't know the reaction rates offhand, but I suspect this is a greater benefit than the higher temperatures. You can do even better with tritium in the fuel, but then your reactor becomes an intense neutron source, leading to induced radioactivity in nearby materials. Some proposed designs use those neutrons to breed more tritium from a lithium blanket around the reactor. (Once I get beyond the proton-proton chain, I'm relying on pop-science knowledge, so corrections from the more knowledgeable are welcome.)
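For a sense of why deuterium-tritium fuel is so attractive, here's a back-of-the-envelope Q-value calculation for D + T -> He-4 + n, with atomic masses quoted from memory, so treat the last digits with suspicion:

    # Energy released by D + T -> He-4 + n, from the mass defect.
    # Masses in atomic mass units (u); values from memory.
    M_D   = 2.014102   # deuterium
    M_T   = 3.016049   # tritium
    M_HE4 = 4.002602   # helium-4
    M_N   = 1.008665   # free neutron
    U_TO_MEV = 931.494 # energy equivalent of 1 u

    mass_defect = (M_D + M_T) - (M_HE4 + M_N)
    print(f"D-T fusion releases about {mass_defect * U_TO_MEV:.1f} MeV")

That comes out around 17.6 MeV per reaction, and most of it (about 14 MeV) is carried by the neutron - which is exactly why the reactor walls see such an intense neutron flux, and why a lithium blanket can put those neutrons to work breeding tritium.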
Stars a bit more massive than the sun burn hydrogen via the CNO cycle, which has an even steeper temperature dependence (from memory, about T^17). I've never heard of anyone suggesting the CNO cycle for a fusion reactor - presumably there are good reasons, but I don't know what they are. One problem is that you have to wait for radioactive beta decays along the way, but those half-lives are on the order of minutes (roughly two for O-15 and ten for N-13, again from memory), and a commercial reactor would be running for much longer than that.
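Extending the earlier sketch, those assumed exponents make the comparison stark at five times the solar core temperature (again, both exponents are from memory):

    # Compare the assumed pp-chain-like (~T^8) and CNO-like (~T^17)
    # temperature sensitivities at 5x the solar core temperature.
    ratio = 75e6 / 15e6  # = 5
    print(f"pp-like boost:  {ratio**8:,.0f}")   # ~390,000
    print(f"CNO-like boost: {ratio**17:,.0f}")  # ~7.6e11

The caveat is that the absolute rates have very different prefactors (the CNO cycle is a minor contributor in the sun itself), so these multipliers aren't directly comparable across the two mechanisms - but it does show why the steep temperature dependence matters so much in slightly hotter stars.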