I was told that a similar question was asked of someone in the UK fusion programme by its director, about 2 or 3 years ago. The guy went away and did some sums to answer the question:
Given the money to build it now, how well could a fusion power plant be constructed, and what would the electricity it produced cost?
His answer went something like this:
To build a power plant now, you would have to work around the current problems (such as ELMs) by designing the machine to run in L-mode (low-confinement mode) rather than H-mode (ELMs only occur in the increased-efficiency H-mode). That means building it much larger than it should need to be, so that it can be run at low power in an "easy", low-risk* way: in other words, running a large machine incredibly conservatively, which is not very efficient.
The cost per kilowatt hour over the lifetime of such a machine would have been about 50p (~80 cents). So an improvement of roughly 10x would make it viable, and given how conservative the machine in that calculation was, that's not far away (running in H-mode, for example, would make a large difference).
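For what it's worth, the "~10x" gap is easy to sanity-check. The 50p/kWh figure is from the anecdote; the ~5p/kWh wholesale benchmark is my own assumption for illustration, not from the original estimate:

```python
# Back-of-envelope check of the ~10x gap quoted above.
# 50p/kWh comes from the anecdote; the ~5p/kWh wholesale
# benchmark is an assumed round number for illustration.
conservative_fusion_cost = 0.50  # GBP per kWh, from the estimate above
wholesale_benchmark = 0.05       # GBP per kWh, assumed typical grid price

gap = conservative_fusion_cost / wholesale_benchmark
print(f"Cost gap: ~{gap:.0f}x")  # ~10x
```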
On the other hand, to put ICF's recent claim that they will beat ITER into perspective: I believe that at NIF they are currently firing the lasers maybe once per day, replacing the inner optics between shots, and carefully placing and targeting the ~millimetre target each time. I think the point at which they reach viable fusion is around 100 shots per second. Last I knew, they were awaiting the invention of a solid-state laser capable of meeting their requirements, which was by no means on the horizon. To reach ~100 shots per second, each target would need to be fired into the chamber at ~100 m/s, and still be hit simultaneously by all 192 beams on a millimetre scale.
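The ~100 m/s figure falls out of simple arithmetic. Only the 100 Hz rate and the millimetre target size come from the argument above; the one-metre transit distance per shot is my assumption:

```python
# Rough numbers behind the ICF rep-rate argument above. The per-shot
# transit distance is an assumption; the 100 Hz rate and ~1 mm target
# size come from the text.
rep_rate = 100.0          # shots per second (target for viability)
period = 1.0 / rep_rate   # 10 ms between shots
transit_distance = 1.0    # metres travelled per shot interval (assumed)

injection_speed = transit_distance / period
print(f"Injection speed: ~{injection_speed:.0f} m/s")  # ~100 m/s

# At that speed a ~1 mm target crosses its own diameter very quickly,
# which sets the scale of the beam-timing problem.
target_size = 1e-3  # metres
crossing_time = target_size / injection_speed
print(f"Target crosses its own diameter in ~{crossing_time*1e6:.0f} microseconds")
```

That crossing time (~10 microseconds) is why "all 192 beams simultaneously" is such a strong requirement.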
Then again, NIF's goal is really for studying nuclear weapons.
If they've made progress recently and are closer than I think, I'll be happily humbled, but ICF's wild claims that they'll do it this year do little to dispel the outside scepticism towards any claim about how far off fusion may be.
Said claims haven't been helped by the 20-year delay in building ITER. So when people say "20 years is up, where's fusion?", the reality is that 20 years ago MCF researchers were waiting for the same machine they're waiting for now. In those 20 years much has been learnt, understood and improved... but fusion is still the same 20 years away because it's still the same machine away.
As an aside, the problem probably started in the 1950s or thereabouts, when someone calculated how big a reactor would need to be to break even according to the Lawson criterion. The answer was relatively tiny, of order 0.5-1m (I don't remember exactly), so at that stage they figured 20 years should be enough. Over the next 20 years, turbulence turned out to be more than "something we might need to account for": it is in fact the main heat transport mechanism in a hot plasma. It then became clear that larger machines were needed, and after the last 50 years of that work, ITER should be painfully close to finishing the job. (Some argued that ITER should have spent a little more to include the lithium blankets needed for extracting energy and breeding tritium. That would have been great for PR, showing it could put energy out to a grid, but it means extra cost, reduced access to the machine for improvements, and essentially no scientific benefit. That should be the purpose of DEMO, the demonstration power plant planned to follow ITER.)
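The Lawson-criterion logic mentioned above can be sketched with textbook round numbers (these are generic modern D-T figures of my choosing, not the original 1950s calculation):

```python
# Minimal sketch of the Lawson-criterion logic, using textbook round
# numbers (assumed, not from the original 1950s estimate): for D-T at
# ~10-20 keV, ignition roughly requires n * tau_E >~ 1e20 s/m^3.
lawson_ntau = 1e20  # s / m^3, approximate D-T requirement
density = 1e20      # m^-3, typical tokamak core density (assumed)

required_tau = lawson_ntau / density
print(f"Required energy confinement time: ~{required_tau:.0f} s")  # ~1 s
```

Since the energy confinement time empirically grows with machine size, meeting that ~1 s requirement is what kept pushing the designs towards bigger, and slower-to-build, machines.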
*Risk of the machine not working / ELMs / disruptions, not risk to people.