Certainly the nuclear reactor industry has done "just fine" without these detailed calculations for the last 60 years. Where "just fine" is: "We've seen stuff fail over the years, learned from it, and kept tweaking our designs and margins to take it into account". They have used simplified models to get an idea of the behavior and it has worked for them (in the sense that the reactors run safely and reliably).
However, the "margins" are the name of the game here. If you can do more detailed calculations that take into account more physics and geometry, you can reduce the margins and provide a platform for creating the next reactor that is both more economical and safer. If you can increase the operating efficiency of a nuclear reactor by even 1%, that is millions of dollars. If you can keep something like Fukushima from happening, that is even more money (some would say "priceless").
The approximate answers (using simplified models) are good - they are in the ballpark. But if you compare their output to experimental output (which we have a LOT of... and it is VERY detailed) the simplified models get the trends right... but miss a lot of the outlier data. That outlier data is important... that's where failure happens. With these detailed models we get _much_ closer to the experimental data.
To get even closer to the experimental data we have to get even more detailed. The movie showed some of our early work in multi-scale simulation: coupled microstructure simulation running along with the engineering-scale simulation. That work is necessary to capture the material response correctly, which in turn gets us even closer to the experimental data.
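To make the multi-scale idea concrete, here is a minimal toy sketch of that kind of loose, operator-split coupling: an engineering-scale 1-D heat conduction solve whose material property (thermal conductivity) is fed back each time step by a tiny "microstructure" model. All names, kinetics, and numbers are hypothetical illustrations of the structure of the coupling, not the actual scheme or physics used in our codes.

```python
def micro_update(porosity, temperature, dt):
    """Toy microstructure model: porosity grows faster at higher temperature.
    The kinetics here are entirely made up for illustration."""
    growth_rate = 1e-6 * temperature
    return porosity + growth_rate * dt

def conductivity(porosity):
    """Effective conductivity degrades as porosity accumulates (hypothetical)."""
    k0 = 5.0  # fresh-material conductivity, W/m-K (illustrative value)
    return k0 * (1.0 - porosity)

def run_coupled(n_cells=20, n_steps=100, dt=2e-4):
    dx = 1.0 / n_cells
    T = [600.0] * n_cells    # engineering-scale temperature field (K)
    poro = [0.0] * n_cells   # one microstructure state variable per cell
    source = 50.0            # volumetric heating term (illustrative)
    for _ in range(n_steps):
        # 1) Micro step: advance each cell's microstructure using the
        #    current engineering-scale temperature.
        poro = [micro_update(p, t, dt) for p, t in zip(poro, T)]
        # 2) Property transfer: microstructure state -> material property.
        k = [conductivity(p) for p in poro]
        # 3) Macro step: explicit diffusion with the updated conductivities
        #    (boundary cells held fixed at 600 K).
        Tn = T[:]
        for i in range(1, n_cells - 1):
            Tn[i] = T[i] + dt * (k[i] * (T[i+1] - 2*T[i] + T[i-1]) / dx**2
                                 + source)
        T = Tn
    return T, poro
```

Each macro step sees material properties that reflect the evolving microstructure, which is the essential feedback loop; a production code would replace both toy models with real physics and use implicit, tightly coupled solves rather than this simple explicit splitting.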
Ultimately, if we can do numerical experiments that we have a great amount of faith in, it will allow us to better retrofit existing reactors to make them more economical and safer, and to design the next set of reactors.