On its surface, a lot of the results they're getting wouldn't be outside the realm of student-level work, such as the simple practice of projecting and baking photographic textures into materials. The innovation seems to be that they've automated a lot of that into a UI with a fast lighting solution. One of the things I find most rewarding about 3d is that you sometimes get these huge bursts of increased productivity, as long as you're not too bummed out about things you've spent time and energy learning becoming obsolete. Fundamentally, this isn't that different from setting your viewport background in Maya, 3ds Max, etc. to a photograph after properly matting your foreground objects and projecting textures with adjusted reflectivity, just without all the manual tedium. There's been other, similar work on the subject that I've heard of, but this still looks pretty neat if it's something you can use right now without a billion-dollar computer.
One of the big things this tech might do is streamline the process of match lighting. I personally can't wait till the major software packages have integrated solutions for easy lighting from photo sources. Currently the setup for photo matting is a pain: it requires stitching together panoramic photos of reflective chrome spheres on location, or carefully using your observational skills to recreate the lighting by hand (which can be very difficult for glossy surfaces). It would appear, however, that we're on the brink of not needing those things anymore. That being said, this software still has a bit to go.
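For anyone curious what the chrome-sphere workflow actually involves under the hood, here's a rough numpy sketch of the core step: unwrapping a cropped mirror-ball photo into the latitude-longitude environment map that renderers want for image-based lighting. The function name and the camera convention (camera on +z looking at the ball) are my own assumptions, not from any particular package, and a real pipeline would shoot the sphere from two angles to fill in the blind spot behind it.

```python
import numpy as np

def mirrorball_to_latlong(ball, out_h=256):
    """Unwrap a square mirror-ball (chrome sphere) photo into a
    latitude-longitude (equirectangular) environment map.

    ball  : (N, N, 3) float array, the cropped sphere photo
    out_h : height of the output map; width is 2 * out_h
    (Hypothetical helper; conventions vary between packages.)
    """
    out_w = 2 * out_h
    # World direction for every output pixel (lat-long parameterization),
    # with half-pixel offsets to dodge the singular straight-back direction.
    theta = (np.arange(out_h) + 0.5) / out_h * np.pi             # polar angle
    phi = (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi   # azimuth
    phi, theta = np.meshgrid(phi, theta)
    d = np.stack([np.sin(theta) * np.sin(phi),
                  np.cos(theta),
                  -np.sin(theta) * np.cos(phi)], axis=-1)
    # Camera looks down -z; the sphere normal that reflects direction d
    # toward the camera is the half vector between d and +z.
    view = np.array([0.0, 0.0, 1.0])
    n = d + view
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    # Orthographic sphere image: (nx, ny) in [-1, 1] -> pixel coords.
    size = ball.shape[0]
    px = ((n[..., 0] + 1) / 2 * (size - 1)).round().astype(int)
    py = ((1 - (n[..., 1] + 1) / 2) * (size - 1)).round().astype(int)
    return ball[py, px]
```

The tedious part isn't this math, it's the on-location shooting and stitching; that's exactly the setup step this kind of software promises to skip.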
For example, the lighting information baked into the diffuse textures of the objects in these examples does not appear to be dynamic: if you watch the taxi-spinning segment, you'll notice that the specular highlights on the hood of the car do not update as the model's orientation changes relative to the light sources, making the taxi appear to have white paint streaks once it rotates out of alignment with the light. The car-falling-off-the-cliff example is probably where this is most apparent in the final result, as the strong baked lighting makes the coloring look off. The way we 3d artists get around this problem is to eliminate the lighting information in our diffuse textures as much as possible, reapply them as flat color, and then let our lighting rigs take care of the reflections, shadows, and so on.

As they mention, this software doesn't support transparency, and my guess is that it renders everything as matte objects, meaning the renderer probably isn't robust enough to handle anything approaching complicated reflections/refractions. That makes its usefulness very situational for now; it would be a great way to quickly populate photos with hordes of smaller objects, for example. With a more powerful renderer, feature-wise, though, this tech could be really useful for the Photoshop crowd. I wish Autodesk/Mental Ray would focus on stuff like this instead of the boring crap updates we usually get (Maya's new fluids are pretty cool though, tbh...).
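To make the "eliminate the lighting information from the diffuse texture" step concrete, here's a deliberately naive de-lighting sketch: estimate the low-frequency shading as a heavy blur of the luminance and divide it out, which knocks down baked-in light gradients while keeping albedo detail. The function names and radius are made up for illustration; serious intrinsic-image decomposition is far more involved than this.

```python
import numpy as np

def _box_blur(img, r):
    """Separable box blur of radius r via cumulative sums, edge-padded."""
    for axis in (0, 1):
        pad = [(0, 0)] * img.ndim
        pad[axis] = (r, r)
        p = np.pad(img, pad, mode="edge")
        c = np.cumsum(p, axis=axis)
        zeros = np.zeros_like(np.take(c, [0], axis=axis))
        c = np.concatenate([zeros, c], axis=axis)      # c[k] = sum p[0..k-1]
        n, win = img.shape[axis], 2 * r + 1
        img = (np.take(c, np.arange(win, win + n), axis=axis)
               - np.take(c, np.arange(n), axis=axis)) / win
    return img

def flatten_shading(tex, radius=16):
    """Crude de-lighting (hypothetical helper): divide a texture by a
    low-frequency shading estimate so it keeps albedo detail but loses
    baked-in lighting gradients; rescale to preserve mean brightness."""
    lum = tex.mean(axis=-1, keepdims=True)              # rough luminance
    shading = np.maximum(_box_blur(lum, radius), 1e-4)  # avoid divide-by-zero
    flat = tex / shading * lum.mean()
    return np.clip(flat, 0.0, 1.0)
```

Run on the taxi's hood texture, something like this would smear out those baked specular streaks into flat paint color, leaving the lighting rig to put the highlights back where they actually belong.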