This is one of those innovations that lends itself to the imagination. 360-degree vision, while it would take a while to get used to, becomes possible; new types of vision (I hate to say it, but think "Predator") are in the foreseeable future. Other brain interfaces, such as bionic controllers or inter-brain communication, become more concrete than fantasy.
However, a few questions come to mind concerning the brain's ability to interpret this data. We are feeding the optic nerve a digital image transformed into analog neural impulses. The brain has complicated filtering systems that let us recognize movement, depth, shading, object isolation and recognition, and many other things we as humans commonly take for granted. How well does the brain process this information compared to what it normally receives from the eye? Can it adapt to gain the same (or maybe a more advanced) filtering system if it is hindered at first? I think these are important questions to ask, but I'm sure they will be answered with time and research. In any case, it is very exciting to me that our ability to create hardware and software that emulate human biological functions is increasing.
Does anyone know if this article is related to the recent optic nerve experiment where RAM was used as an optical interface, with nodes representing nerves that clustered as they were excited?