Quoth the article:
scientists stimulated one nerve cell to communicate with a second cell which transmitted that signal to multiple cells within the network.
Signal up (probably down too, though that is not said). That's a start. Now let me jump.
Imagine how this would feel in your own brain. Even strengthened to a noticeable level by a lump of neurons, the signal would still read "beep". Now imagine being fed information through that channel. "Beep, bip beep bip bip beep". Better start training that Morse.
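A single on/off channel really does reduce to something like Morse. As a toy illustration of what "training that Morse" would involve, here is a sketch using the standard International Morse table (abbreviated to a few letters for illustration):

```python
# Abbreviated International Morse table -- just enough letters for the demo.
MORSE = {'b': '-...', 'e': '.', 'p': '.--.', 'i': '..'}

def to_beeps(word):
    """Encode a word as dots and dashes, one space-separated group per letter."""
    return ' '.join(MORSE[c] for c in word.lower())

print(to_beeps("beep"))  # -... . . .--.
```

Every message, however rich, has to be serialized into that one stream of blips, which is exactly why the channel feels so narrow.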
Now let's enhance the input by adding more bits and running the data through a digital-to-analog converter. This is where you would slowly be able to "see colors", one at a time. Low signal, cold feeling; high signal, hot feeling. That is brainable information. You can associate different patterns of these "colors" with different ideas.
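The digital-to-analog step here is just scaling an n-bit word onto one intensity axis. A minimal sketch, assuming a hypothetical mapping from bit patterns to a normalized cold-to-hot scale:

```python
def dac(bits):
    """Map a tuple of bits (MSB first) to a normalized analog level in [0, 1].

    Purely illustrative: 0.0 would be the coldest feeling, 1.0 the hottest.
    """
    value = 0
    for b in bits:
        value = (value << 1) | b          # accumulate the binary word
    max_value = (1 << len(bits)) - 1      # largest value for this word width
    return value / max_value

# A 3-bit word already gives eight distinguishable "colors" of sensation:
print(dac((0, 0, 0)))  # 0.0 -> coldest
print(dac((1, 1, 1)))  # 1.0 -> hottest
```

With n bits you get 2^n distinguishable levels, so each extra bit doubles the palette of sensations available on that one channel.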
But still it's not like you could see any shapes, is it?
Now add more bytes and feed them in side by side. That's a feed. At this point, feel nausea: something is feeding noise into your thoughts, something you cannot possibly comprehend.
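The side-by-side feed could be pictured as several such channels sampled in lockstep: at every tick you receive one raw value per channel, a crude frame with no shape in it yet. A hypothetical sketch:

```python
# Hypothetical: N parallel channels, each delivering one byte per tick.
# Side by side they form a frame -- raw numbers, nothing shaped yet.
def frame(channel_bytes):
    """Normalize one tick of N parallel byte channels to levels in [0, 1]."""
    return [b / 255 for b in channel_bytes]

tick = frame([0, 64, 128, 255])
print(tick)  # four simultaneous intensities, from coldest to hottest
```

Nothing in this frame is an image; it only becomes one if something downstream learns to correlate neighboring channels, which is the point of the next paragraph.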
It would take a processing system not unlike vision inside the brain to translate that feed into experiences like colors, tastes, and touches, and then further associate these to make shapes out of the noise.
A long way.
Worth taking, of course, as research goes, but I wouldn't toss away those external displays just yet. Have a hunch computers won't be the same, either, when we get there.
Future research will focus on interfacing silicon chips with the human brain to control artificial limbs and develop "thinking" computers.