PERHAPS I'M READING INTO THIS ... but the first thing that came to mind while reading this article was how open-sourcing AI would bring transparency and accountability to the developer if such an AI did something catastrophic. The code could be examined for ill intent or even shady routines, thereby exposing the developer or the company responsible. Closed source seems to undermine that transparency and accountability, or at least slow down any investigation by forcing reverse engineering. I do understand, however, that protecting one's code from being copied and open-sourced should also remain a right of the owner. Maybe I'm totally misreading this article, but I couldn't let the thought go to waste.