The article summary isn't very good. If the software is programmed in a way that makes the car behave dangerously, it IS the software's fault.
No, it is the programmer's fault. Software is an amoral set of machine instructions written by a human. Saying it is the software's fault is like blaming a press for cutting off someone's hand: the actual fault is either user error or faulty machine design. The machine is just doing what it was told, so blaming the software is misplaced. The fact that autonomous driving has a lot of difficult and dangerous corner cases is irrelevant.
Software is just a set of instructions given by a human, so if the instructions are wrong, the fault lies with the person who gave them to the machine. If the car behaves dangerously, it is the fault of the person/company/entity that wrote the software controlling that behavior. The programmer is just as much at fault as a driver who made a mistake. The programmer probably isn't sitting in the car at the time, but just like the driver, he/she was the one in charge of driving the car. It's really easy to forget that a human is instructing the machine; the instructions are simply time-shifted.
I think in time autonomous cars could prove to be safer than many/most human-driven ones. There are a LOT of really bad drivers on the road, the tests to qualify for a license are pretty much a joke, and there are almost no requirements to re-qualify, which is kind of nuts. But the liability for bad or inadequate software should fall squarely on the party that wrote it.