sure, but this is a fucking gimmick "experiment".
the algo could be really simple too.
and for developing said algorithm, no actual robots are necessary at all - except for showing to journos. no actual AI researcher would find that part necessary; the testing can happen entirely in simulation. and no actual ethics even enter the picture: the robot doesn't need to understand what a human is on the level a robot would need to in order to act by Asimov's laws.
a spinning-blade cutting tool with an automatic emergency brake isn't sentient. it's not acting on Asimov's laws either, but you could claim so to some journalists anyway. the thing to take home is that they built the ability to fret over the situation into the algorithm itself. if it just projected outcomes and saved what can be saved, it wouldn't fret or hesitate - and "hesitate" is really the wrong word for it.
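to make the point concrete, here's a minimal toy sketch of a "project and save what can be saved" policy - everything here (the world model, the names, the lookahead horizon) is hypothetical, just to show that the decision itself is a one-pass maximization with no ethics module and no built-in dithering:

```python
# toy world: each "human" proxy is (distance_to_danger, speed);
# the robot can commit to intercepting exactly one of them.
HORIZON = 3  # assumed lookahead steps (made-up constant)

def survives(dist, speed, intercepted):
    # a proxy survives if intercepted, or if it never reaches
    # the danger zone within the projection horizon
    return intercepted or dist - speed * HORIZON > 0

def saved(humans, choice):
    # project the outcome of committing to one rescue target
    return sum(survives(d, s, name == choice)
               for name, (d, s) in humans.items())

def decide(humans):
    # "save what can be saved": pick the commitment with the most
    # projected survivors; one pass, no re-deliberation, no fretting
    return max(humans, key=lambda c: saved(humans, c))

# "a" is doomed without help, "b" makes it on its own,
# so the policy commits to "a" immediately
print(decide({"a": (3, 1), "b": (5, 1)}))  # → a
```

any "hesitation" you'd observe in a robot running something like this would come from adding re-deliberation on top - say, re-running `decide` every tick as the projections shift - not from the decision rule itself.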