"The thing with this crash is it's one of the simpler situations that a self driving car should be able to deal with, yet it failed so spectacularly"
A large part of this comes down to cultural norms. I've written about this in detail elsewhere.
The short version is: 'A century of lobbying has made the USA an extremely pedestrian-hostile country, with road rules that essentially prohibit pedestrians from being on the road in most states and dictate that "the pedestrian shall give way to the vehicle".
That gives rise to an assumption that pedestrians won't be on the road except where authorised, which in turn leads to the programming assumption that they don't need to be scanned for or taken into account.'
This is how one creates a fleet of robotic killing machines. No malice needed, just no notice taken of obstacles. The obstacle happened to be a pedestrian, but it could just as easily have been a cow - in which case we'd be reading about an Uber driver and passenger being killed.
The US authorities have compounded the problem by blaming the victim for crossing the road - something that is perfectly legal in most countries, and which she did perfectly safely. The instruments will have picked her up quite well, but the ROBOT failed to react. This is a programming fuck-up, and the best thing that could happen right now is for insurance companies to decline to cover any self-driving vehicle developer unless it can demonstrate the ability to cope with these most basic scenarios BEFORE its vehicles are let loose on public roads.
(I'm thinking that the obstacle course from "Britain's worst drivers" would be a good _starting_ point for them to pass.)
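To make the failure mode concrete, here is a minimal sketch of the kind of logic error being described - the sensors report a strong detection, but the planning rule only reacts to pedestrians where they are "expected". This is purely hypothetical illustration: the class names, thresholds, and the `should_brake` rule are invented, not anything from Uber's actual software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # what the perception stack thinks it saw
    confidence: float  # 0.0 - 1.0
    on_crosswalk: bool
    distance_m: float

def should_brake(det: Detection) -> bool:
    """Hypothetical planner rule: only brake for 'authorised' pedestrians.

    This bakes in the cultural assumption described above: pedestrians
    are only expected at crossings, so anything else is treated as noise.
    """
    if det.label == "pedestrian" and det.on_crosswalk:
        return det.distance_m < 40
    # Off-crosswalk detections never trigger a braking decision.
    return False

# The instruments "pick her up quite well" - a confident, close detection -
# but because she is mid-block, the rule above does nothing.
jaywalker = Detection(label="pedestrian", confidence=0.93,
                      on_crosswalk=False, distance_m=25.0)
assert should_brake(jaywalker) is False  # the robot fails to react
```

The point of the sketch is that no sensor has to fail for this outcome: one bad assumption in the decision layer is enough.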