Old Mar 21st, 2019, 18:10   #6
ergonomist
Member
 

Last Online: May 10th, 2024 16:12
Join Date: Apr 2016
Location: Reading

Quote:
Originally Posted by Wizzpop View Post
Understand the philosophy, but I am a little dubious. The accidents with the Boeing 737 M8 were apparently due to the aircraft sensors and software countermanding the human pilots to the extent they couldn't regain control.
At the risk both of going wildly off-topic and of speculating solely on the basis of what's in the media, I feel that the issue with the 737 Max lay in the way the automation was implemented rather than in automation itself. A system classified as 'hazardous' should never have been designed to follow a single sensor input (there were two sensors available on the aircraft that could have been compared). Boeing also appears to have decided that no operator intervention/override would ever be required if the system failed, which is presumably why any mention of MCAS was excluded from transition training and the manual.
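
To illustrate the cross-check point, here's a toy sketch in Python (entirely my own, with made-up names and thresholds, and in no way a representation of the actual MCAS code): a 'hazardous' function should at minimum compare its redundant sensors and fail passive, handing control back to the crew when they disagree.

# Illustrative sketch only: not the real MCAS logic, just the general
# pattern of cross-checking two redundant angle-of-attack (AoA) sensors.

AOA_DISAGREE_LIMIT_DEG = 5.0   # assumed disagreement threshold for this sketch

def trim_command(aoa_left_deg, aoa_right_deg, stall_aoa_deg=14.0):
    """Return a nose-down trim command, or None to leave the pilot in control."""
    # Cross-check the two sensors; on disagreement, do nothing and annunciate.
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        return None               # fail passive: flag 'AOA DISAGREE', stand down

    aoa = (aoa_left_deg + aoa_right_deg) / 2.0
    if aoa > stall_aoa_deg:
        return "NOSE_DOWN_TRIM"   # limited-authority intervention
    return None

The point is simply that the failure mode (sensor disagreement) is anticipated, and the system degrades to 'do nothing and tell the pilot' rather than acting on bad data.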

It's this last point that should be the focus of any successful introduction of automation, whether in aircraft or in cars: what are the potential failure modes of the automated system? What level of operator (driver) intervention might be required? Is it reasonable to expect a driver to provide that intervention?

In another thread there has been discussion of driver monitoring of Level 2/Level 3 autonomous systems. The human factors issues around monitoring automation are significant: people are notoriously poor at staying vigilant while supervising a system that almost always works. I'd be happy with the Volvo drugs and alcohol nanny provided I could be assured of the rigour of the safety assessments that underpin it.
__________________
XC90 D5 Inscription MY16 - white + blond/charcoal, and got carried away with packs and options...