
Thread: More interesting interactions between automation and humans for safety of transport

  1. #1
    Senior Member 3WE's Avatar
    Join Date
    Jan 2008
    Posts
    4,365

    Default More interesting interactions between automation and humans for safety of transport

    Pseudo Final Report:

    https://www.yahoo.com/finance/news/f...132704038.html

    Let's see if I have this right:

    You have an automatic emergency braking system that operates when the human is driving.

    But, when the computer is driving, the automatic emergency braking system is disabled, depending on the driver to do the braking?

    And- even though the emergency braking system is monitoring and processing conflicts, it does not notify the driver.
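
    In other words, the logic described in the report boils down to something like this (a minimal illustrative sketch of mine, not actual Volvo or Uber code):

    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        computer_driving: bool      # True when the self-driving system is engaged
        collision_predicted: bool   # perception sees an imminent conflict

    def emergency_brake_response(state: VehicleState) -> str:
        """Action taken for an imminent collision, per the behavior described above."""
        if not state.collision_predicted:
            return "no action needed"
        if not state.computer_driving:
            return "factory AEB applies the brakes"   # human-driven mode
        # Self-driving mode: AEB is suppressed and no alert is raised,
        # leaving the braking entirely to the human monitor.
        return "rely on safety driver (no brake, no alert)"

    print(emergency_brake_response(VehicleState(computer_driving=True, collision_predicted=True)))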

    Brill-Yunt! And cue the classic 1950s pilot disdain for scientific engineers.

    And, we see that fairly well trained pilots like Hui Thieu Lo and Ho Leigh Phouc confuse how all the different systems interact in this mode and that mode, and we expect taxi drivers to be able to handle it too.

    And, it should also be noted (for the zillionth time) that Hui Thieu Lo, Ho Leigh Phouc and Unnamed Uber Driver also failed at the basic requirements to monitor some basic crap like Airspeed and "The Road Ahead".
    Basic rules of aviation discourage long periods of pulling up hard.

  2. #2
    Member
    Join Date
    Jan 2008
    Location
    MA, USA
    Posts
    750

    Default

    Quote Originally Posted by 3WE View Post
    And, we see that fairly well trained pilots like Hui Thieu Lo and Ho Leigh Phouc confuse how all the different systems interact in this mode and that mode, and we expect taxi drivers to be able to handle it too.
    Not all of us do...
    Be alert! America needs more lerts.

    Eric Law

  3. #3
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    Quote Originally Posted by 3WE View Post
    And, we see that fairly well trained pilots like Hui Thieu Lo and Ho Leigh Phouc confuse how all the different systems interact in this mode and that mode.
    'fairly well-trained' pilots do not confuse how all the different systems interact in this mode and that mode. These guys were part of a culture that tends to skip over things.

  4. #4
    Senior Member 3WE's Avatar
    Join Date
    Jan 2008
    Posts
    4,365

    Default

    Quote Originally Posted by elaw View Post
    Not all of us do...
    Good one...nevertheless, you cannot ever 100% disconnect yourself from the proverbial Royal "we" (unless you move to a mountainous area and begin living off the grid).
    Basic rules of aviation discourage long periods of pulling up hard.

  5. #5
    Senior Member 3WE's Avatar
    Join Date
    Jan 2008
    Posts
    4,365

    Default

    Quote Originally Posted by Evan View Post
    'fairly well-trained' pilots generally do not confuse how all the different systems interact in this mode and that mode. These guys were part of a culture that tends to skip over things.
    Not just fixed...

    BUT

    fixed significantly.

    Taken that bicycle ride yet?
    Basic rules of aviation discourage long periods of pulling up hard.

  6. #6
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    3WE, the thing you still can't seem to grasp is that the autopilot is not that inflatable guy named Otto in Airplane! It's a workload-reducing tool that requires pilots to aviate differently: assigning modes, entering data and closely monitoring everything. Taking the pilots out of the loop entirely belongs to a fairly distant future, one where absolute confidence can be placed in very secure technology.

    Now, the problem is, most drivers are thinking like you do. "Oh, boy, a 'self-driving' car! HAL's in charge. Think I'll climb in the back and take a little nap".

    So the essence of the problem is this misunderstanding: why are we allowing anyone to call these 'self-driving' cars? What we are really talking about here is an INTERFACE between human drivers and autopilot technology, and, as in aviation, the danger lurks in the blended aspect of this automation. The AP is there to drive the car and the human is there to operate and monitor the AP. It reduces driver workload; it doesn't excuse them from driving altogether.

    Many of the AP-related crashes have resulted from pilots assuming the automation is relieving them of their piloting duties. And pilots are required to go through rather intense training compared to drivers. Therefore we can expect this problem to be epidemic if and when 'self-driving' cars are introduced on a widespread basis.

    And it all comes down to understanding what the autopilot is and what it isn't (and what it does and what it doesn't do).

    Meanwhile, car manufacturers are exhibiting their usual impatience in getting new features to market. Unless regulations prevent them from doing so, they will 'market-test' unproven tech while advertising it as 'self-driving' convenience.

    Fix #1: Prohibit car companies from using the term 'self-driving' until it really is safe to call them that.

  7. #7
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    Quote Originally Posted by 3WE View Post
    Not just fixed...

    BUT

    fixed significantly.

    Taken that bicycle ride yet?
    No. Not fixed at all. If a line pilot doesn't have a complete understanding of ALL modes and ALL their interactions, that pilot is not even remotely 'well-trained'. That's like saying a pilot with a good understanding of the yoke and the thrust levers but a shady understanding of the rudder is 'well-trained', and as we have seen, that just isn't good enough.

  8. #8
    Member
    Join Date
    Jan 2008
    Location
    MA, USA
    Posts
    750

    Default

    Quote Originally Posted by Evan View Post
    'fairly well-trained' pilots do not confuse how all the different systems interact in this mode and that mode.
    Ever?

    Ever ever?

    You do know that all absolute statements are wrong, right?
    Be alert! America needs more lerts.

    Eric Law

  9. #9
    Senior Member TeeVee's Avatar
    Join Date
    Mar 2009
    Location
    MIA
    Posts
    1,860

    Default

    so evan, tell me something: what is the pilot supposed to do when full autoland is required? monitor the plane doing something based on what it sees and feels in a situation where the industry DOES NOT TRUST THE PILOT?

  10. #10
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    Quote Originally Posted by elaw View Post
    Ever?

    Ever ever?

    You do know that all absolute statements are wrong, right?
    I guess I just have a different definition of 'well-trained'. If a pilot is using an open mode like FLCH, which is speed-on-elevator, and has not entered a safe transition altitude (0 being 'not safe') and expects autothrust to kick in magically upon intercepting the glidepath, that pilot is not at all well-trained. That pilot has no business being in the cockpit. Certainly human factors can lead to confusion in certain stressful conditions, but proper training should instill certain core principles regarding automation that would be hard to forget even under such stress. Not impossible, but very very very very unlikely.
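
    To make the FLCH trap concrete, here's a toy model (my own gross simplification, not Boeing's actual logic) of why the autothrust ends up in HOLD and never 'wakes up' to protect speed:

    def autothrottle_mode(pitch_mode: str, levers_at_idle: bool) -> str:
        """Very rough model of the speed-on-elevator trap described above."""
        if pitch_mode == "FLCH":                # open descent: elevator holds speed
            return "HOLD" if levers_at_idle else "THR"
        return "SPD"                            # e.g. on the glideslope: autothrottle holds speed

    # In HOLD nobody is managing thrust: the pilot must push the levers up or
    # change modes. Expecting the autothrottle to intervene is the misunderstanding.
    print(autothrottle_mode("FLCH", levers_at_idle=True))   # -> HOLD
    print(autothrottle_mode("G/S", levers_at_idle=True))    # -> SPD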

  11. #11
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    Quote Originally Posted by TeeVee View Post
    so evan, tell me something: what is the pilot supposed to do when full autoland is required? monitor the plane doing something based on what it sees and feels in a situation where the industry DOES NOT TRUST THE PILOT?
    Monitor the instruments. Monitor the navigation. Monitor the autopilot itself (via the PFD). Be prepared to take over if anything feels wrong.

    The industry only allows autoland if the pilot is rated (trusted) to perform it.

  12. #12
    Member
    Join Date
    Jan 2008
    Location
    MA, USA
    Posts
    750

    Default

    Quote Originally Posted by Evan View Post
    I guess I just have a different definition of 'well-trained'.
    I suspect your definition of "well trained" and mine aren't all that different. The difference lies in the expected outcome.

    My feeling is that untrained individuals will frequently perform poorly at any given task. As training is added, and more training is added, the frequency of performing poorly drops, and approaches but never actually reaches zero. There are always other factors, like fatigue, miscommunication, and whatever you want to call the condition of being half-asleep after sitting in a seat doing pretty much nothing for hours, that will sometimes cause humans to perform poorly no matter how well they've been trained. The fact that you seemingly want to blame every pilot-error accident on poor training creates the appearance that you think sufficient training can produce people who perform correctly 100.000000% of the time. That's what I disagree with.

    Quote Originally Posted by Evan View Post
    Certainly human factors can lead to confusion in certain stressful conditions, but proper training should instill certain core principles regarding automation that would be hard to forget even under such stress. Not impossible, but very very very very unlikely.
    And that to me is the key principle. Looking at events where significant mistakes were made (like our beloved AF447), it's easy to forget that for each flight like that where something is done wrong, there are thousands or even millions of similar flights where the same things are done right. It's just that those flights don't get discussed here very often.
    Be alert! America needs more lerts.

    Eric Law

  13. #13
    Senior Member BoeingBobby's Avatar
    Join Date
    Jun 2009
    Location
    MIA
    Posts
    1,056

  14. #14
    Member
    Join Date
    Jan 2008
    Location
    MA, USA
    Posts
    750

    Default

    Heh, I wonder if the kiddies that built that really think it's the first time a computer has landed an airplane.

    They'd probably be disappointed to learn this one did the same in 1964:


    Of course you have to give the robot guys some credit... their method is 100 times harder.
    Be alert! America needs more lerts.

    Eric Law

  15. #15
    Senior Member BoeingBobby's Avatar
    Join Date
    Jun 2009
    Location
    MIA
    Posts
    1,056

    Default

    That is a computer, the other is a robot. BIG difference!

  16. #16
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    Quote Originally Posted by elaw View Post
    I suspect your definition of "well trained" and mine aren't all that different. The difference lies in the expected outcome.

    My feeling is that untrained individuals will frequently perform poorly at any given task. When training is added, and more training is added, the frequency of performing poorly drops, and approaches but never actually reaches zero. There are always other factors like fatigue, miscommunication, and whatever you want to call the condition of being half-asleep as a result of sitting in a seat doing pretty much nothing for hours, that will sometimes cause humans to perform poorly no matter how well they've been trained.
    But we are talking about core concepts here. When a pilot selects an autopilot mode, it is chosen with a strategy in mind. This mode does this. That mode does that. It's like if a chess player tried to use a bishop to move sideways instead of diagonally. I've seen a lot of fatigued chess players make obvious errors, but I've never seen one forget the concepts of which piece does what. If he did, I would hesitate to call him well-trained.


    Quote Originally Posted by elaw View Post
    The fact that you seemingly want to blame every pilot-error accident on poor training creates the appearance you think that sufficient training can produce people who perform correctly 100.000000% of the time. That's what I disagree with.
    Whoa there, once again the word 'fact' is thrown around recklessly. I never said any such thing. In fact, I'm the one always emphasizing the humbling effects of human factors. I am aware, however, that training is the best defense against them.

    3WE wants to believe that the SFO Asiana crash was the result of too much baffling technology. It was the result of inadequate training on very understandable automation.

  17. #17
    Senior Member 3WE's Avatar
    Join Date
    Jan 2008
    Posts
    4,365

    Default

    Quote Originally Posted by Evan View Post
    3WE wants to believe that the SFO Asiana crash was the result of too much baffling technology. It was the result of inadequate training on very understandable automation.
    Not exactly, but just for the record, the inflatable dude in Airplane! is named "Otto"...as in "Otto Pilot". Maybe ride the bike down to the grocery store and get Airplane! out of the Redbox. There may be hand sanitizer available inside the store if you need it.

    Your selective memory brushes over the fact that Hui Thieu Lo expressed that he was scared of hand-landing an airplane on a nice long runway on a beautiful afternoon with light winds- AND that always being aware of airspeed- ESPECIALLY on short final- is important for Cessna 150s and Boeing 777s...Does the emphasis on automation have anything to do with that?
    Basic rules of aviation discourage long periods of pulling up hard.

  18. #18
    Senior Member Evan's Avatar
    Join Date
    Jan 2008
    Posts
    5,663

    Default

    Quote Originally Posted by 3WE View Post
    Your selective memory brushes over the fact that Hui Thieu Lo expressed that he was scared of hand-landing an airplane on a nice long runway on a beautiful afternoon with light winds- AND that always being aware of airspeed- ESPECIALLY on short final- is important for Cessna 150s and Boeing 777s...Does the emphasis on automation have anything to do with that?
    Yes. He wasn't well-trained. Not to hand fly. Not to fly on automation. Not to follow the simplest cardinal rule of keeping a hand on the thrust levers on final and thus have full awareness of the autothrust behavior. Not to monitor the PFD annunciations. Certainly not on modal interactions. He was advised on airspeed. He just didn't think he had to do anything about it.

    But you started a thread on automation, so don't bring hand flying into it.

  19. #19
    Senior Member Gabriel's Avatar
    Join Date
    Jan 2008
    Location
    Buenos Aires - Argentina
    Posts
    6,129

    Default

    Pst, Evan. It was you who said "3WE wants to believe that the SFO Asiana crash was the result of too much baffling technology".
    3WE is just explaining that that's not the case, that he believes the Asiana crash was the result of poor airmanship in general. And, for whatever it's worth, I agree. No "and what is it doing now?" can beat a "click click, clack clack".

    --- Judge what is said by the merits of what is said, not by the credentials of who said it. ---
    --- Defend what you say with arguments, not by imposing your credentials ---

  20. #20
    Senior Member 3WE's Avatar
    Join Date
    Jan 2008
    Posts
    4,365

    Default

    Quote Originally Posted by Evan View Post
    But you started a thread on automation, so don't bring hand flying into it.
    Um, no, the title of this thread is not as black and white as your mind (yet again) makes it.

    It is on the interaction of automation and fundamentals.

    Madam Uber driver was supposed to watch the road ahead and brake. Hui Thieu Lo was supposed to monitor airspeed...

    And for the umpteenth time- I have very few hours- and yet have been taught to watch airspeed on short final (regardless of type). And fact- Hui Thieu Lo had a huge shit pot of training on automation. I promise- I really doubt I could start a 777, nor program its FMS. I guess with MSFS and a YouTube video or two I might be able to turn on "Otto", but it's too simplistic for you to simply say Hui Thieu Lo was poorly trained...

    And I actually agree- one thing apparently lacking in his training is to watch your airspeed on short final and keep your hands on the power levers...Again- that's type-specific for 150, 152, and 172M, 172S and 172P models...not sure if it counts on a Triple 7 or Bobby's 74'.....

    It's difficult to reconcile that 3BS, with 100 hours, knows to watch his airspeed, while Hui Thieu Lo...with thousands of hours, sim time and classroom...went with meh....Otto's watching my speed....
    Basic rules of aviation discourage long periods of pulling up hard.
