More interesting interactions between automation and humans for safety of transport


  • More interesting interactions between automation and humans for safety of transport

    Pseudo Final Report:



    Let's see if I have this right:

    You have an automatic emergency braking system that operates when the human is driving.

    But, when the computer is driving, the automatic emergency braking system is disabled, depending on the driver to do the braking?

    And, even though the emergency braking system is monitoring and processing conflicts, it does not notify the driver?
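The mode-dependent design described above can be sketched in a few lines. This is purely illustrative; the class and function names are invented, not anything from an actual vehicle system:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    computer_driving: bool   # True when the automation is in control
    conflict_detected: bool  # True when AEB perception flags an imminent collision

def aeb_response(state: VehicleState) -> str:
    """Return the braking system's action under the design described above."""
    if not state.conflict_detected:
        return "none"
    if state.computer_driving:
        # In autonomous mode the AEB is disabled, and the detected
        # conflict is not surfaced to the human driver either.
        return "no braking, no alert"
    return "automatic emergency braking"
```

The oddity the post is pointing at sits in that middle branch: the one mode where the human is least engaged is the one mode where the system neither brakes nor alerts.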

    Brill-Yunt! and cue the classic 1950s pilot disdain for scientific engineers.

    And, we see that fairly well trained pilots like Hui Thieu Lo and Ho Leigh Phouc confuse how all the different systems interact in this mode and that mode, and we expect taxi drivers to be able to handle it too.

    And, it should also be noted (for the zillionth time) that Hui Thieu Lo, Ho Leigh Phouc and Unnamed Uber Driver also failed at the basic requirements to monitor some basic crap like Airspeed and "The Road Ahead".
    Basic aviation rules discourage long periods of hard pulling up.

  • #2
    Originally posted by 3WE View Post
    And, we see that fairly well trained pilots like Hui Thieu Lo and Ho Leigh Phouc confuse how all the different systems interact in this mode and that mode, and we expect taxi drivers to be able to handle it too.
    Not all of us do...
    Be alert! America needs more lerts.

    Eric Law

    Comment


    • #3
      Originally posted by 3WE View Post
      And, we see that fairly well trained pilots like Hui Thieu Lo and Ho Leigh Phouc confuse how all the different systems interact in this mode and that mode.
      'fairly well-trained' pilots do not confuse how all the different systems interact in this mode and that mode. These guys were part of a culture that tends to skip over things.

      Comment


      • #4
        Originally posted by elaw View Post
        Not all of us do...
        Good one... nevertheless, you cannot ever 100% disconnect yourself from the proverbial Royal "we" (unless you move to a mountainous area and begin living off the grid).
        Basic aviation rules discourage long periods of hard pulling up.

        Comment


        • #5
          Originally posted by Evan View Post
          'fairly well-trained' pilots generally do not confuse how all the different systems interact in this mode and that mode. These guys were part of a culture that tends to skip over things.
          Not just fixed...

          BUT

          fixed significantly.

          Taken that bicycle ride yet?
          Basic aviation rules discourage long periods of hard pulling up.

          Comment


          • #6
            3WE, the thing you still can't seem to grasp is that autopilot is not that inflatable guy named Otto in Airplane!. It's a workload-reducing tool that requires pilots to aviate differently, by assigning modes, entering data and closely monitoring everything. Taking the pilots out of the loop entirely belongs to a fairly distant future, when absolute confidence can be placed in very secure technology.

            Now, the problem is, most drivers are thinking like you do. "Oh, boy, a 'self-driving' car! HAL's in charge. Think I'll climb in the back and take a little nap".

            So the essence of the problem is this misunderstanding: why are we allowing anyone to call these 'self-driving' cars? What we are really talking about here is an INTERFACE between human drivers and autopilot technology, and, as in aviation, the danger lurks in the blended aspect of this automation. The AP is there to drive the car and the human is there to operate and monitor the AP. It reduces driver workload; it doesn't excuse them from driving altogether.

            Many of the AP-related crashes have resulted from pilots assuming the automation is relieving them of their piloting duties. And pilots are required to go through rather intense training compared to drivers. Therefore we can expect this problem to be epidemic if and when 'self-driving' cars are introduced on a widespread basis.

            And it all comes down to understanding what the autopilot is and what it isn't (and what it does and what it doesn't do).

            Meanwhile, car manufacturers are exhibiting their usual impatience in getting new features to market. Unless regulations prevent them from doing so, they will 'market-test' unproven tech while advertising it as 'self-driving' convenience.

            Fix #1: Prohibit car companies from using the term 'self-driving' until it really is safe to call them that.

            Comment


            • #7
              Originally posted by 3WE View Post
              Not just fixed...

              BUT

              fixed significantly.

              Taken that bicycle ride yet?
              No. Not fixed at all. If a line pilot doesn't have a complete understanding of ALL modes and ALL their interactions, that pilot is not even remotely 'well-trained'. That's like saying a pilot with a good understanding of the yoke and the thrust levers but a shady understanding of the rudder is 'well-trained', and as we have seen, that just isn't good enough.

              Comment


              • #8
                Originally posted by Evan View Post
                'fairly well-trained' pilots do not confuse how all the different systems interact in this mode and that mode.
                Ever?

                Ever ever?

                You do know that all absolute statements are wrong, right?
                Be alert! America needs more lerts.

                Eric Law

                Comment


                • #9
                  so evan, tell me something: what is the pilot supposed to do when full autoland is required? monitor the plane doing something based on what it sees and feels in a situation where the industry DOES NOT TRUST THE PILOT?

                  Comment


                  • #10
                    Originally posted by elaw View Post
                    Ever?

                    Ever ever?

                    You do know that all absolute statements are wrong, right?
                    I guess I just have a different definition of 'well-trained'. If a pilot is using an open mode like FLCH, which is speed-on-elevator, and has not entered a safe transition altitude (0 being 'not safe') and expects autothrust to kick in magically upon intercepting the glidepath, that pilot is not at all well-trained. That pilot has no business being in the cockpit. Certainly human factors can lead to confusion in certain stressful conditions, but proper training should instill certain core principles regarding automation that would be hard to forget even under such stress. Not impossible but very very very very unlikely.
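The mode trap Evan describes can be reduced to a toy model. To be clear, this is not actual autoflight logic and the mode names are simplified; it only shows why "autothrust will kick in magically" is a false expectation once the autothrottle has dropped into a hold state:

```python
# Toy model of the FLCH trap -- illustrative only, not real avionics logic.
def thrust_command(at_mode: str, airspeed: float, target_speed: float) -> str:
    """What the autothrust does in this simplified model."""
    if at_mode == "HOLD":
        # In FLCH with thrust at idle the autothrottle can enter HOLD:
        # it will NOT add thrust on its own, even as airspeed decays.
        return "no change"
    if airspeed < target_speed:
        return "increase thrust"
    return "maintain"
```

In a speed-protecting mode the low-airspeed branch would fire; in HOLD it never gets the chance, and only the pilot can restore thrust.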

                    Comment


                    • #11
                      Originally posted by TeeVee View Post
                      so evan, tell me something: what is the pilot supposed to do when full autoland is required? monitor the plane doing something based on what it sees and feels in a situation where the industry DOES NOT TRUST THE PILOT?
                      Monitor the instruments. Monitor the navigation. Monitor the autopilot itself (via the PFD). Be prepared to take over if anything feels wrong.

                      The industry only allows autoland if the pilot is rated (trusted) to perform it.

                      Comment


                      • #12
                        Originally posted by Evan View Post
                        I guess I just have a different definition of 'well-trained'.
                        I suspect your definition of "well trained" and mine aren't all that different. The difference lies in the expected outcome.

                        My feeling is that untrained individuals will frequently perform poorly at any given task. When training is added, and more training is added, the frequency of performing poorly drops, and approaches but never actually reaches zero. There are always other factors like fatigue, miscommunication, and whatever you want to call the condition of being half-asleep as a result of sitting in a seat doing pretty much nothing for hours, that will sometimes cause humans to perform poorly no matter how well they've been trained. The fact that you seemingly want to blame every pilot-error accident on poor training creates the appearance you think that sufficient training can produce people who perform correctly 100.000000% of the time. That's what I disagree with.
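The asymptote elaw describes can be made concrete with made-up numbers. The base rate and decay constant below are purely illustrative, chosen only to show the shape of the claim:

```python
import math

def error_probability(training_hours: float, base: float = 0.10, k: float = 0.01) -> float:
    """Illustrative only: error probability falls with added training
    but stays strictly positive -- it approaches zero, never reaches it."""
    return base * math.exp(-k * training_hours)

rates = [error_probability(h) for h in (0, 100, 1000)]
# each added block of training lowers the rate, yet every value stays > 0
```

Whatever the real curve looks like, the point stands: no finite amount of training drives the probability to exactly zero, which is why fatigue and boredom still claim well-trained crews.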

                        Originally posted by Evan View Post
                        Certainly human factors can lead to confusion in certain stressful conditions, but proper training should instill certain core principles regarding automation that would be hard to forget even under such stress. Not impossible but very very very very unlikely.
                        And that to me is the key principle. Looking at events where significant mistakes were made (like our beloved AF447), it's easy to forget that for each flight like that where something is done wrong, there are thousands or even millions of similar flights where the same things are done right. It's just that those flights don't get discussed here very often.
                        Be alert! America needs more lerts.

                        Eric Law

                        Comment


                        • #13
                          Evan's dream come true! https://www.facebook.com/NowThisFutu...5085095865998/

                          Comment


                          • #14
                            Heh, I wonder if the kiddies that built that really think it's the first time a computer has landed an airplane.

                            They'd probably be disappointed to learn this one did the same in 1964:


                            Of course you have to give the robot guys some credit... their method is 100 times harder.
                            Be alert! America needs more lerts.

                            Eric Law

                            Comment


                            • #15
                              That is a computer, the other is a robot. BIG difference!

                              Comment
