The concept of treating conventionally automated functions in the work process as low-authority artificial specialists, good at supporting a particular task execution but lacking the full perspective, was convenient as long as only simple automated functions were used for simple tasks. It did no harm that the responsibility for not violating the prime goals of the work process rested exclusively on the human operator's shoulders. In the highly automated work systems we are used to now, however, which serve complex and demanding work objectives such as a demanding mission of a modern fighter aircraft, conventionally automated functions of a high degree of complexity are accommodated as operation-supporting functions. They are considered necessary to relieve the human operator of a sufficiently large number of tasks which could otherwise lead to overload. This can become critical, though: as the complexity of conventional automation increases, the operator's task as supervisor becomes more and more complex as well. Instead of the intended unloading, the danger of overloading the operator may arise in certain situations, resulting in a loss of situation awareness and control performance, i.e. a failure to comply with the prime goals.
The operator might be unaware of discrepancies between the sub-task activities of automated functions and the necessities of the prime goals, or even of his or her own inability to adapt to this inadequacy. This is just the opposite of what the introduction of automation intended. This illustrates the principal effects for a pilot's work site, although it is not based on real data. In fact, experience with conventional automation shows that, on average, it reduces the demand on the pilot's resources most of the time, but it can also generate, rarely though, excessive load which would not have occurred without automation. Many accidents can be attributed to that phenomenon. Another study speaks of the erosion of operator responsibilities in this respect, but still regards situation-dependent goal-setting as the responsibility of the human operator. Indeed, automation has steadily encroached upon human work activities, up to a very high degree of automation complexity, all as part of the operation-supporting means. Pilots of modern airliners have to deal with a proliferation of flight system modes. To illustrate the problem of mode awareness, which arises from the proliferation of modes and the intrinsic mode coupling that usually comes with it, consider an accident caused by this effect: the crash of a China Airlines Airbus A300-600R at Nagoya (Japan) in 1994.
In the past, system engineers responded to mishaps of that kind by vigorously pushing a further increased use of this type of (conventional) automation, without looking carefully enough for possible negative effects on the work system as a whole. This fits the characteristics of a vicious circle. Work systems with no or only little automation can be taken as the (historical) starting point of that circle. There, manual control was the predominant mode of operation, and the supervision of automated systems could be neglected. Characteristic of reactive designs, however, automation and its further extension is taken as the solution to inescapably occurring human error. Designers transfer authority over more and more comprehensive work process tasks from the operating force, i.e. the human operator, to increasingly complex automated functions that are part of the operation-supporting means. Although relieved of workload at first, the human operator is then burdened with extra demands from growing system monitoring and management tasks. At a certain point the human operator cannot but fail in the supervision of such automated systems. In reaction, typically the only solution considered is to further extend the share of automation within the operation-supporting means, together with a further increase in complexity. In conclusion, if we are interested in a design that gains productivity for a work system, we find that increasing system complexity by adding further conventional automation yields less and less productivity gain and might eventually even end in decreases.
Obviously, the limits to growth of productivity through increased conventional automation have already been reached in certain domains such as aviation. Apparently, well-founded methods and theoretical frameworks were lacking which thoroughly account for the overall performance demand on the work system as a whole (including the human operator). The FAA report was an attempt to promote endeavours in that direction. Unfortunately, the conclusion predominantly drawn was to pursue only the probably simplest measure to realise, namely to intensify the training of the human operator, instead of intensifying research on a work system design with the potential to effectively counteract the deficiencies of conventional automation. Intensified training does no harm, but it cannot be very effective against the deficiencies explicitly mentioned about implicit automation. The deficiencies of conventional automation as experienced in practice are by now well captured in explicit terms by the phenomena known as automation
• brittleness,
• opacity (entailing increased demands on monitoring the automation status),
• literalism,
• clumsiness, and
• data overload,
all of them consequences of too high system complexity.
Brittleness describes inadequacies of conventional automation due to the fact that it is almost impossible to verify, during the development of highly complex functions, that everything works correctly in all situations that may be encountered. Brittleness thus refers to the characteristic of conventional automation to perform well, i.e. according to specification, within clear-cut operational limits, but to quit service rather rapidly close to or even beyond those limits. These limits of proper operation are usually not known to the human operator, and there will always be a situation number "n+1" which could not be foreseen during the design process. Opacity typically refers to situations in which the human operator is surprised by the output of an automated function although there is no system error. The human operator simply cannot know everything about the complex functionality he is trying to use and has no chance to understand what is going on. Literalism is the kind of inflexibility of a conventionally automated computer program that stubbornly carries out whatever the human operator instructs. It does not necessarily account for the demands objectively given in the context of certain situations in order to comply with the pertinent work objective. In other words, literalism describes the character of a conventionally automated system to strictly follow the instructions given by the human operator, no matter whether they are correct or wrong. Conventional automation does not question or check control operations as to whether they make sense in the given context. Finally, clumsiness stands for the type of automation which decreases the operator's workload in situations when it is low anyway, and which does not help, or even increases the operator's workload, when it is already high.
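The contrast between literalism and brittleness can be made concrete with a few lines of code. The following toy altitude-command function is purely illustrative: its name, gain, and envelope limits are invented for this sketch and do not describe any real autopilot.

```python
def commanded_climb_rate_fpm(current_alt_ft, target_alt_ft,
                             gain=0.5, envelope=(0, 40000)):
    """Toy controller: climb rate proportional to altitude error."""
    # Brittleness: behaviour is specified only inside the certified
    # envelope; outside it, the function simply quits service.
    if not (envelope[0] <= current_alt_ft <= envelope[1]):
        raise ValueError("outside certified envelope, automation disengages")
    # Literalism: the target is executed exactly as instructed and is
    # never sanity-checked against the context (terrain, fuel, clearance).
    return gain * (target_alt_ft - current_alt_ft)

print(commanded_climb_rate_fpm(10000, 12000))   # plausible instruction: 1000.0
print(commanded_climb_rate_fpm(10000, -5000))   # nonsensical target, obeyed anyway: -7500.0
```

The nonsensical target below ground level is obeyed just as faithfully as the plausible one; only crossing the envelope boundary produces any reaction at all, and then an abrupt one.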
This mirrors the fact that conventional automation usually does not help in unanticipated situations, when it is urgently needed. A cause of clumsiness can also be the presentation of excessive amounts of information, overloading the human operator with data of which only a part is needed. Clumsiness can further show up in behavioural traits of the human operator who learns to "trick" the automated function concerned. An example is the flight management function that calculates the optimal point at which to start the descent for the landing approach. The algorithm often denies the pilot's desire for a smooth approach starting at a point where the plane is "not too high and not too fast". This causes the pilot to enter wind data, an important input for the algorithm, indicating a higher tailwind component than what is actually reported, so that the start of the descent is calculated to lie at an earlier point in time.
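The mechanism behind this trick can be sketched with simple descent geometry. The function names, speeds, and descent rate below are invented for illustration and do not reproduce any real flight management system; the point is only that an inflated tailwind entry raises the computed ground speed and therefore moves the computed top of descent earlier (farther from the fix).

```python
def ground_speed_kt(true_airspeed_kt, tailwind_kt):
    # Tailwind adds directly to ground speed.
    return true_airspeed_kt + tailwind_kt

def top_of_descent_distance_nm(altitude_to_lose_ft, ground_speed_kt,
                               descent_rate_fpm=1800):
    """Distance before the fix at which the descent must begin.

    time to descend (min) = altitude / rate; distance = speed * time.
    """
    minutes = altitude_to_lose_ft / descent_rate_fpm
    return ground_speed_kt * (minutes / 60.0)

# Honest wind entry: 10 kt tailwind at 280 kt true airspeed.
honest = top_of_descent_distance_nm(30000, ground_speed_kt(280, 10))
# "Tricked" entry: the pilot reports a 40 kt tailwind instead.
tricked = top_of_descent_distance_nm(30000, ground_speed_kt(280, 40))

print(round(honest, 1), round(tricked, 1))  # tricked > honest: descent starts earlier
```

With these numbers the honest entry yields a top of descent about 80.6 NM before the fix, while the inflated tailwind pushes it out to about 88.9 NM, giving the pilot the shallower, earlier descent the algorithm would otherwise deny.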
Note: This article was sent to us by Jonathan Todd on 01/15/2010.