Monday, July 2, 2012

Business Analysis and Process Design

Process Design Considerations

In this blog we look at some of the effects that might occur as a result of an analyst's re-design of processes. This section is important as it provides a broad view of some hidden dangers that a novice analyst may easily overlook. These impacts are, again, social, organisational and specific to the operation of the task being modified. We begin with social impacts.

Democratisation of the Workplace

Rochlin (1997) suggests that ready access to information may empower some individuals to make better decisions, but that this does not necessarily extend to a social context in which centralised coordination may have gained in power. Within organisations, Rochlin (1997) questions the arguments of technology promoters that both workers and managers have better information, more autonomy and greater control. Instead, Rochlin (1997) argues that any new technology has an initial democratising effect, but once this transient phase passes both workers and managers find their discretion and autonomy reduced, whilst their workloads increase and they face acquiring and maintaining a more complex body of knowledge. He argues that workers seeking autonomy either forego benefits available to other workers, or are bound by new rules that are strict in relation to their domain of "autonomous" behaviour. Increasingly, according to Rochlin (1997), these rules are imposed by people who are more skilled in computers than in the practices they are designing (business analysts?), and often these designers are external to the organisation.

Attempts to defend the role of human experience and integrative judgement are confronting "a new breed of neo-Taylorists who seek to automate everything in sight in the name of reliability, efficiency, and progress" (Rochlin 1997, p 10). However, Rochlin (1997) warns that the outcome of redesigning complex task environments is difficult to predict. If introduced carefully and with sensitivity (see the XP development principles in Lesson 6, for example), human capabilities can be augmented, particularly in stable environments where people have time to learn to incorporate the new approaches into their cognitive frames. In such cases, people can become comfortable and expert in the new environment, whether work or social (Rochlin 1997). However, things often do not stay stable, particularly as computers allow high degrees of interconnectivity and complexity, both within organisations and with the political and social environments outside (eg: Facebook?). Rochlin (1997) warns:

"because computers are inherent promoters of efficiency, and because people who run the organisations and activities in question are constantly seeking to improve it, the tendency to tighten coupling, to reduce slack and time of response, also tends to increase (Rochlin, 1997, p 11)."

These concepts of slack and coupling in relation to process design are important, and we will return to them later in the article.

Processes of Design

Rochlin (1997) cites Landauer (1995) in relation to one problem of software design (p 33):
"Unfortunately, the overwhelming majority of computer applications are designed and developed solely by computer programmers who know next to nothing about the work that is going to be done with the aid of their program. Programmers rarely have contact with users; they almost never test the system with users before release ... So where do new software designs come from? The source is almost entirely supply-side, or technology push."
This instantly raises a number of questions. One is how true this remains over a decade later. We look in a minute at the imbalances between users and designers, which largely remain today. It is likely, however, that Agile and XP approaches are largely directed at this problem of involving the user. Consider also how much design is done by Google and Facebook without involvement of users outside of the designing community. Another question is whether this problem is overcome by having business analysts who can take a broader view of the problems being addressed. This may widen the focus of problem solving somewhat, but there is no guarantee that business analysts do not, as designers, suffer the same shortcomings as the computer programmers described above. One reason for this is that, according to Rochlin (1997), designers and users come from different communities, work in different environments and within different organisational contexts. Rochlin (1997) also argues that designers tend to hold a privileged position in the dialogue, whereas users can provide input and give advice, but only from a subordinate position. A critical issue affecting operability, reliability and performance is whether users have real control over whether new techniques are introduced at all, rather than just some control over how they are introduced.

Rochlin (1997) later considers how technology has progressively separated people from control over their work. The first stage of this was the introduction of production lines, which transformed craftspeople into machine operators. The second stage was the computerisation of production lines, which shifted people back another step: they no longer operate the machine that does the manufacturing, but simply supervise the controls of the computer that now does this. Thus workers are no longer skilled in a variety of areas of production, but rather in standardised aspects of control. Rochlin (1997) argues that this separation of people from their work now extends even to knowledge-based work. Managers and supervisors often do not manage and supervise, but operate computers that manage and supervise. This leads to new forms of technical systems whose risks are not yet understood. Interestingly, Rochlin (1997) devoted two chapters to the risks associated with the use of such systems for computer trading and in finance, written of course prior to the Global Financial Crisis (GFC).

Rochlin's (1997) explanation of the consequences of computerisation may offer some insights as to why productivity may not increase:
"to compensate for the loss of broadly skilled workers with integrative knowledge of plant and process, more elaborate and detailed models of work and task were created; control was asserted by increasing the direct, line authority of foremen and other intermediate managerial classes, under the assumption that better data and information would continue to ensure smooth integration. As the transition from skilled to standarised labor proceeded, more and more oversight was demanded for process reliability, which in turn increased organisational complexity and the need for managerial coordination. Despite its absence from traditional measures of productivity, the extensive bureaucratization, with the attendant increase in office and managerial staff it required, was now seen as necessary and productive, rather than as an expensive and wasteful consumer of resources (p 57)".
Rochlin (1997) argues that this led to the replacement of workers who had detailed knowledge of their fields with professional engineers who had knowledge of general theories as well as management practices and objectives. Associated with this was a shift in importance from workers to plant and machinery. This in turn elevated the status of those responsible for the formal design and organisation of plant flows.

Braverman (1975) also discussed this theme, but argued that while technology deskilled workers, the main problem was not the changing relation between workers and their machinery, but rather the increased control by management being exerted in the name of technical efficiency. Rochlin (1997) suggests that it is this control of the workplace that threatens white-collar workers more than intelligent machines or office automation do. This threat was also discussed by Ford (2009), as covered in an earlier blog.

Rochlin (1997) suggests that technology has changed hierarchical workplaces. Traditionally, middle managers would collect, process and forward information up the chain. Now, however, managers at higher levels can not only oversee the work of subordinates at any level, but also monitor them. This gives them unprecedented opportunities to micro-manage, or otherwise interfere in, the work of those subject to their authority, often in relation to tasks and processes for which they have no expertise or formal training and where blame for mistakes will fall on others. Rochlin (1997) refers to research in which plant managers overwhelmingly reported a desire for a central control screen from which they could run the entire plant. A similar study found that information systems in organisations were being used to produce systems with centralised knowledge and top-down control along the ideals of Taylor's 'scientific management'. Rochlin (1997) fears that this will lead to operators who appear to be autonomous but are actually working in jobs so bounded and circumscribed that they have no room for skill development or discretion. Managers may find themselves in similar situations, unable to deviate from the plans and programs of the organisation. Rochlin (1997) quotes Zuboff in relation to this threat to workers:

"Post-industrial technology threatens to throw them out of production, making them into dial watchers without function or purpose"

However, there is a related problem here, called "defunctionalisation", which concerns the loss of skills and expertise.

Expertise, Skills and Automation

In discussing expertise and skills, Rochlin (1997) distinguishes between someone who is proficient and someone who is an expert. Proficiency can be gained by rational and reductionist approaches whereby people are trained to follow logically deductive chains (i.e. rote learning or practice). But such learning does not produce an expert; that requires discretion and trial-and-error experience. The result is an integrated representation of knowledge (tacit knowledge) rather than a series of causally linked steps (Rochlin, 1997). There are vast differences, then, between the abilities of an expert and those of someone who is merely proficient, and we will return to this shortly.

Operational divisions of plants often regard themselves as repositories of expert knowledge of the systems they work with. They have experience with the actual working of the plant rather than the more formal understandings that engineers and managers get from specifications, rules and procedures. And while the knowledge of engineers is respected, operators can worry about interference from people who have no hands-on or practical experience. Concerns are particularly raised when professional consultants are used to improve performance or reliability at the human-machine interface (Rochlin 1997).

Rochlin (1997) describes the opinions of nuclear plant operators as follows:

"the 'beards' come in here, look around for an hour or two, and then go back and write up all these changes to the controls. Because it will cost the company a fortune, the decisions are going to be made upstairs (i.e by managers and professional engineers). We'll have to argue for hours and hours about changes that will interfere with the way we work. And sometimes they make them anyway" (pg 110).

These are important considerations when you are concerned with air traffic control centres, utility grids, and military combat centres. Rochlin (1997) cites studies in which 90% of pilots felt that the logic of designers was substantially different to that of pilots and other users.

Criticism of designers who deliberately limit what pilots can make aircraft do has led to this type of phenomenon being called "glass cockpit syndrome". The term has since been adopted more generally for situations where human operators are separated from direct, tactile modes of control and instead placed in automated control rooms where computerised displays are their only sensory inputs (Rochlin, 1997). Glass cockpit syndrome has been reported not only by pilots and air traffic controllers, but also by operators of nuclear plants and other similarly hazardous, complex systems (Rochlin, 1997). One source of difficulty is that the technical constructs integrate technical, process and situational complexity into a single spatio-temporal image. Another source of concern comes from observations of teamwork in which cultures of safety have been based on the "silent" practices of mutual manual monitoring and tacit task allocation. The effects of automation on these interactions are unclear, as is how these functions might be re-allocated if the human operators have to step in and retake control of the system, as may happen in an emergency (Rochlin, 1997).

A third concern is that the introduction of automated control allows systems to deal with more traffic: levels of traffic that cannot be managed without the automated systems. Air-traffic control is an example here. Operators at many airports are capable of managing the system manually if the computers go down, using the paper slips they maintain alongside the automated tools. Automation threatens to allow higher densities of traffic and numbers of airplanes that could not be managed manually. Air traffic controllers have, however, been identified as a unique case, owing to their access to decision makers and the dependence of those decision makers on the controllers (most powerful people are regular users of airports). It has been suggested that, because of these factors, air-traffic controllers have been able to resist design changes to their work environment that they consider dangerous or detrimental (Rochlin, 1997).


Human concerns about "Glass Cockpits" (Rochlin, 1997). Aviation accident investigations have linked the last three of these to new categories of errors and mistakes, some with fatal consequences.
  • Too much workload associated with re-programming flight management systems.
  • Too much heads-down time in the cockpit attending to the systems.
  • Deterioration of flying skills because of over-reliance on automation.
  • Increasing complacency, lack of vigilance, and boredom.
  • Lack of situational awareness when automated systems fail, making it difficult to identify and correct problems.
  • Reluctance to take over from automated systems, even in the face of compelling evidence that something is wrong.


However, few groups have the sort of control over their environment that air traffic controllers enjoy. Nuclear and chemical plant operators, and many others, face automation without the public access and visibility needed to challenge changes to their work environment (Rochlin, 1997).

Their concern is that with more automation old expertise will be lost, as new staff are trained more in computer and management skills at the expense of developing a deeper knowledge of the systems they are operating; a deeper knowledge that comes only from real experience of controlling those systems. In fact, there are fears that automating the 'easy' tasks of system control will actually make it harder for operators to control the system when the 'hard' problems arise (Rochlin, 1997). Rochlin (1997) describes the instincts that develop from coal-face experience of controlling systems directly, rather than operating a computer control:
"This was brought home to me quite sharply when I was inteviewing in a nuclear power plant control room, and heard an operator say that he "did not like the way that pump sounded" when it started up. Although the instrumentation showed no malfunction, the pump was stripped down at the operator's recommendation, at which point it was found that one of the bearings was near failure. My research notes (and those of others doing similar work) contain dozens of similar stories, ranging from detection of the onset of mechanical failures to air traffic controllers intervening in an apparently calm situation because they did not like the way the 'pattern' of traffic was developing" (pg 124).
The possible consequences of automation are explained by Rochlin (1997) as follows:
"Human learning takes place through action. Trial-and-error defines limits, but its complement, trial-and-success, is what builds judgment and confidence. to not be allowed to err is to not be allowed to learn; to not be allowed to try at all is to be deprived of the motivation to learn. This seems a poor way to train a human being who is supposed to act intelligently and correctly when the automated system fails or breaks down - that is, in a situation that comes predefined as requiring experience, judgment, and confidence as a guide to action" (pg 126).
Rochlin (1997) goes on to suggest that computerised control systems could be designed sensitively and interactively to support both safety and performance. However, he argues that this is not what usually happens. What does happen is that computer implementations sooner or later lead organisations to try to maximise efficiency. This reduces the margin of time in which a human could assess a situation and take appropriate action when problems arise, so human oversight of such systems becomes effectively useless. The engineering solution to such a risk is to provide additional redundancy, with systems that operate in parallel for monitoring and control tasks. But Rochlin (1997) suggests that redundancy still does not provide the essential resource needed to guard against failure: slack, in the sense of a small excess margin that leaves resources available to deal with problems. In the opinion of the author (M. Mitchell), slack is a critical issue not only in control systems but also in organisation design, as it is slack that allows degrees of variation and adaptation in times of difficulty. Note that in such times it is human experience and judgement that come into play; hence the value of having staff whose experience goes beyond just operating machines or following pre-determined processes.
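To make the value of slack concrete, here is a toy sketch (the author's own illustration, not from Rochlin; the function and parameter names are hypothetical) of a workflow where the amount of work arriving each step varies around an average. When capacity exactly matches average demand there is no slack and the backlog drifts upward; a modest 10% margin absorbs the variation.

```python
import random

def average_backlog(capacity_per_step, mean_work_per_step=1.0, steps=10_000, seed=1):
    """Crude simulation: a variable amount of work arrives each step, and at most
    `capacity_per_step` units can be processed. Unprocessed work carries over as
    backlog. Returns the average backlog over the run."""
    random.seed(seed)
    backlog = 0.0
    total = 0.0
    for _ in range(steps):
        arrival = random.uniform(0.5, 1.5) * mean_work_per_step  # demand varies step to step
        backlog = max(0.0, backlog + arrival - capacity_per_step)
        total += backlog
    return total / steps

# No slack: capacity exactly equals average demand, so fluctuations accumulate.
print("no slack  :", round(average_backlog(capacity_per_step=1.00), 1))
# 10% slack: a small excess margin soaks up the fluctuations.
print("10% slack :", round(average_backlog(capacity_per_step=1.10), 1))
```

Running this, the no-slack case accumulates a backlog far larger than the case with a modest margin, which is the sense in which slack, rather than sheer capacity or duplicated components, is what buys time to deal with problems.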

C3I (Command, Control, Communications and Intelligence)

Rochlin (1997) distinguishes clearly between command and control. Contrary to what is suggested in much management literature, these two terms are not synonymous. Control involves feedback mechanisms that allow learning in relation to a specific purpose; it is suited to situations that are deterministic and relatively certain (eg: thermostat control in a house). Command, on the other hand, draws on learning from a wide range of circumstances associated with various purposes. Command is used where there is significant uncertainty and draws on a much broader set of experience combined with heuristics (rules of thumb) and/or intuition.
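As a rough illustration of what 'control' means in this sense (the author's own sketch, not from Rochlin; the function and parameter names are hypothetical), a household thermostat can be expressed entirely as a feedback rule against a fixed setpoint.

```python
def thermostat_step(current_temp, heater_on, setpoint=21.0, hysteresis=0.5):
    """One step of a feedback controller: compare the measured temperature
    with a fixed setpoint and decide whether the heater should be on."""
    if current_temp < setpoint - hysteresis:
        return True        # too cold: switch the heater on
    if current_temp > setpoint + hysteresis:
        return False       # too warm: switch the heater off
    return heater_on       # inside the dead band: leave it as it is

# A crude simulation of the room responding to the controller.
temp, heater = 17.0, False
for minute in range(60):
    heater = thermostat_step(temp, heater)
    temp += 0.3 if heater else -0.1   # heating gain vs. heat loss per minute
print(f"after an hour: {temp:.1f} C, heater {'on' if heater else 'off'}")
```

The whole controller fits in a dozen lines precisely because the environment is deterministic and the purpose is fixed. Command, by contrast, cannot be reduced to a rule like this, because the goal itself and the circumstances keep shifting.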

Rochlin (1997) believes that technocrats often mistakenly try to treat command problems as though they were control problems (thus treating command-and-control as a single indivisible term). The belief of these technocrats seems to be that increasingly complex operational environments - which emerge from having multiple missions, highly differentiated and specialised units, complex bureaucracies, etc - can be controlled using information systems involving increasingly complex models and integrated networks. Rochlin (1997) argues that this is a mistake. They would be better off accepting the "necessity to cope with the irreducible increase in uncertainty" (p 189) brought about by the factors above. What is required instead is people who are able to make decisions based on partial knowledge and information and to correct on the fly through processes of trial and error (this links back to his earlier point that people develop this ability through direct 'real' experience, not by operating machines). Instead, in modern business environments, power has been transferred from people whose experience is tacit and difficult to quantify to people with data and model manipulation skills. This change has been coupled with a corporate culture "where quantitative analysis and esoteric computer skills were becoming increasingly valued" (p 190) (Rochlin, 1997).

He argues that a similar change has occurred in the military: flexible and adaptive control by people high in the hierarchy has been fostered (based on the availability of information) rather than allowing those in the field to exercise powers of command. Rochlin (1997) refers to this as a "dream of being able to cut through the fog of war". He cites several disastrous historical failures based on these approaches. One example was the battle of the Somme in 1916. Rochlin argues this is an example of where "More of a premium was put on retaining control to assure that the battle went according to the pre-scheduled timetable than to managing the actual advance toward the German lines". The consequences of this resound through the history books. A critical point is that the troops on the ground could see the strategy failing within a few hours, but were prohibited by the control structure from exploiting any available advantages; General Haig, on the other hand, did not realise the strategy had failed until days later. Rochlin (1997) compares the minutely detailed and rigorous plans of the Fourth Army at the Somme with the battle of Waterloo, where Wellington's victory over Napoleon was achieved without a written battle plan.

Rochlin (1997) goes on to analyse the campaigns in Korea and Vietnam. In Korea, the US forces were not organised into separate units with self-contained targets that could be pursued autonomously. Instead the units were "loosely coupled", which in practice meant encumbered with the need to negotiate with each other in "real time". Such reliance on communication also allowed commanders to exercise too much control, leaving little room for discretion and adjustment, with potentially disastrous consequences. Rochlin (1997) also describes the information "pathologies" introduced by centralisation in the Vietnam war. He states that, based on the Vietnam experience, General Heiser recommended resorting to a less centralised system, thus reducing requirements for information, even if it created some 'slack' in resources. Rochlin (1997) concludes this section with the following quote from van Creveld:

"To study command as it operated in Vietnam is, indeed, almost enough to make one despair of human reason; we have seen the future, and it does not work" (pg 199).

Automation, Standardisation and Slack

By now we have established that Rochlin (1997) is concerned about the limiting effects of automation on:
  • individual human development;
  • the evolution of human knowledge and skills; and
  • safety in times of emergency.
Rochlin (1997) argues that "what is lost in many cases is not just variety, and specific human skills, but the capacity to nurture, enhance, and expand them through the messy processes of direct, trial-and-error learning" (pg 213).

Rochlin (1997) calls the elaborate, long-term collective effects of computerising and networking everything the "computer trap". He is concerned that this process (in the large) is mostly unexamined, and that its effects may not only be irreversible but may also create large-scale vulnerabilities in the social and socio-technical systems that are essential to managing the structures and complexities of modern life.
Rochlin (1997) argues that one factor behind all this seems to be a push to eliminate from hazardous systems all possible sources of "human error". No matter what type of system is being dealt with, this push tends to increase the tightness of coupling, reduce the time available to respond and reduce redundancy. The final effect is that the mechanisms of operation become so deeply embedded in computer systems that human intervention in times of emergency is impossible.

References

Braverman, H. 1975, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century, Monthly Review Press, New York.
Ford, M.R. 2009, The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, Acculant Publishing.
Landauer, T.K. 1995, The Trouble with Computers: Usefulness, Usability, and Productivity, The MIT Press, Cambridge, MA.
Rochlin, G.I. 1997, Trapped in the Net: The Unanticipated Consequences of Computerization, Princeton University Press, Princeton, NJ.