DECISION ENHANCEMENT SERVICES
Decision Enhancement Services
Rehearsing the Future for Decisions That Matter
by
Peter G.W. Keen and
Henk G. Sol
IOS Press
© 2008 IOS Press and the Authors. All rights reserved.
ISBN 978-1-58603-837-3
Published by IOS Press under the imprint Delft University Press
Publisher: IOS Press BV, Nieuwe Hemweg 6b, 1013 BG Amsterdam, The Netherlands
tel: +31-20-688 3355, fax: +31-20-687 0019
email: [email protected]
www.iospress.nl, www.dupress.nl
LEGAL NOTICE The publisher is not responsible for the use which might be made of the following information.
PRINTED IN THE NETHERLANDS
Contents

Preface ... vii
Introduction: A lens, an invitation, a field of practice ... ix

PART I: DECISION AGILITY ... 1
Chapter 1: Decision Process Agility ... 1
Vignette 1 (Decisions That Matter): Leading through Seeing: Transforming a Government Agency ... 10
Vignette 2 (Studio): Enabling New Decision-making through a DES Studio ... 17
Vignette 3 (Suite): Simulation Modeling ... 23
Chapter 2: People Make Decisions; Technology Enables Decision-making ... 27
Chapter 3: Decision Enhancement in Action: The Airport Development Studio ... 44

PART II: DECISION-MAKING ... 69
Chapter 4: What We Really Know About Effective Decision-Making ... 69
Vignette 4: Fact-based decision-making ... 94
Vignette 5: A group studio ... 97
Chapter 5: Visual Thinking: If You Can’t See It, You Can’t Get It ... 101
Vignette 6: “Grokking” the Infoviz ... 119
Vignette 7: A Control System for Automated Ground Vehicles ... 123

PART III: DECISION ENHANCEMENT SERVICES ... 126
Chapter 6: Making Simulation Part of the Executive Culture, Not just “Modeling” ... 126
Vignette 8: Toyota’s Decision Modeling Net ... 146
Vignette 9: A Simulation for Call Center Management ... 148
Chapter 7: Coming to a Decision: Studios: People plus Methods plus Suites ... 151
Vignette 10: A Learning Studio for Coordinating Distributed Work ... 154
Vignette 11: An Inquiry for Urban Infrastructure Planning in Houston ... 158
Vignette 12: A Participatory Studio for Resolving the Group Support System Paradox ... 162
Chapter 8: Recipes: Applying Proven DE Experience ... 169
References ... 181
PREFACE

Decision Enhancement is a field of practice aimed at extending the lessons, principles and tools built up over a thirty-year period, largely under the term “Decision Support.” Enhancement goes beyond support, which has mostly focused on helping managers make use of interactive computer models and methods. Decision Enhancement substantially adds to the opportunities, especially in the use of the Internet as both an information resource and a communications base for collaboration between groups, but it also goes well beyond it in two important regards: a focus on enhancing the processes that influence the quality of the decisions that really matter in an organization, the ones that are most consequential, complex and uncertain; and a shift from the design of computer- and telecommunications-based tools to a far more comprehensive “studio” approach to the integration of change management, collaborative development and use, and a new generation of visual technology. One of the aphorisms of Decision Enhancement is that “If you can’t see it, you won’t get it.” Decision Support was of necessity built around “numbers”; Enhancement rests far more on images, dynamic visualization and communicative display.

Decision Enhancement is a lens that focuses on stakeholders in decision arenas and their decisions that matter. It continues with an invitation to bring stakeholders, domain experts and suite designers into its studios: the invitation extends to experts in domains relevant to making DE studios effective, as well as to the suite designers and providers of the technology capabilities needed to provide the tools within the studio. Decision Enhancement provides services to guide a journey where executives, their advisors, change management specialists, experts in multi-disciplinary fields and technology developers can come together to make a substantive new impact on effective decision-making in any organization.

Our hope is that this book builds on the old, the proven lessons and expertise of these fields, and adds to their innovation. It is in this sense that it is an invitation, written to encourage reflection and discussion within and across them. This book is one of the products of a long collegial collaboration between the authors.
We thank our common and respective environments for their support: the universities we have taught in, individual collaborators, the companies we have worked with, academic conferences and our students. We owe a special word of thanks to Zenlin Kwee, who made a major contribution to shaping and finalizing the manuscript. A brilliant young scholar in her own right, she has added immeasurably to the quality of our ideas, their expression and the final book.
Peter G. W. Keen
Henk G. Sol
Introduction Decision Enhancement: A lens, an invitation, a field of practice
The photographs above show four very typical decision-making meetings in progress. All of them make use of information technology – computers and telecommunications links. But the photographs tell us nothing about the IT contribution to the decision process. Nor can they tell us about the quality of the decision process. In the first photo, for instance, the largest group is perhaps going through a ritualistic discussion where they are careful not to make waves because the boss is giving the PowerPoint presentation. Or the presenter is a skilled facilitator who will help the team address the issues openly and creatively. Maybe the small group of executives in the second photo is arguing about whose “numbers” are correct and losing sight of the wider
decision context. Or they are collaborating – which includes arguing with each other – to generate insight, apply their individual experience and make sound judgments that go well beyond what the figures and numbers imply. The four people standing at the giant screen may be analysts who will have little influence on the policy makers to whom they will report the results of the simulation modeling. Or they are the real decision makers. Finally, the team looking at the shared screen may end up reaching a consensus on ideas and recommendations but not making any commitments to action. Or they may use this forum to arrive at a creative, collaborative real decision. Beneath the surface of the four photographs are the same basics of decision-making: people, process and technology. They are increasingly interdependent. People make decisions; their skills, values, judgment and experience shape the decision. The decision process influences the likelihood of their making effective decisions. The technology can provide multiple types and levels of support to both the people and the process.
Figure A: Decision Enhancement: The fusion of people, process and technology through studios
Improving the combination of all three of these factors in decision-making makes a substantial organizational impact. In general, they do not move together and may even be in conflict. People may resist process. Process may limit people, especially if it inhibits free expression or imposes what they see as artificial procedures. Technology may not fit with the needs of either process or people. When the people, process and technology do come together, the four photographs will still look the same, but what they show is fundamentally different. They are no longer meetings, but studios. Studios are environments designed around process. They include technology suites: integrated sets of tools focused on enhancing the process and the people contribution to decision-making. The studios and suites, plus the facilitators and experts brought in to help people get the most value from both of them, constitute, as illustrated in Figure A, the key elements for enhancing the quality of decision-making. We use this integrative framework in this book to define effective strategies for fusing people, process and technology through services, studios and suites. We term this Decision Enhancement (DE). The practices of DE extend over thirty years of experience in building decision support systems – interactive technology tools for personal and group use by managers – in applying change management and facilitation methods that help teams collaborate for the purpose of innovation, and in deploying analytic models and methods developed in the fields of computer science, economics, and management science. Decision Enhancement expands on this base to leverage decision processes for a new generation of decision situations and demands, a new generation of managers and a new generation of technology. It creates the space that we term Decision Enhancement Services (DES), composed of collaboration “studios” - facilitative environments - and “suites” - integrated sets of technology tools.
Decisions that Matter

Decision Enhancement is a management lens, a way to look out at the dynamic and volatile landscape of complex private and public sector decision-making and, increasingly, its interdependencies and necessary collaborations. What that lens brings into focus is the opportunity to significantly enhance executive decision-making through professional practices that fuse human skills and technology, and to do so in areas where the combination of people, process and technology has in general been of limited impact. We call these “decisions that matter.” The goal of Decision Enhancement is to help companies improve their processes for handling decisions that matter. DE enables a new style of
decision process that has one and only one purpose: to make a real impact for stakeholders in handling the decisions that really matter in their sphere of responsibility. DE makes this impact by bringing together the best of executive judgment and experience with the best of computer modeling, information management and analytic methods. This facilitates the scenario-building and evaluation, collaboration, and appropriate use of interactive technology-based tools – the new generation of integrated sets of visualization-based simulations and models plus the ever developing Web-based capabilities – that literally rehearse the future in complex decision-making under conditions of uncertainty, urgency and risk. The rehearsing is analogous to war gaming in the military – decision gaming in the DE context. Figure B shows the characteristics of decisions that matter and a few examples of DE in action through studios and suites that are described in later chapters of this book.
Figure B: Decisions that Matter: Characteristics and a few examples
These are all decisions that matter. They involve multiple actors and stakeholders. Decision enhancement in all the cases rests on a fusion of people, process and technology. There is one extra ingredient to add to the DE recipe: that fusion rests on helping enhance visual thinking
through suites of visual technology tools. For stakeholders to work together effectively, they must build shared understanding. For DE services, this translates to enabling shared visualizations. DE suites, several of which have been used in the airport decision arena at Schiphol, Frankfurt, John F. Kennedy and other major locations, rely heavily on visualization-based simulation models, multi-media interfaces and multi-level displays of information, including animation, which adds the dynamic time dimension to representations. The studio is the forum for facilitating “what if?” scenario generation, analysis and collaboration. One of the central axioms of DE is that effective collaboration in handling decisions that matter rests on vision: envisioning, insight, and scenarios.
Decision Enhancement as a Field of Practice

Note that so far we have not discussed technology except in passing. Technology does not make a studio; suites are effective if and only if stakeholders in a decision situation actually use them. This is why we describe Decision Enhancement as, first, a lens that looks at the decision landscape to highlight stakeholders and their decisions that matter and then as an invitation to them to enter the studio. This is not necessarily a physical room; a studio is often implemented as a Web portal or via videoconferencing. The key point to be made here is that the studio is a shared space. Decision Enhancement as a field of practice needs two more sets of invitations to make it useful and usable. The first invitation is to experts in domains relevant to making DE studios effective and the second to the suite designers and providers of technology capabilities needed to provide the tools within the studio. Decision Enhancement is not in any way new in its components: the innovation is to fit them together and also to add an extra capability or dimension to individual elements. Those elements are domains of skill, agendas and methods. So, for instance, one of our continuing invitations in the public-private infrastructure decision arena has been to colleagues who bring domain expertise in such areas as airport planning, organizational leadership, multi-criteria decision-making methods, simulation modeling, and the arts. Similarly, experts in many disparate areas have contributed directly to Decision Enhancement initiatives, to mutual benefit in terms of innovation and impact. For example, there are many powerful and proven simulation tools for airport planning. They provide a wide range of capabilities. However, their impact on the decision process has been constrained by their original design, which was not for integrated, real-time modeling.
This means that they are used by planners and analysts but not directly by most stakeholders. Add to these capabilities the skills of DE suite designers, multi-media interface experts (many of whom work in the visual arts) and skilled DE facilitators, and the simulation tools become even more powerful. There are many such areas of “invitation” that are rapidly extending DE practice. They include ones related to organizational decision-making: facilitation, change management and leadership. In the technology area, domain expertise that is helping shape a new generation of interactive suites includes multi-media interfaces, visualization tools, architectures for distributed simulations, Web Services, and data integration. Examples of specialist expertise in decision domains are gaming as a form of rehearsing the future, infrastructure policy and planning, business process design and management, and transportation. All of this innovation adds up to a new field of professional practice, as much organizational as technical. That practice is embodied in DE studios. Its audience is decision stakeholders and its target their decisions that matter. Its vehicles are the tools and processes that make its studios effective and the technology suites that leverage the studios’ value and impact. The diagram below summarizes Decision Enhancement as a field of practice. It starts from the lens that focuses on stakeholders in decision arenas and their decisions that matter. It continues with its invitations to bring stakeholders, domain experts and suite designers into its studios.
Figure C: Decision Enhancement – A Field of Practice
Decision Enhancement begins and ends with stakeholders. Its field of practice is a new style of decision process whose sole goal is to make a
real and substantial impact for them. This then is Decision Enhancement for a new generation of just about everything – decision arenas, challenges, change and uncertainty, technology and many additional factors and forces. Our book is an invitation to adopt DE as part of your own and your organization’s practice, whether you design or use Decision Enhancement studios.
PART I: DECISION AGILITY

Chapter 1
Decision Process Agility

Our aim in Decision Enhancement Services is to make a contribution to increasing decision process agility in private and public sector organizations. Decision process agility is the combination of speed, flexibility, coordination, collaboration and innovation. Such agility has been a hallmark of other process innovations of the past decade that truly merit the term “transformational”. Examples are supply chain management, the best of business process management, electronic commerce, agile manufacturing, and business partnerships created through global co-sourcing of research, design and manufacturing. The leaders in each of these areas of innovation report cuts in cycle times and time to market of fifty percent1 and more, comparable reductions in errors and rework, overhead costs, inventory and working capital, and improvements in quality, capital efficiency and organizational productivity. Agility has reset the standards of performance in logistics across the organizational landscape. Organizations need to reset their standards and modes of decision-making and target the same degree of improvement in decision process agility, for the obvious reason that the effectiveness of decision-making is more closely related to the effectiveness of the organization than just about any other single factor. Decisions are the choices that shape an organization’s future. The more important those choices are, the more critical it is to use any and every tool and method to improve the odds of a successful decision outcome. Speed of process is increasingly essential in order to respond to the pace of never-ending change. Equally vital is flexibility to respond to the growing volatility and uncertainties of the competitive, political, social and economic environment. The command-and-control era where the “boss” or a small group made the decision is obsolescent, if not obsolete, so decision processes increasingly require coordination across functions, geography, stakeholders, and partners. Collaboration is a key to handling
1 For figures on the proven performance gains from supply chain management excellence, see Earle, N. and Keen, P.G.W. “From .Com to .Profit: Inventing Business Models that Deliver Value and Profit”, Jossey-Bass, 2001, Chapter 7, Perfect Your Logistics.
complexity; no one actor has all the information or skills to make effective choices. And there is no longer a status quo that allows next year’s decision to be a minor variation on this year’s; the issue is not whether to innovate but how. This is the context for effective organizational decision-making: decision process agility as a competitive edge and even a necessity for survival.
Latent Decision Agility

Information technology has improved decision process agility in many areas. Supply chain management and business process management2 have added agility to relatively routine and recurrent decisions, such as procurement and production scheduling, where the clarity of business rules makes it practical to automate much of the procedure. Processing mortgage applications – which a few years ago took weeks and involved many functional units and specialists – is now a fifteen-minute authorization process, with technology enabling speed and coordination.3 Wherever a decision can be turned into a standard procedure, IT will routinize it. Then it becomes faster – one element of agility – though at the risk of inflexibility. But that routinization bypasses many decisions that rest on organizational and human factors, which are inevitable whenever they involve values, judgment and negotiation. Agility here must go beyond speed and coordination of routines. Flexibility, collaboration and innovation are the next level of requirement. Decisions that matter to the organization’s future are almost invariably of this nature. It is this type of decision situation that is the focus of this book. Our goal in Decision Enhancement Services is to show how information technology can be more effectively deployed than it is at present, in order to help achieve decision process agility: speed and flexibility and coordination and collaboration and innovation. The potential payoff is obvious, but the challenge is hard. Agility is not at all the hallmark of decision-making in most organizations and there is no reason to disagree with one expert’s statement that “decisions fail half the time.”4 Whether in the Boardroom, Halls of Congress, task force meeting, university, branch or head office, planning department, or government
2 BPM is a recent IT portmanteau term for the modeling of business processes in order to streamline them and automate as many of their procedures as is practical. See Fingar, P. and Smith, H. “Business Process Management: The Third Wave”, Meghan-Kiffer Press, 2003.
3 Keen, P.G.W. and McDonald, M. “The eProcess Edge: Creating Customer Value & Business Wealth in the Internet Era”, McGraw-Hill, 2000.
4 Nutt, P.C. “Why Decisions Fail”, Berrett-Koehler Publishers, 2002.
agency, decision-making is rarely as agile as those involved or affected want it to be, or as it could be if it matched the process transformations of, say, supply chain management. Any of our management readers can validate this assertion just by reflecting on key decisions that they themselves have been involved in or affected by over, say, the past six months. Even in the most efficient firms, they are likely to agree that the terms on the left-hand side of the diagram below come to mind far more quickly than those on the right-hand one. They can then consider what it might be worth to improve decision processes and move along the agility spectrum on any or, preferably, all the components that it displays.
Figure 1.1: Moving decision-making units along the decision agility spectrum – from current states to opportunity states
The above diagram is illustrative only. It shows where a decision-making unit might stand today – the left-hand side of the diagram – and a reasonable target for improvement – the degree of progress in moving towards the right-hand side. We do not, in any way, claim that all organizations are stuck on the extreme left or that the tools and methods that we present in our book are some magic rocket that will immediately shoot them to a new decision-making nirvana. We simply repeat our question: “What is it worth to move your own decision-making units further along the decision agility spectrum?” We can show proven examples of how this movement can be achieved and present recipes for applying the lessons of DE experience. Recipes are our term for proven,
repeatable, adaptive and codified procedures that can be transferred across organizations. Recipes are methods built on practice and experience. Poor processes produce poor decisions. All of us have experienced the impact of, or have been involved in, processes where lack of agility impeded the quality of the decision. Instead of speed, flexibility, coordination, collaboration and innovation, we are dragged back by delays and lack of urgency; rigidity and bureaucracy; multiple procedures and information systems; lack of knowledge and unwillingness to share knowledge; and organizational inertia. Supply chain management used to match this profile of pathologies. We refer to it in this overview chapter of Decision Enhancement Services because it shows just how much has been achieved in eliminating these pathologies in the best-run companies in logistics management and, by implication, how much might be achieved in comparably enhancing decision process capabilities.5 The message is that things do not have to stay the way they are. Transformation is always possible, however difficult it may appear at the outset.
Decision Support Systems: A Foundation to Build on

Decision Enhancement (DE) draws on sound theory and proven practice and builds on the application of principles and tools for implementing Decision Support Systems (DSS). Despite having a thirty-year track record in developing interactive tools for use by managers,6 DSS stalled some years ago in contributing to organizational decisions that matter. It largely adopted a systems view, as have many of the technical and analytic disciplines that it applies. Given these major pitfalls of DSS, DE provides a process, rather than systems, focus. Studios and suites targeted to decisions that matter comprise services to the people who make the decisions, not a technical product. DE goes beyond DSS in this focus on services, not systems – and in three other areas of method and impact.
5 See Boyson, S., Harrington, L.H. and Corsi, T. “In Real Time: Managing the New Supply Chain”, Praeger Publishers, 2004. This summary of research provides detailed figures on both the economic impacts of SCM improvements in the U.S. economy and the leverage they provide to leaders in its deployment, in terms of just about any metric of organizational performance.
6 Keen, P.G.W. and Scott-Morton, M.S. “Decision Support Systems: An Organizational Perspective”, Pearson Addison Wesley, 1978; Alter, S. “Decision Support Systems: Current Practice and Continuing Challenges”, Pearson Addison Wesley, 1979.
- Firstly, DE services enable enhanced organizational practice in that they extend the domain of focus for the application of information technology from the fairly structured operational decisions that are well-suited to spreadsheets and computer models to decisions that matter: multi-stakeholder, complex, value-dominated, uncertain and consequential. Stated more bluntly, DSS got stuck at the middle level of organizational decision-making. Spreadsheets and information management tools were equally stuck in the finance, marketing and production planning functions. Simulation modeling was mainly aimed at technical and engineering domains and not attached to the mainstream business decision process. The focus of DE on services, studios and suites targets all levels of decision-making – especially the Boardroom. If it gets stuck in the same way that DSS did, then many of these IT tools will remain constricted in their impact and value.
- Secondly, DE services enhance the link between people and technology in new ways, particularly through the DE focus on visualization. The corollary to the DE axiom of “if you can’t see it, you won’t get it” is: build tools that help people get it. In practice, that means that decision-making is enhanced substantially by visual thinking supported by visual tools. DSS were largely number-based, because of the limitations of computer processing power and speed, telecommunications availability and capacity, screen displays and software tools. Those constraints increasingly no longer apply. DE services are thus multimedia-focused, with numbers just one of the media, because human thinking and communication are multimedia in their very essence.
- The third enhancement that DE adds to traditional DSS is to the decision process. There is ample evidence in management research and practice that there are systematic patterns of flaws in decision-making that, once recognized, can be just as systematically mitigated through a combination of process enhancements, facilitation, and use of appropriate analytic methods and computer tools. The systems perspective tends to focus on the content of the decision – goals, information, methods and solutions. Increasingly over the past decade, it has become clear that systems rarely in and of themselves create effective processes. By emphasizing the studio (process) as well as the
suite (system-based tools), DE offers practical ways for technical professionals to move their services from useful to usable to used. Today, they far too often remain stuck in proposed designs (usefulness) and pilots and prototypes (evidence of usability), with limited adoption and diffusion (usage). Together, the DE focus on decisions that matter, visual thinking, facilitative studios, and technology suites that draw on software, telecommunications, data resources and analytic methods provides a new base for increasing decision process agility. We do not claim that any Decision Enhancement toolkit will suddenly shoot an organization from slow to speedy, inflexible to adaptive and so on. Progress will be incremental and come from sustained management attention, method and discipline. Technology offers no quick solution to the challenges that decision-makers face. Decision-making is complex, and the arenas in which it takes place are uncertain and pressured by time. It is risky. It involves many psychological, social and political traits, trade-offs and interactions. It often demands courage and is affected by conflicting stakes and lack of information. It is also affected by “eptitude”. Yes, there really is such a word. Ept means “adroit, competent, appropriate, effective”.7 We include this term in our DE vocabulary as a reminder that such a skill base is a spectrum from inept to adept. The ept factor is a constant reality in complex decision-making processes, and one of the priorities for DE is to enhance the skills of the adept and to help them in areas where they are inept in comparison to computers: calculations, statistical analysis, probabilistic reasoning and systematic information search, for instance. But however much information and computer services, databases, analytic models and software contribute to the decision process, it is humans, with all their strengths and flaws, who make the decisions that matter.
DE Technology Enablers

We see the role of information technology in the decision process as a spectrum from automate to support to enhance. Automation emphasizes system and removes people; it treats processes as procedures to be routinized. Support focuses on system plus people, with process largely viewed as the effective use of the system by the people. Enhancement aims at a more complete fusion of the three resources, with a general tendency to place process first, people second and system third.
7 The Oxford English Dictionary.
Decision Enhancement is a path of intense effort. By its very nature, automation is the natural path of least resistance; it is the evolutionary direction for Information Technology (IT), with all the many social costs as well as benefits that this involves. The IT evolution always runs the risk of a downhill drift towards the ethos of automation. IT is at its strongest in structuring and routinizing processes. It seeks out decisions with clear goals and procedures that can be turned into software routines, with the role of human judgment minimized. Examples are consumer loan applications, procurement, and production scheduling. One of the goals of some financial service firms is “straight-through” processing, where people are entirely taken out of the loop and software manages the entire decision process from customer request to completion. Credit card management is already marked by this decision automation. Humans are rarely involved in authorization of a transaction, setting of interest rates, or security alerts. The industry would be impractical without this automation. Decision Enhancement aims at complementing the automation imperative by moving IT in the opposite direction, towards more rather than less inclusion of people in the decision process through technology. DE creates a creative tension between the two dominant approaches to decision-making, the primacy of human judgment and interactions and the ethos of automation. The suite pulls towards the technical, analytic and routinized while the studio pulls towards the social and interpersonal. In general, that middle-ground creative tension is largely lacking in decision-making for decisions that matter. IT has had a substantial historical impact on taking people out of the loop in automating many decisions. It has played a growing role in supporting the activities of decision-makers through e-mail, calendar management, spreadsheets, intranets and teleconferencing. But it has lagged far behind its potential in enhancing decision-making processes for complex decisions that are social or “political” in nature. In addition, very often decision-makers and their advisers have equally lagged behind the technology opportunity. Spreadsheets, for all their power and flexibility, are a very narrow component of a rich repertoire of available technology-based tools and methods that extend the opportunity to enhance human thinking and skills. These new tools, especially those that support visual thinking and sharing of insights and analyses through simulation models, show the dynamic link between human decision-making and technology. The model manages calculations and projections and the human manages the choice and interpretation of scenarios. People and models should move together.
“Should move together” does not mean that they have done so. Technology has both paced and constrained the type and degree of contribution that it makes to decision-making in two important ways. The first is the most obvious. It has paced new levels of information and analytic agility. Computers are fantastic at manipulating numbers and carrying out calculations. Thus, in the 1950s through to the 1970s they made major contributions in large-scale modeling and simulation in such areas as production planning, optimization of complex chemical and oil refineries, and transportation systems. In the 1980s, the spreadsheet brought number-crunching to the masses.8 But numbers are not the main medium of thinking and hence of decision-making. Technology has constrained many key elements of human skills in their application to decision-making. We are all highly visual and conversational in our working together. Television, animation, multimedia, graphical displays, and sketches on blackboards and even dinner napkins indicate the visual texture of our age. The technology excluded most of these until recently. Now, the Internet and increasingly powerful PC hardware have added graphics, rich text displays and video. A new generation of designers is extending these to provide new styles of interactive interfaces, simulations and information representations. Many of these are still in the experimental stage, but they are fast becoming next year’s mainstream. A hint of the new visual power tools is the film The Lord of the Rings. It makes a fictional world of fantasy very real. Decision Enhancement Services points to making the uncertain future that decisions that matter address equally real. It will take time to move the power of The Lord of the Rings to the desktop and enterprise telecommunications network, but that is just a matter of time and money.9 (There is not a spreadsheet in sight in the film, of course.) Much of this new computing power is being used for entertainment; it has created a new games economy. Without pushing the metaphor too far, there is a strong conceptual link between game-playing and rehearsing the future in decision situations. Simulation modeling, “what if” analysis via spreadsheets and other software tools, war gaming in the military and scenario analysis are all well-established approaches to handling uncertainty and exploring alternative options – rehearsing the future.
8 Wagner, H.M. “Principles of Operations Research: With Applications to Managerial Decisions”, Prentice Hall, 1975.
9 For our technology-oriented readers, here are a few figures on that technology power, which dwarfs that of all but a few companies: Lord of the Rings used 3,200 processors, 10 gigabit per second network speed, 60 terabytes of online disk storage and 72 “near” online, 0.5 petabytes – 50,000 DVD capacity – digital tape backup, and 1,600 servers. From our viewpoint, the issue is not “gee whiz” technology but the fact that all this works, is commercially available, and represents the universal move of computing from numbers to visuals.
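To make “rehearsing the future” a little more concrete, here is a minimal sketch of decision gaming in code. It assumes a made-up capacity decision: the demand distribution, margins and the two candidate options are invented purely for illustration and are not drawn from any case in this book. The division of labor is the one described above: the model manages the calculations across many simulated futures, while the person chooses which options to rehearse and interprets the results.

```python
# Minimal "what if" rehearsal sketch (hypothetical figures throughout).
import random

def profit(capacity: int, demand: int,
           unit_margin: float = 40.0, capacity_cost: float = 25.0) -> float:
    served = min(capacity, demand)            # we can only serve what we built for
    return served * unit_margin - capacity * capacity_cost

def rehearse(capacity: int, runs: int = 10_000) -> float:
    """Average outcome of one option across many simulated futures."""
    random.seed(7)                            # same futures for every option compared
    total = 0.0
    for _ in range(runs):
        demand = max(0, int(random.gauss(800, 200)))   # one uncertain future
        total += profit(capacity, demand)
    return total / runs

for option in (700, 1_000):                   # two candidate decisions to rehearse
    print(f"capacity {option}: expected outcome {rehearse(option):,.0f}")
```

A DE suite would put an animated, visual display in front of this kind of loop rather than a printed number, and a studio would put the arguing, judging stakeholders around it; the sketch shows only the computational core of rehearsing the future.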
All these developments in technology enable new decision-making processes via studios and suites. However – and the however is a major caveat about the application or rather the applicability of technology – decision-making still rests on human and social values, interactions, judgments and experience. One of our arguments in Decision Enhancement Services is that unless the powerful – useful – tools of technology and the analytic sciences are embedded in the processes and interactions by which real people do their best to work together to make choices and commitments – usability – they simply will not be used. Decision Enhancement rests on fusing usefulness, usability, and usage. We end this overview of Decision Enhancement by presenting three short DES “vignettes” – mini-case examples that illustrate our three main areas of focus: decisions that matter, studios and suites. The first vignette describes how the newly appointed leader of a government agency took charge of launching a complete transformation of the organization: very much a decision that matters. The second presents a studio that has significantly added to decision agility in an international insurance firm’s handling of risk management. The third shows how a Decision Enhancement suite helped build a new portal for real-time, distributed decision-making in supply chain management in the U.S. Air Force.
Vignette 1 (Decisions That Matter): Leading through Seeing: Transforming a Government Agency

Anna Johnson is the Director of a national government agency. She is a political appointee, brought into the agency, which we will term “DOQ”10, to make waves. DOQ is fairly typical of large government units: full of good employees, low on morale in an era of constant cutbacks and budget pressures, held back on innovation through the many constraints imposed by legislative and administrative rules, and viewed by the press and public as doing a reasonable but not outstanding job. Anna Johnson’s appointment, however, was made as part of a strong move by the national government to improve management accountability, with measures of performance mandated by legislation. In that sense, she is an outsider and not a career government employee. She had been an executive in a major consulting firm that specialized in public sector planning and project management. The Cabinet and National Administration are demanding a new style of operational management in agencies. The Office of Budget and Planning has been given the authority to withhold funds from agencies that do not get to “green” on a scorecard of performance measures (red means unacceptable). DOQ’s first ratings prior to Johnson’s appointment were more red than green, with most of them in the neutral “white” category; it was very much on probation, with substantial improvement required and outsourcing of many functions to the private sector a likely scenario if that progress did not occur. DOQ already lacks funds. Of the two billion Euros – roughly USD 2.3 billion at that time – in its 2003 budget, only 80 million is discretionary; the rest is pre-allocated based on legislation, pension and other entitlement commitments, and on funds being tied up by ongoing projects. On one hand, it cannot afford to innovate, for financial reasons. On the other hand, it cannot afford not to innovate, given the pressures for performance. DOQ had been trying to move fast and aggressively. When Anna Johnson arrived at DOQ, there were over 350 major change initiatives underway, including ones to improve procurement, provide citizen service, outsource many facilities and
10 DOQ is obviously not its real name; we use pseudonyms in many of our examples to protect confidentiality and permit us a candid discussion.
processes and make “green” in the e-government areas of focus that are part of the national blueprint. Johnson was overwhelmed with information on all these initiatives. There were consulting firm reports and studies that added up to thousands of pages, many detailed forecasting models and statistical analyses, plus data-gathering operations and information access systems in just about every unit of the agency. Initially, Johnson assumed that the necessary first step in positioning to make decisions on prioritizing an accelerated transformational change program would be to sift through and synthesize these. She ordered reports on the documents and a series of presentations summarizing the business cases and status of each project. This process was a mutual frustration for her, her key staff and the organization. She put it on hold after a month and shifted to what she termed “roadmapping” the agency’s future plans. This strategy for taking charge of change rested on visualization, in order to simplify analysis and prioritization by providing a simple overall view of the projects with the ability to “drill down” into more detail where needed. This approach is increasingly being adopted by organizations in their monitoring and planning; a common term is “dashboard”: a set of visual gauges for calibration and navigation. The dashboard is connected to suites of information and transaction processing systems that “feed” the displays on a periodic timetable. Anna Johnson wanted more than a dashboard. She needed to map projects, responsibilities and, most of all, commitments. Her comment was that “A dashboard just tells me where we are. I need to know where we are going and how to get there. And I have to know who the people are who are going to get us there.” She commissioned a “gameboard” from a small consulting firm that specializes in helping government executives take the lead in handling radical change agendas. The gameboard is a very simple map that lays out the phases of the transformation process at an overview level, which can then be broken down into layers of detail. It is decision-centered; steps and activities are defined in terms of the management review agenda and timing and of the charters, commitments and rollouts required to move ahead. The gameboard provides the “working view” of the decision process. It is a rolling view in that the road map it generates can be scrolled backward and forward on the screen – left and right – rather than using the more common PC display of scrolling up and down. This may seem a small point but it is part of visual thinking. We do not naturally view time as moving up and down but along a horizontal axis; it is a flow.
The gameboard adds vertical information in the form of simple lists of actions and outcomes. This gameboarding process was highly disciplined and was aimed at turning complexity into appropriate simplicity – a literal road map. All the many interviews, documents, spreadsheets and proposals gathered and analyzed by the team of twenty staff that Anna assigned to the DOQ Transformation Agenda were converted to simple diagrams that captured the activities, decision agendas, timetables and outcomes. The team meetings were often contentious and difficult for the participants. The lead facilitator, who has a long track record in the Group Decision Support Systems field, pushed participants hard to get away from lists, explanations, and reports, the lifeblood of government and the many consulting firms it uses. He insisted that they show where the item they were discussing fit on the gameboard. “What’s the task? What tasks feed into it – draw me the links. Does this need team reviews? Executive review? Recommended follow-on? Who owns this? Who takes on the responsibility?” The gameboard became almost a living entity. It was skillfully designed not to look high tech, but as if it were hand-drawn by a skilled illustrator, even though it was generated by a PC and displayed and updated on a large screen; this made it more invitational, natural and easy to relate to. A cardboard copy about four feet long and three feet high was handed out at the start of each meeting and updated at the end of it. The suite that generated the gameboard display used a simple visual interface for making updates and for accessing a range of resources such as databases (budgets, timetables, etc.), stored documents and reports, forecasts and spreadsheet capabilities. Together, the suite and the facilitative studio provided the process base, and the printed gameboard “panel” was the shared input to and output from that process. When Anna Johnson took on her executive position at DOQ, she had a fairly clear idea of what she needed to achieve. Her most immediate priority was to focus the agency on a small number of initiatives that would move it to a new level of management efficiency and greatly improve its service to citizens and to national, regional and local government. Her main worry was that any task force would end up becoming a decision-avoiding mechanism, with so many stakeholders, so much information, and so many budget and planning issues and documents. The first month of reviews confirmed her concerns. The process was bogged down, not moved forward, by all the data and analyses. None of her key executives and staff had a clear idea of where things stood or what to do. One of them summarized the situation as “we have all the bricks, windows and doors but no plan of the house.” Anna decided to halt the entire process and regroup. She convened a team
of twenty staff, representative of the agency middle management culture and their units. Their first role was to interview senior DOQ executives and other key stakeholders to get a clear sense of their priorities and also to identify successful initiatives that could provide a springboard to build momentum for much more aggressive change in the agency. This helped cull the 350 initiatives down to just 19. The gameboard panel display played a valuable role in communicating within the agency, with executives in other government units and with the Office of Budget and Planning; it was tactile, attractive, portable, simple and communicative. Anna Johnson often propped it up at meetings and passed it around. It became a vehicle not just for shared understanding but for shared involvement. It was very invitational and far more appealing than slide presentations, because it simply captured the time flow and key steps of the process in a dynamic form; slides quickly lose the context of the process. The same is the case for many of the project management and calendar scheduling tools that the gameboard replaced. Figure 1.2 shows a typical example of the displays such tools provide. These can be of great value in project management but not in helping a team shape decisions and stakeholders assess them.
Figure 1.2: Typical project management output displays
Using the gameboard panel did not require high-tech tools and setup. That helped in meetings with staff, citizens and journalists. The studio did involve high tech, of course, in order to generate the panel from the many discussions, reviews and recommendations. A deliberate choice was made not to set up the suite for use in outside meetings and presentations, even though that was fairly simple to do and required just a PC and projector. Anna and her advisers felt that the panel was the key to effective communication, not the technology that generated it. The gameboard is, at one level, trivial. It is a simple road map of the decision stages and outcomes in Anna Johnson’s transformation program. It is supported by plenty of technology, but many of its displays could have been produced without it. The gameboard is the visualization of the decision process: planning, phasing, design and implementation. It is a navigation, communication and coordination aid.
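As a rough sketch of what sits behind such a panel, the gameboard’s “working view” can be thought of as phases along a horizontal time axis, each holding tasks that carry an owner, a review flag, a commitment and a drill-down layer of detail. The structure and the sample entries below are our own illustration assembled from the vignette; they are not the consulting firm’s actual suite.

```python
# Hypothetical data model for a gameboard "working view" (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    owner: str                    # "Who owns this? Who takes on the responsibility?"
    needs_executive_review: bool = False
    commitment: str = ""          # once it is on the board, it is a commitment
    detail: list = field(default_factory=list)   # drill-down behind the overview

@dataclass
class Phase:
    title: str
    starts: str                   # horizontal time axis, scrolled left and right
    tasks: list = field(default_factory=list)

gameboard = [
    Phase("Set the compass", "April",
          [Task("Interview senior executives and stakeholders",
                owner="transformation team",
                commitment="cull the 350 initiatives to a shortlist")]),
    Phase("Frame the direction", "June",
          [Task("Select and charter the lead initiatives",
                owner="Director",
                needs_executive_review=True,
                commitment="first initiatives in implementation by October")]),
]

for phase in gameboard:           # the overview level; each task can be expanded
    print(phase.starts, "-", phase.title, [task.name for task in phase.tasks])
```

The point the vignette stresses survives even in this toy form: every entry carries an owner and a commitment, so the board maps a decision process rather than merely listing tasks.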
Figure 1.3: The Group Decision Room
The same gameboarding process was applied in the planning and implementation decision processes: making commitments, assigning personnel and phasing programs and activities. Anna saw this as a navigation process: “A new leader has to first figure out where we are going, then show the others and then keep everyone in the picture and on track.” She called this “setting the compass” and “framing the direction.” The studio was not housed at the agency but at the offices of the consulting firm. It was the now standard group decision room, with wireless PCs linked to a Web portal, a facilitator and “domain” experts.
The goal for each meeting was not to come to consensus but to commitment. Audio and video links connect participants in distributed locations to the central meeting room and, via the web portal, to the DES suite tools. Figure 1.3 shows the room. There are hundreds of thousands of such rooms in use today in businesses, government organizations and universities. It is how they are used when they are filled with people that makes them studios. For Anna Johnson, the studio was a workplace for intense collaboration, facilitated by an expert adviser and supported by a technical team that handled the suite of tools behind the gameboard. Anna stresses how vital it was for her to commit personal time and involvement to the process. She met daily with the teams; the gameboarding structure and outputs helped keep focus and momentum. The teams worked weekly in the studio, with her present for most of the sessions. She assigned many follow-on tasks, and again the studio provided the organizing structure and vehicle for the process. “It is a constant process for a leader of listening – constant.” She feels the results were well worthwhile, as did her superiors. She formed the task force in April; by October, five of the selected nineteen initiatives were already being implemented. The overall strategic plan and commitments were in place a month later. Members of the team told us that this would have taken years using the standard task force approaches, if it ever got done at all. The gameboarding approach that the consulting firm introduced has evolved over several years and now forms a recipe – a proven and repeatable process – for helping public sector leaders take charge of comparable transformation initiatives. The head of the firm stresses the importance of focus. “It’s not about playing with decisions at the concept level – these guys all love to get stuck in the details and then they think that they are deciding things, but really all they are doing is making “to do” lists for other people. You have to help them make a navigation plan and most of all you must get them to commit. Once it’s on the gameboard, it’s a commitment and they all see that.” To see it is to get it! In terms of decision process agility, the main lessons from this vignette are how much agility a simple and focused process can generate when the studio and suite are targeted to building momentum and collaboration. In addition, it is worth noting just how important visualization was in achieving this. Gameboards, dashboards, decision cockpits and navigation maps are a rapidly growing aid to collaboration. They helped give shape and focus to collaboration and coordination in ways that the mass of other information resources, including reports,
analyses and models did not. It was not the case that these were not useful; indeed, they were often accessed by tapping on the options menu on the PC screen display of the gameboard. They needed, though, to be put into context – and context is almost invariably visual in nature. That is what maps are for, and we all need and use them every day.
Vignette 2 (Studio): Enabling New Decision-making through a DES Studio

The 2 x 2 matrix shown below has helped shape the development of tools for collaboration. Developed by a pioneer in the use of technology to enable collaboration,11 it shows four modes of communication – same place/different place and same time/different time. Without information technology, there is no possibility of supporting the different place column of the matrix.
Same time / same place: conversation, meetings, flip-board, slide projector
Same time / different place: telephone, video conferencing, document sharing, chatting
Different time / same place: shared room, billboard, reading corner
Different time / different place: letter, fax, e-mail, voice mail, computer conferencing, shared database
Figure 1.4: Classification of media based on time and place constraints12
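One way to make the classification operational is a small lookup from a meeting’s time and place constraints to the media that can support it. The sketch below simply restates the four cells of Figure 1.4 as data; the function name and dictionary are our own illustration, not part of any described suite.

```python
# Figure 1.4 restated as a lookup table (illustrative sketch).
MEDIA = {
    (True,  True):  ["conversation", "meetings", "flip-board", "slide projector"],
    (True,  False): ["telephone", "video conferencing", "document sharing", "chatting"],
    (False, True):  ["shared room", "billboard", "reading corner"],
    (False, False): ["letter", "fax", "e-mail", "voice mail",
                     "computer conferencing", "shared database"],
}

def supporting_media(same_time: bool, same_place: bool) -> list:
    """Candidate media for a meeting with the given time and place constraints."""
    return MEDIA[(same_time, same_place)]

# The studio in this vignette moves risk management work from same place/same time
# to different place/same time:
print(supporting_media(same_time=True, same_place=False))
```

As the text notes, everything in the different place column depends on information technology.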
Our second vignette describes a DE studio that shifted a process from same place/same time to different place/same time meetings. In doing so, it helped improve process agility by transforming coordination and enabling a new degree of collaboration. Risk management, a key responsibility for both legal and business reasons in more and more companies, is typically handled on a same place/same time basis. It emphasizes identifying, assessing, controlling, mitigating and monitoring risks in projects that involve new products, capital investment, meeting regulatory requirements and many other areas. Given the huge business failures of recent years – including Enron, WorldCom and Andersen in the United States, and Ahold, Lloyd’s of London and Barings in Europe – risk management as a formal discipline has become a decision that matters, not the administrative nuisance it is often treated as. Legislation has made it a formal requirement, with
11 Johansen, R. “Teleconferencing and Beyond: Communications in the Office of the Future”, McGraw-Hill Data Communications Book Series, 1984.
12 Van Laere, J. “Coordinating Distributed Work: Exploring Situated Coordination with Gaming-simulation”, Delft University of Technology, 2003.
substantial penalties for failure to provide adequate oversight. Decisions concerning risk management really do matter. But they are rarely handled as if they matter. Rather than being a decision-making process, risk management (RM) is often a decision-avoiding activity. In the insurance company for which this studio was developed, it creates substantial burdens on productivity. Those responsible for oversight of risk management meet several times a year; since this is a global firm, this obviously means substantial travel, coordination and cost. The parties involved see it as “something that has to be done: a necessary evil.”13 They get no benefit from having to handle the organizational regulations and international law; these get in the way of their own work. They have to carry out desk research, document activities and then conduct interviews, fly to meetings and spend days on brainstorming sessions. This is all time-consuming. But it simply has to be done. The explanation for the “necessary evil” statements turned out to be that the teams are “very occupied with real work.” “We don’t care about risks occurring at other units; that’s their business.” The interviewees felt that only other teams would benefit from their efforts in identifying, assessing and mitigating risk. They themselves already “know” what the risks are in, say, a software development project or insurance portfolio management. They also feel a lack of confidence that their own contributions will carry any weight among a large group of people they do not know and who meet at some remote location to go through a risk management agenda. Distributed teams carry out their activities in many places and at many organizational levels. The case study showed that teams working at the highly distributed operational level in field locations think differently about risk management than those at the strategic level, which is more head-office centered. A lot of time was wasted in the central meetings on discussions of what the relevant risks are that should be on the meeting agenda; people tend to talk around this issue, frustrating many participants who respond “but that’s not our own concern in our country/unit/location.” Efforts to apply standard group support systems (GSS), using decision rooms like the one shown in Figure 1.3 in our first vignette, had had mixed results. GSS generally aims at structuring the sequence of the meetings and agenda. The participants typically start by throwing out
13 De Vreede, G., Koneri, P.G., Dean, D.L., Fruhling, A.L., and Wolcott, P., “A Collaborative Software Code Inspection: The Design and Evaluation of a Repeatable Collaboration Process in the Field”. International Journal of Cooperative Information Systems, 15(2): 205-228, 2006.
The participants typically start by throwing out ideas, which are then sifted and organized. The system keeps track of all the conversations and displays the information as part of the interactive participatory process. Comments are invited and consensus built. The whole process culminates in evaluative voting. Figure 1.5 shows an example of a GSS display that summarizes a team’s evaluation of potential capabilities to be included in a software product.
Figure 1.5: Example of a GSS display
This display summarizes votes. The system also shows patterns of disagreement that help the group resolve uncertainty. In the risk management instance, anonymous voting, one of the most frequently employed strategies in GSS practice, had the disadvantage that teams often wanted to know exactly who was saying what, in order to estimate its relevance and value. They also wanted to make sure everyone was contributing. Yet without anonymity, they felt that hierarchy strongly influenced the process, with a superior’s opinion driving the discussion and conclusions. This undermines the very essence of risk management, which should be a dispassionate effort to identify all risk elements. There were many other problems and areas of dissatisfaction. The experts identifying risks were often not the same people monitoring them. Because of the heavy reliance on local self-assessment and judgment, there was no clear process for integrating these assessments, which made it hard to prepare for meetings and workshops, and no clear
link to action and follow-on initiatives. The process was more one of review than decision-making, even though decision-making was its entire intended purpose. Communication before and after the meetings was fragmented. This situation is typical; there is often an almost ritualistic flavor to much of risk management. A survey of 400 financial institutions found that over 80 percent of the companies relied on individual self-assessment of risks, with only around half of them having some type of process for teams to come together to review the assessments. The entire risk management process loses focus because there are so many factors, so many subgoals at the operational level, and so many gaps – and people – between the processes that establish the risk factors and the monitoring and controls to handle them. This, then, is the problem situation: reactive, lacking clarity and conflicted. The solution is to turn it into a far more proactive decision process via any-time, any-place collaboration. The authors of the article describing the solution report on their work in developing a Decision Enhancement studio that focused on improving the productivity of distributed teams involved in Risk Management (RM) in the insurance company. The goal was to transform the decision process and also institutionalize it, with specific payoff targets being to help prevent business failures, rework and overkill in protecting against risk. The target of the new decision process was to:
• Identify all risks at an early stage and control them before any major business disaster can escalate;
• Involve stakeholders at all levels in an iterative way, in order to create a shared understanding;
• Provide the mechanism for facilitating open exchange of information and ideas, risks and their effects;
• Help drive accountability for risk management;
• Increase the likelihood of project success;
• Add value to the business.
The core of the studio design is the “script”: a two-level structuring of the overall process (top level) and “thinkLets” within the process to facilitate and guide RM collaborative activities. ThinkLets are recipes for
a repeatable, predictable, and transferable collaboration process.14 The script is a flexible but systematic sequencing of the thinkLets. Its purpose is to provide clarity of process, limit the cognitive load of information-gathering, discussion, evaluation and interpretation, and create patterns of collaboration in a predictable, repeatable way. Figure 1.6 shows part of the proposed script as a flowchart (facilitation process model) and lists the thinkLets each step involves. For example, if the facilitator in consultation with the group feels that there is a need to identify more risks than have so far been included in the discussion (top left corner of the figure), the DirectedBrainstorm thinkLet is used to generate divergent thinking, followed by FastFocus in order to narrow down the new options generated, and then PopcornSort (organize the options) and BucketWalk (evaluate them).
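To make the two-level structure of script and thinkLets more concrete, the short sketch below shows one way such a script could be represented in software. The thinkLet names are taken from the example above; the data structure, field names and the simple runner are our own illustrative assumptions, not the design actually used in the insurance company's studio or in the GSS packages it relied on.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ThinkLet:
    """One lower-level step of the script (a reusable facilitation recipe)."""
    name: str                 # e.g. "DirectedBrainstorm" (name taken from the text)
    pattern: str              # diverge / converge / organize / evaluate
    facilitator_prompt: str   # what the facilitator asks the group to do

# Top level: one activity of the overall RM process (here, Risk Identification).
RISK_IDENTIFICATION: List[ThinkLet] = [
    ThinkLet("DirectedBrainstorm", "diverge",  "Generate additional candidate risks."),
    ThinkLet("FastFocus",          "converge", "Narrow the new options to a shortlist."),
    ThinkLet("PopcornSort",        "organize", "Sort the shortlisted risks into categories."),
    ThinkLet("BucketWalk",         "evaluate", "Walk each category and evaluate the risks."),
]

def run_step(step: List[ThinkLet], need_more_risks: Callable[[], bool]) -> None:
    """Run one step of the script, repeating it while the facilitator,
    in consultation with the group, decides that more risks are needed."""
    while True:
        for thinklet in step:
            print(f"[{thinklet.pattern:>8}] {thinklet.name}: {thinklet.facilitator_prompt}")
        if not need_more_risks():
            break

# Example: run the identification step once (the group is satisfied after one pass).
run_step(RISK_IDENTIFICATION, need_more_risks=lambda: False)
```

Writing the script down in this explicit, data-like form is exactly the point made in the text: the sequencing becomes repeatable and transferable, while the facilitator retains the flexibility to branch or repeat steps as the group requires.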
Figure 1.6: The risk management script15
14 Briggs, R.O., De Vreede, G.J., and Nunamaker, J.F. Jr., “Collaboration Engineering with ThinkLets to Pursue Sustained Success with Group Support Systems”. Journal of Management Information Systems, Vol. 19, No. 4, Spring 2003, pp. 31-64.
15 Source: Adapted from Briggs et al., 2003: 52.
The studio was applied in eight locations as a pilot project. It began with two half-day face-to-face workshops facilitated by one of the authors of the article that we quote from. The full RM process was carried out, and modifications and extensions made in collaboration with a group of RM experts. The process has three parts: Risk Identification, Risk Assessment, and Risk Mitigation. As Figure 1.6 indicates, each activity uses certain thinkLets, whose names are listed – Crowbar, Expert Choice, etc. The thinkLet defines the required tools, their configuration for this specific context, and the facilitation required to carry them out. The studio is being implemented by twelve international trainers who will train others; this first team was trained by the authors of the article. The studio uses proven off-the-shelf software suites developed for GSS facilitation. There can be no question but that the effectiveness of the suite depends on the studio, which depends on the skills of the facilitators. A major goal in developing scripts and thinkLets is to reduce the reliance on individual facilitators and provide a recipe for designing and running the studio events. The main challenge now is to institutionalize the studios. Even though the contribution to decision agility is substantial and measurable, there is as yet no organizational “home” for the program. The studio is evolving. The thinkLet recipe is a distinctive innovation that systematizes the sequencing of the process and collaborative steps while retaining flexibility and adaptivity. The results of the early applications show a substantial reduction in the negative comments and attitudes – and some real decisions.
Vignette 3 (Suite): Simulation Modeling
Our third vignette is the most technical and shows how a powerful suite enables a powerful studio that in turn enables decision enhancement in a complex distributed arena. In our emphasis on people making decisions and technology enabling them, we do not in any way want to discount the value of the technical element. The simulation capabilities that we review in this vignette make possible entirely new decision process capabilities. The vignette also captures the building block approach to simulation development and the integration of models with the databases, spreadsheets and transaction processing systems that DE suites draw on. Teams of professors and students at Delft University of Technology and the University of Maryland have been partners and colleagues in a number of projects. One of these has been to help create a supply chain management (SCM) portal for the U.S. Department of Defense (DoD). The goal was to build an SCM simulation that addressed the technical challenges of architecture, blueprints and configurations that integrate a model into the decision processes of a distributed organization.16 There are plenty of excellent supply chain management software systems and models available on the market, but turning them into Decision Enhancement tools poses many problems:
• The supply chain process – in this case for real-time management of parts for the F-101 fighter plane – is highly distributed and involves many actors with varying responsibilities and authority, and with data also widely distributed across the military. Consider the decision challenges of handling repairs to an F-101 operating in Afghanistan out of the air force base on Diego Garcia, an island in the middle of the Indian Ocean. The parts may have to be obtained from distribution centers in the U.S. or Europe – and they are needed now.
16 Van der Heijden, M.C., Saanen, Y.A., Van Harten, A., Valentin, E.C., Ebben, M.J.R., and Verbraeck, A., “Safeguarding Schiphol Airport’s accessibility for freight transport: The design of a fully automated underground transport system with an extensive use of simulation”. Interfaces, Vol. 32, No. 4, July-August 2002, pp. 1-19.
• Security is a primary requirement for the simulation, access to the portal and all data. The more links and the more parties involved, the more difficult it is to balance access and control, communication and security.
• The military is a paper-centered organization that needs and plans to move to one that operates in real time and thus with real-time information. The paper covers many thousands of topics: parts, configurations, diagnosis, maintenance, inventories, shipping, etc.
• Any DE tool should provide the appropriate level of support and information for many types of decision-maker, from the locally-based mechanic to the central SCM planner.
Figure 1.7: The SCM simulation architecture
Whereas our first two vignettes center on decision situations where the variety and numbers of parties, data, systems and locations were limited, the F-101 SCM arena is about as complex as any can be along any dimension and it adds a new level of demand for agility: real-time response. This is not a situation where there is time to bring people together for meetings or debate about alternatives. An F-101 is sitting on an airfield in Diego Garcia or Turkey and it needs a replacement part in short supply that must be flown in from either Frankfurt or Honolulu.
Coordinating the many elements of information and decision in real time demands a technical architecture for integrating all the technology components. Figure 1.7 shows the overall simulation modeling environment and linkages. It illustrates one of the key needs in DES simulation suites, as distinct from computer science simulation models: linkages to other systems and services. The ERP systems that the portal draws data from are massive and very complex. Field data collection systems, instant multimedia conferencing, and collaborative planning and forecasting tools are also complex and massive. Real-time messaging among all these distributed components of the DES portal capability is in itself complex. The simulation model was part of the overall initiative that led to the decision to fund the supply chain portal. The Department of Defense had contracted for a pilot project that would demonstrate the characteristics and effectiveness of a portal-based architecture for managing defense supply chains. A major objective was “to illustrate the future to supply chain planners across DoD, to highlight the possibilities of the Internet and portal technologies, and to create a prototype of an infrastructure for real-time supply chains.”17 In other words, to rehearse a future. The simulation has been extended and used in gaming, which is a natural part of helping teams explore and learn in a decision domain. Players can vary decision parameters and study the outcomes of the model, both during the run, as it dynamically updates the supply chain indicators, and after the run. The indicators include stock levels, production times, cycle times and costs. As actors, they literally see the flow of the supply chain rather than get reports on scenarios and outcomes. This use of simulation as a DE Inquiry studio is powerful in building shared understanding and collaboration. It prototypes the decision rather than the model.
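To give a flavor of what “varying decision parameters and studying the outcomes” can mean in a gaming setting, here is a deliberately small sketch of such a run. It is emphatically not the DoD portal's model: every number, parameter name and rule below is an invented assumption, chosen only to show how a player might rehearse two stocking policies and compare the resulting indicators.

```python
import random

def simulate(reorder_point: int, order_qty: int, days: int = 90, seed: int = 42) -> dict:
    """A toy single-echelon parts supply simulation with invented parameters."""
    random.seed(seed)
    stock, on_order, lead_time = 20, 0, 5       # starting stock, outstanding order, lead time (days)
    arrival_day = None
    backorders = 0
    holding_cost = 0.0
    for day in range(days):
        if arrival_day == day:                  # a replenishment order arrives
            stock += on_order
            on_order, arrival_day = 0, None
        demand = random.randint(0, 3)           # parts demanded by field units today
        shipped = min(stock, demand)
        stock -= shipped
        backorders += demand - shipped          # unmet demand: an aircraft waits
        if stock <= reorder_point and on_order == 0:
            on_order, arrival_day = order_qty, day + lead_time
        holding_cost += stock * 1.0             # assumed cost per part per day
    return {"ending_stock": stock, "backorders": backorders, "holding_cost": holding_cost}

# "What if?" gaming: rehearse two stocking policies and compare the indicators.
for reorder_point, order_qty in [(5, 10), (15, 30)]:
    result = simulate(reorder_point, order_qty)
    print(f"reorder at {reorder_point}, order {order_qty} -> {result}")
```

In the real portal, of course, the indicators are updated dynamically during the run and presented visually; the value of the exercise is that the players see, argue about and re-run the consequences of their choices together rather than read a report about them.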
Conclusion We end this first chapter of Decision Enhancement Services by showing again the diagram of the decision agility spectrum that we introduced in its opening pages.
17 Verbraeck, A., Corsi, T. and Boyson, S. Business Improvement through Portaling the End-to-End e-Supply Chain, Transportation Research Part E: Logistics and Transportation Review, Volume 39, Number 2, March 2003, pp. 175-192(18).
Figure 1.8: The decision agility spectrum (repeated from Figure 1.1)
The DE studios made a worthwhile contribution to each of the agility factors. That is most marked in the DOQ example (Vignette 1), where the current state was rapidly shifted to an opportunity state that existing decision processes had not facilitated. In the Risk Management vignette (Vignette 2), the studio significantly moved the organization from fragmentation and conflict to coordination and collaboration. The main contribution of the Supply Chain Management simulation (Vignette 3) was to help create a speedy base for the decision on the portal, and its use in gaming is helping to build collaboration. Those are worthwhile achievements. They are illustrative of current DE practice.
Chapter 2
People Make Decisions; Technology Enables Decision-making
People make decisions. Technology can enable them to do so more efficiently. Whereas much of the focus of the application of information technology is to take people out of the loop – very efficiently in many instances – that of DES is to use technology not to replace or support what they do but to enhance and extend their capabilities. In the three vignettes that end Chapter 1, there is not a single instance where the decision makers had to subordinate to the “computer” their own judgment, their mode of work, or their interactions with each other. The gameboard in the government agency was a natural organizing focus for collaboration and planning. The risk management studio removed many of the existing blockages to an effective and time-efficient process. The supply chain management portal added to the ability of distributed decision makers to handle their own part of the process while being kept informed about other parties’ work. As with using e-mail, there were of course some adjustments to make, conventions to follow and procedures to learn, but these are by definition minor. We say “by definition” because if they were major then the decision-makers simply would not use the suites and would either not enter the studios or, if they did, would provide literally only lip service – words, not commitment. But, all in all, the DES approach made technology “transparent” – natural, invisible in its details and mechanics, and at hand. Behind each of the DES interfaces – what the decision-makers see and interact with – are many IT components; some of them are complex, such as the simulation modeling software “engines” that drive the supply chain portal, the databases that feed the models, and the real-time algorithms that drive the simulations. The suite makes these accessible – and transparent – via the interface, and the studio makes them relevant and of value to the decision process. This tailoring of the technology to the people and the decision context is core to DES and, we argue, to the Decision Enhancement lens: a fundamental focus on decision-making.
Technology: A Missing Resource for Decision-makers
Such tailoring of technology to people should be made routine in applying information technology to decision-making. People need all the help that they can get from advisers, information resources, analytic aids, colleagues, planning documents, outside experts, and collaborators. So, logically, IT should be a major part of this help: databases, computer models, telecommunications and software tools. In practice, however, computers have had little direct impact on the processes of the people who make the decisions that really matter to an organization. Our management readers can quickly assess that assertion. Consider the most challenging and personally important decisions that you have been responsible for as an individual or in a team in the past six months – planning a key reorganization, approving a high-risk business proposal, recommending action on a new IT proposal, making a hiring decision, or launching a new promotional campaign, for instance. Which simulation models did you use to help you assess potential scenarios? Did you bring the key stakeholders together to test assumptions, evaluate options and express preferences via a group decision support system? Were you personally involved in using computer-based information and modeling aids? Are any of the three vignettes in Chapter 1 templates for how you and your colleagues handled the decision process? Probably not. If you are a master’s degree graduate – in business administration, science, or public management – how many of the analytic techniques you learnt at school that most interested and impressed you – statistics, marketing models, financial theory, or decision theory, for example – do you now apply through the Web, decision rooms or PC software? We can extend the question to our readers who are in the field of information technology. Pick out the most important decisions that you and your team have had to make recently. Which IT decision aids did you use collaboratively? Maybe some spreadsheet software, database inquiry tools and lots of e-mails – that is probably it. To our consultant readers we ask: are decision support tools embedded in your many team meetings and planning sessions? To our academic colleagues and technical professionals in the decision support systems field, the question is: which of the tools in your own area of expertise – simulation models, expert systems, relational databases, statistical analysis, multimedia software, group DSS, etc. – do you use? Which of the systems that you build go beyond prototypes and pilots and are used on an ongoing basis by executive teams?
If the answer to our questions above is “Yes, I do make direct and personal use of interactive computer and telecommunications tools and so do most of our decision-makers”, then your organization has a competitive edge, one apparent in companies that develop what may be termed a decision discipline: formal processes supported by an action-focused culture and well-integrated technology. Our follow-on question then is “Are you making the most effective use of the tools? In particular, do you employ them to enhance your decision-making for decisions that really, really matter?” If you do, then we hope our book will help stimulate even more ideas for innovation and action. If you do not, then it offers ways to add to your existing uses of interactive technologies and the design of new studios. If your organization is not routinely drawing on Decision Enhancement capabilities, then it is wasting an opportunity. The wealth of available superb, low-cost interactive information management systems, simulation modeling software, aids to team collaboration, and interactive information access capabilities grows by the month, with Web-based technology making them more and more accessible and flexible. Our main focus in Decision Enhancement Services is on the “D” of Decision Enhancement. Decision is the target of opportunity, enhancement the means, and services the combination of process capabilities and technical vehicles. IT tools – systems – are already well embedded in most routine areas of decision-making. PCs are now as essential in business life as phones, and the Web has greatly enhanced the communication, information flows, and access to analytic tools that underlie everyday decision-making. But the evidence is strong that information technology has had relatively little productive impact on the people who must handle the many complex decisions that involve judgment, where information is not sufficient to point to the single best choice, and where many parties’ interests and values must be addressed. A 1999 survey by the consulting firm PricewaterhouseCoopers reports that only 20 percent of managers use Decision Support Systems, even though well over 90 percent use computers for Web access and e-mail.18 IT is used for administrative support, meeting support, analysis support, information support, communication support and presentation support but not for direct enhancement of critical decision processes. An old story captures this – and in a way reflects the thrust of Decision Enhancement Services.
18 Carlsson, C. and Turban, E., “DSS: Directions for the Next Decade”. Decision Support Systems, Vol. 33, pp. 105-110, 2002.
A distinguished professor at the Mega Business School in the 1970s – call him H.A. – was offered a professorship at another major university, with a substantial salary increase, large research funding and free time to work on his personal projects. He was one of the leading figures in Bayesian Decision Theory, the use of probabilistic assessments to evaluate alternative decision options; it is often termed simply “decision theory.” A colleague reported, gleefully perhaps and certainly to anyone within listening distance in the faculty lounge, that H.A. was in a dilemma about leaving Mega and was canvassing friends about what he should do. This colleague saw him ruminating in his office, head in his hands, and commented, “H.A., it’s easy. All you have to do is use your own theory and work through the decision tree.” H.A.’s reply was “No, I can’t. This is a really important decision.” The more important the decision and the more complex it is, the more likely it is that technology and theory get thrown out. H.A. needed to trust his judgments, his experience and his friends, not decision theory. At the time, he had only limited computer tools to draw on, anyway. Interactive technology was in its very infancy, screen displays were both too slow and too small to display decision tree diagrams, and computing power was prohibitively expensive. Today, he would have access to a system on his PC (see the example of a typical display in Figure 2.1). Would he still say “Yes, but this is too important to trust to a computer”? Very probably.
Figure 2.1: Example of decision trees
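For readers who have never worked through such a display, here is a toy expected-value calculation of the kind H.A.'s own theory prescribes. The options, probabilities and payoff scores are invented purely for illustration; they come from neither the story nor any particular decision-tree package.

```python
def expected_value(branches):
    """branches: list of (probability, payoff) pairs for one decision option."""
    return sum(p * payoff for p, payoff in branches)

# Invented example: two options, each with two uncertain outcomes.
options = {
    "stay at Mega":         [(0.7, 70), (0.3, 40)],   # (probability, satisfaction score)
    "move to other school": [(0.5, 90), (0.5, 30)],
}

for name, branches in options.items():
    print(f"{name}: expected value = {expected_value(branches):.1f}")
```

The arithmetic is trivial – that is the point of the anecdote. What the theory cannot supply are the probabilities and payoffs themselves, which is precisely where H.A. preferred to trust his judgment, his experience and his friends.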
The technology-based fields of information management, customer relationship management, simulation modeling, project management tools and many other ever-improving applications of IT tend to be designed and implemented under the implicit axiom that better software and more information mean better decisions. The information- and technology-rich CIA, FBI and their foreign counterparts’ experiences in handling international terrorism show that if information is not communicated, shared, interpreted and used effectively, technology in and of itself cannot compensate for gaps in decision processes. In business, the dot-com surge and collapse demonstrated that companies in the information and technology business were making disastrous decisions – with information everywhere around them. The trillion-dollar failure of the telecommunications establishment in North America and Europe came despite many analytic studies, models, forecasts, and surveys.19 The sustained successes of such companies as Dell, Wal-Mart and Southwest Airlines, and the erosion of previous leaders such as Acer, Kmart and British Airways at the end of the 20th century, clearly reflect differences in the effectiveness of leadership decisions and business strategies, but the winners and losers were equally information- and technology-rich. For a wide variety of reasons, the winners were more effective in their decision processes. Indeed, we argue from our observations of organizations over many years that the companies that sustain success develop decision disciplines that are an integral part of their culture. They institutionalize decision processes, with everyone on the same page in terms of a focused business mission and very clear rules and expectations about decision-making priorities, responsibilities and authority. Yet decision-making is a topic less and less addressed in the management literature, in both scholarly research and work aimed at business practitioners. Pick up any leading book on business strategy, leadership and teams – the main topics of the current bestsellers – and look up “decision”, “decision-making” and “decision process” in the index. Only very occasionally will you find an entry. The writers use a different lens than ours to scan the business landscape; that lens largely filters out decision-making as an area of focus. Perhaps this filtering reflects the Good Times of the last decade of the past century.
19 Earle, N., Livermore, A., and Keen, P.G.W., “From .Com to .Profit: Inventing Business Models that Deliver Value and Profit”, Jossey-Bass, 2001.
In a stable and predictable business environment, decision-making is relatively straightforward; forecasting and planning, backed up by first-rate execution, are the main forces for business success. In a rapid-growth economy, decisions are often easy to make – just move, and move fast; companies can afford to waste resources and make mistakes in a benign environment that does not punish them for doing so. Executive leadership in this context means mobilizing the organization for commitment and execution; that of course is not straightforward, but the clearer and more attractive the future path, the simpler it is to point the organization in the right direction. Decision-making is then tactical, operational and opportunistic. Uncertainty and volatility of the environment remove that secure base. There has not been a single quarter in this new century in which any of us could reliably forecast key social, political, economic and financial factors even six months ahead. Companies have to make more and more decisions in a more and more complex environment. Rehearsing the future then means using every tool and skill available to take charge of change rather than just react to it: to make decision processes a proactive rather than reactive organizational response. It involves vision in the widest sense of the word – envisioning, a business vision, insight and foresight. It demands decisiveness. It rests on anticipation, exploration, scenarios, and the simulation and evaluation of potential futures. It cannot be handled by one person or one group operating alone; business design for uncertain futures requires collaboration across more and more stakeholders and allies. The innovators and change leaders will be the ones that are able to make effective decisions in this environment – and the ones that most effectively exploit the combination of humans and computers working together.
The Target: Decisions That Matter
Here are the characteristics of decisions that matter (DTM). They are:
• Urgent: You can’t put off making next year’s investment plan by waiting to see what happens to the economy six months from now. Never in modern business history – quite literally – have organizations in the public and private sector been under so much pressure to make major moves under constant stress and time duress as in the first years of this so far deeply troubled and disruptive century, with its record of recession, wars, terrorism, massive company accounting frauds, the sudden collapse of previously thriving industries such as telecommunications, and strains on government finances and services. Time is not on the decision makers’ side.
Waiting change out is not an option. Any decision that significantly affects customer relationships, market strategy, human resources, corporate financing, product innovation, service, or operational efficiency becomes a DTM when there is no “slack” available; slack refers to a free resource that can be safely wasted. An example of such a DTM, and one where we can offer an effective DE recipe, is airport planning.20 (Chapter 3 presents the DE studio for this.) Decisions on whether or not to add a new runway or terminal can take a decade to negotiate, cost hundreds of millions of dollars and involve many stakeholders. They often rest on highly uncertain and even conflicting forecasts of passenger growth, industry developments, economic conditions, and so on. The problem is that as airport capacity gets overloaded and facilities strained, the decision has to be made now, however long it may take to implement.
• Consequential: Some decisions have a wide range of “adequacy.” There is room for error, time to make adjustments, and limited downside risk. With others, though, the organization will have to live with the consequences of the decision with no opportunity to adjust, fine-tune or “play things by ear.” Instances are the choice between candidates for a top management position, a downsizing, an acquisition, or a launch of a major new product. Another DE recipe that we present in Chapter 7 has been successfully applied in a number of companies where the DTMs included engineering “release” – signing off on new product designs, authorizing warranty repair claims, and making special quotes for customized (and very expensive) products. In all of these cases, the decision process and outcomes tied up dozens of departments, took months, and were often a source of in-fighting and crisis management, because the results really do matter. The recipe slashed time by 40-60 percent and improved cost and quality proportionately.
20 Recipes are domain-specific processes for effective DES practice that are (1) proven, (2) written down, (3) precise in their statement of ingredients and sequencing but with room for adaptability, flexibility and innovation, and (4) get used. Examples of recipes we present in later chapters of Rehearsing the Future are simulation suite development, process management transformation, and collaborative studio design and application.
The DE services contribution here was to transform the decision process through a suite of modeling and design tools and a facilitative studio focused on business process mapping.
• Non-avoidable: The buck stops with you – the decision makers. Consultants, planners and subordinates can all contribute to the process, but it is not responsible to delegate the decision to them, directly or indirectly. Decisions that involve policy, authority and organizational change are obvious examples here – with “politics” too easily used to explain dysfunctional results. In many of the instances where the authors and our colleagues worked on projects that have shaped Decision Enhancement Services, it has to be admitted that the buck often got passed on. It was passed on because of the decision support tools that we used! Models got built and presentations made and that was it; debate and what is often termed “analysis paralysis” substituted for action and commitment. We realized that a major need in effective decision-making is to ensure that all key stakeholders are included in the process; instead, they often sit on the sidelines, literally out of the loop. This has been a major factor driving our invention of the DE studio and our emphasis on visualization. If people are excluded from the process, do not contribute to or trust the models, or are unable to make sense of the results, then obviously the process will drift. One of the goals of DE studio and suite design has to be to encourage involvement that leads to commitment rather than avoidance that leads to delay and, in some cases, a purely “political” decision imposed by a subset of stakeholders.
• Non-reversible: Once you launch that product, you can’t simply unlaunch it. If you pick the wrong reorganization plan, you’re stuck with it – or with huge political problems. This non-reversibility of decisions that matter is often the reason for decision avoidance and drift; key parties just do not feel comfortable committing to a decision even though they are fully aware of its importance, urgency and consequences. From their viewpoint, the risks of inaction may be less than the risks of action. The old adage of “look before you leap” applies here. The DE approach is “let’s help you look – rehearse the future – so that it will become a little easier and more comfortable to leap.”
• Packed with uncertainty: Forecasts are all over the place, customer confidence goes up and down, the political situation looks dicey, and the business case is heavily dependent on assumptions. Uncertainty in DTMs translates to risk and in many instances to betting the business. In the airport example that we discuss in Chapter 3, forecasts of future traffic are notoriously unreliable; recession, the shakeup and even shakeout of many airlines, and the impact of low-cost competitors to the main carriers have all compounded the problem. The traditional analytic approach of experts and model-builders is to gather information and build simulations to come up with a recommendation or a consensual forecast. The DE alternative is to build technology suites that facilitate rapid “what if?” analysis, trying out – rehearsing – scenarios and assumptions and showing visually how the scenarios – the futures – play out over time and as parameters are adjusted (industry competition, impact of low-price airlines, the switch from or to larger planes, etc.); a small illustrative sketch of such a what-if run follows this list. DE services add the studio to the suite and involve as many experts, stakeholders and toolsmiths as is practical so that, even where no one can reliably predict the future, they can together reach a meeting of minds through a sharing of sights and insights – visualizations. Visualization here is a move from the “what if?” of decision support and spreadsheets to “what will it look like?”
• Wicked: There is no “right” answer and all the analysis in the world will not produce one. You (as an individual and/or as a group) have to decide. What makes many problems wicked is conflict of values and the difficulty of making trade-offs. For infrastructure decisions, this can become almost overwhelming. What is the trade-off – and whose assessments matter most – between, say, the regional or national economic development that airport expansion will enable and the noise and traffic congestion that it will create? Who should bear the cost? Passengers? Carriers? Government? Operators? Having a DE studio in place in no way makes it easy to deal with the wicked element of DTMs, but it does at least provide a forum for building mutual understanding of views, shared evaluation of the impacts of scenarios, and collaborative efforts to come to an agreement and a commitment to
follow-on action. In other words, it improves the process, which is what decision enhancement is all about.
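As mentioned in the uncertainty item above, here is a small sketch of the kind of rapid “what if?” run that a DE suite makes routine. Everything in it – the growth rates, the assumed stimulation effect of low-cost carriers, and the parameter names – is invented for illustration; it is not the airport suite described in Chapter 3.

```python
def traffic(base_pax_m: float, growth: float, lcc_share: float, years: int = 15) -> list:
    """Project annual passengers (millions) under one scenario's assumptions."""
    series, pax = [], base_pax_m
    for _ in range(years):
        pax *= (1 + growth) * (1 + 0.3 * lcc_share)   # assumed low-cost-carrier stimulation
        series.append(round(pax, 1))
    return series

scenarios = {
    "conservative": dict(growth=0.01, lcc_share=0.05),
    "baseline":     dict(growth=0.03, lcc_share=0.15),
    "aggressive":   dict(growth=0.05, lcc_share=0.30),
}

for name, assumptions in scenarios.items():
    series = traffic(base_pax_m=40.0, **assumptions)
    print(f"{name:>12}: year 5 = {series[4]}M passengers, year 15 = {series[-1]}M")
```

A spreadsheet can do the same arithmetic; what the DE studio adds is the shared, visual and interactive setting in which stakeholders adjust the assumptions together and watch the futures diverge.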
The Context: A New Generation of Just about Everything
Decision-making is the primary function of the executive. It is the core of innovation and of taking charge of change, by the very fact that both of these require an active move away from the status quo. Decision-making makes that move. The more complex, volatile, risky and challenging the environment, the more that the effectiveness of decision processes makes the management difference. The language associated with decisions captures this importance:
• Definitions from the Oxford English Dictionary:
  - Decision: come to a determination or resolution;
  - Decisive: unhesitating, resolute, determined;
  - Decidedly: (a) definitely, in a manner that precludes all doubt; (b) resolutely, unwaveringly;
  - Indecision: the inability to make up one’s mind.
• “What you decide shall be irrevocable.” (Marryat)
• “Decision is torment for anyone with imagination.” (Fitzgerald)
• “Decisions fail half the time.” (Nutt)
• “As soon as questions of will or decision or reason or choice of action arise, human science is at a loss.” (Chomsky)
The uncertain, indecisive, hesitating, irresolute organization at a loss is hardly positioned to navigate the future. Surely, turning these negative words into their positive antonyms ought to be the primary target for using technology. Change, uncertainty and volatility have become the norm, not the exception. The consequences of effective versus ineffective decision processes are immense and growing; there is less and less room for error in making choices today to be positioned for tomorrow. The subtitle of our book can be extended to “building decision agility for our new generation.” We are in a demanding new generation of just about every element of business and organization. This is not a next generation: the future really is now:
• Business: inter-organizational dependencies, alliances, value networks, erosion of the public-private sector dichotomy, and the vulnerability of even the largest and strongest companies to suddenly lose their edge and even fail. There is no status quo to maintain, and “managing” change is rather like trying to manage gravity – gravity pays no attention but just keeps pulling down.
• Decisions that matter: an environment of unprecedented complexity, volatility, competitive pressures, pace of change and uncertainty, where the future is constantly at risk and today’s organization and business are little more than an ice block ready to melt.
• Executives: enlarged responsibilities, demands for leadership, collaboration as an imperative, and a recognition that the issue for organizations everywhere now is not whether to innovate, but how to do so.
• Technology: multimedia interfaces, data resources, websites, intranets and new web services, networks, mobile communications, development tools, visualization aids – and of course many more acronyms. For many people in business, the web may seem like an avalanche of technology. For those who work in and with it, what has happened over the past decade is a relatively slow glacial flow compared with what we see up ahead on the next slope down the mountain. Without pushing the metaphor too far, the message here is that it is far better to be on skis – DE services – than snowshoes, and to be able to enjoy the ride.
• Organizational arrangements: networks rather than hierarchies and functional departments, geographically distributed teams, the virtual organization as a reality rather than a neat concept, co-sourcing/outsourcing partnerships in core business processes, constant reorganization, “empowerment”, “reengineering” and “restructuring” as euphemisms for “you’re fired”, and “think global, act local” as the watchword (or is it now the reverse?).
• Knowledge resources: distributed information, needs for mobilization and sharing, and the team as the flexible and adaptive unit of work. We may live in an “information”
society, but we remain short on knowledge, knowledge sharing and the ability to support and encourage people with knowledge and turn their knowledge into creative action. An old adage attributed to HP is “If only HP knew what HP knows.” We can restate it in the DE context: if only our own organization used what it knows, and if only our collaborations with our many partners in multi-stakeholder decision-making could bring in and share what each of us separately knows.
• Economic demands: demands for shareholder value and capital efficiency, immense pressure on revenues and margins, constant cost stresses and budget crunches. It may take a decade for businesses to recover their financial health after the disasters, disruptions and recession that began in this new century. In the meantime, there is hardly any industry where margins are increasing, where there is not overcapacity, or where globalization does not erode protective and oligopolistic barriers to competition, innovation and outsourcing. There are entirely new economic challenges, including the accession of Eastern European nations into the European Union, continuing deregulation, the emergence of global outsourcing, and the breakdown of established industry boundaries.
• Society: In this new century, everything in society seems different, nothing certain, and the future full of risks, including the risk of ignoring key trends until too late, misreading the landscape, and inaction. In the meantime, organizations in the public and private sector must plan in a world where they cannot predict and take action in a situation where there is no longer any safe haven. In that context, decisions really do matter. In stable times it was easy to predict and plan. On the upside of the business cycle any company could make money. With public funding available, any government agency could continue along its established path. And decision makers could get by simply by doing what they did last time. They could continue to plan without testing and changing their basic assumptions and decision processes. They could evolve from and continue the past instead of rehearsing the future. Not now. Welcome to the new generation of just about everything. Each of these bullet points could be expanded into entire pages. All of them add new challenges and opportunities for decision-making.
The Three “U”s of Decision Enhancement
Effective Decision Enhancement rests on three “U”s – Usefulness, Usability and Usage.
• Usefulness relates to analytic methods, information resources, and embedded knowledge. It might be described as the thinkware that goes with the hardware and software that embodies it.
• Usability largely depends on the interface between the user and the suite of DE services technology: its natural mesh with the way they think and work, its responsiveness and adaptability, and the ease of interaction and collaboration it enables.
• Usage depends on the embedding of the studio and its suites of tools in the decision process itself.
Without over-stereotyping computer science, information systems, operations research, data warehousing and related applications of information technology to decision making, in general they have at most achieved two out of the three U’s and in many instances just one of them. In the fields that provide useful methods, there has been a broad gap between that component of value and a relative lack of usability, which makes it unlikely that they will see much organizational use outside consulting and research studies; that is a polite way of saying that an elegant algorithm, however useful, is not a usable tool. And equally, a neat graphics package that provides dancing pie charts is not useful. More consequentially, if algorithm and graphics do not get used, they are a waste not just of time and money but also of opportunity. DE is fundamentally opportunistic: make what is useful in, say, economics, financial planning, computer science and public policy usable and get it used. Add useful analytic, intellectual and organizational methods to the usable and widely used Web/PC/PDA base. DE places equal emphasis on all three U’s. Traditional DSS has seen an increasing gap between its usability – interaction, speed of development, low cost of prototypes, etc. – and its usefulness. When just about any professional or manager in any field can knock up a quick spreadsheet, generate a fast report from a database, and link to more and more Internet-based services, the term DSS largely becomes redundant; it adds nothing in itself to this now routine element of information systems and their widespread use.
This increasing dilution of Decision Support in terms of impact, identity and interest has largely come about because it lost focus on the “D” of Decision Support. As, first, spreadsheet software and then a flood of new packages, the Internet and intranet-based services made it quick and easy to build small-scale applications, Decision Support tended to lower its aspirations and emphasize usability at the expense of usefulness. Originally, DSS were model-based and much of their impact came from making the many useful tools of simulation, optimization and statistical modeling accessible for interactive use by decision-makers. There were a few large-scale database management systems that DSS could use to access information. Telecommunications were in their early years of business deployment and expensive and slow. DSS had brought a unique degree of interactivity to analytic methods and modeling capabilities that stood out from the practices of such disciplines as computer science and management science because of this new usability. Now, usability is routine for spreadsheets, the Internet and telecommunications services. It’s time to put the “D” back in Decision Support. The DE focus on decisions that matter provides for this. The fields of management science, operations research, statistical methods, expert systems, data communications, software engineering and many other disciplines with an analytic or advanced technology focus have made, continue to make and always will make many contributions to the Usefulness factor and hence to “D.” But very rarely has usability been the hallmark or even concern of much of their innovation. For all the talk of “user-friendliness” and “human-computer interaction” far too few of the algorithms, models and statistical routines they generate end up being truly usable. If they are not usable, they do not get used by busy managers, skeptical of analytic and technical specialists’ understanding of business and of their ability to collaborate with them in design and application of all this often-intimidating invention. The DE contribution here is two-fold: preserve the priority of usefulness as the driver of the “D” but focus on exploiting the new generation of tools and methods that provide natural user-system interfaces that are multimedia-based, visual in their displays and easy to define, develop, use and adapt. The interfaces of today – screen menus, Web browsers, query tools, spreadsheets, etc. – are just the first wave of a new generation of use-friendly rather than user-friendly capabilities that are built around four emerging clusters of technology. These are (1) visualization via multimedia, (2) a building block approach to systems and service design – and redesign – (3) a dramatic expansion of the reach of the technology to decision-makers via cross- and interorganizational telecommunications and mobile wireless computing, and
(4) data warehousing and data mining resources – inventories of data that are easily accessible. Multimedia – animation, dynamic visual representation, color, sound, and pictures – is the most immediately important of these four innovation enablers for DE, simply because it opens up so many new ways for people to see, think, learn and share ideas. Multimedia is as dramatic an extension of the Usability of information technology as was the Web. As one commentator states: humans are visual, verbal, literate and numerate – in that order of importance.21 Historically, information technology was of necessity limited to supporting numeracy; it began as data processing and “information” meant numbers. That constraint is now released, and Decision Enhancement exploits multimedia aggressively to make systems not merely more user-friendly but more human-natural, as an integral part of facilitative and collaborative studios. That brings into the DE space of research, practice and skills the arts as well as the sciences; it moves “systems” and “suites” from meaning programming and development to design. We need to learn from and apply the very best of industrial design, service design, and arts design. The substantial improvement in the usability of computer systems obscures the fact that there is a long way to go before, say, the average Web site matches the average consumer electronics product in terms of meshing form and function. Most large-scale Web sites are computer programmers’ products, not designers’ ones. The difference between programming and design is an important one for the success of DE; basically, it reflects a difference in focus between function – what the DE “system” does – and form – how it feels and looks. For the system to become effective within the decision process, it must be designed as a service, not as a product. Usefulness of methods, models and information plus usability of technology suites do not guarantee usage. It is a cliché that organizational, social, political and psychological factors generally explain the failures of excellent systems and IT infrastructures to become effective and disseminated in everyday use. Today, when e-mail is as routine as telephone calls and the Web as ubiquitous as ATMs, it is easy to overlook that it took several decades for e-mail to take off. Apparently successful pilots fell into disuse even though the early results were encouraging. The Web is widely used but not for wide-scale Decision Support; after several decades of successful development of useful and highly usable tools for team collaboration, most business uses of Web, Internet and intranet services are for simple information access and communication.
21 Arnheim, R., “Visual Thinking”, University of California Press, 1989.
The vital importance of supporting decision processes, relationships, and human and organizational factors has always been recognized by the DSS field, which introduced many now standard approaches to prototyping and the use of facilitators and experts to both guide DSS design and use – especially in group processes. The behavioral side of Decision Support has dominated much of the research that complements its technical and analytic practice base. Somehow, though, we remain after twenty years in a situation where DSS are not embedded in managerial practice. Our Decision Enhancement frameworks, studios, suites and recipes help build a new field of effective practice.
Information is Not Enough: The Interface is the “System”
There is just one thing that makes Decision Enhancement distinctive, needed and relevant to business innovation: decision-making. This, not “information”, “knowledge management” or “communication”, is where the technology base of the Internet, PCs, databases and networks can add value to organizational leadership in taking charge of change. We routinely talk about information technology as an organizational resource, but decision technology rarely gets the same attention. Yet information is only as effective as its uses, and the most effective uses will be targeted to the decisions that matter in an organization, the ones that most affect its future. Indeed, the only reason to gather data is to make better decisions. In our view, the many disappointments and the underutilization of ever-more available technology reflect a separation of information from decision-making. Information in and of itself has no value without this linkage that we term Decision Support. Decision-making for decisions that matter involves going beyond being better informed. It rests on people assessing future choices and their impacts and then making a commitment to act. Committing is the end point of decision making; ideas, recommendations, suggestions and opinions are just an extra contribution of “information” and being better informed. They are decision enablers, inhibitors or inputs. They are not decisions. In order to make decision commitments, teams need confidence in themselves and in the decision process, collaborative trust, and information. The more that they can confidently, collaboratively and knowledgeably rehearse the future, the more likely they are to reach the best decision that they can. In many instances, DE adds value to information that is already in accessible form. It does this through DE services in two main ways:
• Via the studio, which establishes a context and agenda for using information as a service that enhances the decision process.
• Via the interfaces that the suite provides. To the user of any interactive computer-based tool, the interface is the “system.”
People make decisions. The DE goal is to help them make better decisions – not to make better decisions for them or remove decisions from them.
Chapter 3
Decision Enhancement in Action: The Airport Development Studio
Overview
Before moving on to describe in detail Decision Enhancement principles and methods, we present in this chapter a more extended example of a DE studio in action than the short vignettes that we have provided so far. We chose the most complex of all our DE applications. It is one where the collaborations among our many colleagues have built powerful new tools in terms of suites, studios and applications. The decision arena that we describe is airport infrastructure planning, very much a decision that matters. The technology suite is a set of simulation modeling tools linked to central databases, legacy systems and other software. The DE services focus is on multi-stakeholder collaboration and scenario evaluation. The decision is airport expansion. Figure 3.1 below shows how the technology and systems development base of DE services (its suites) links to its support processes and methods (studios), which in turn link to the broader business, organizational and economic context – the governance architecture. The case example illustrates them. Here is a brief summary of the elements of the framework: Suites are the set of technology resources and methods that are mobilized for Decision Enhancement. That means that suites (1) ensure rapid development of prototypes, (2) can scale quickly and easily to go beyond prototypes, (3) are flexible and adaptive so that the system can evolve and change with the decision process and situation, and (4) ensure that the systems they build meet the imperatives for effective Decision Enhancement: a direct and simple meshing with how decision makers think, work and communicate. These criteria both narrow down the choice of DE suites from among the rich range and variety of potential IT tools and – more consequentially – determine how those tools are assembled, mobilized, packaged and prioritized.
Figure 3.1: Suites, Studios, and Architecture of DE Services
Studios are the environments in which suites are deployed. We chose the term for its analogy with a television or an art studio. It is an interactive environment with a clear purpose – the production. An appropriate combination of lead announcer, chairperson, teacher/coach, facilitator and content expert guides and coordinates the production. There may be an open-ended discussion agenda (most suited in DE studios to brainstorming) or a detailed sequence of scenes and script (For DE services, this is a “recipe” of proven methods). It includes time frames, production support, and tools – the suites. It is highly visual and conversational and very much dependent on the co-design process in the production that demands many different skills, specializations and roles. It is very purposive: to generate the best production within the constraints of cost, time, program, and topic. Obviously, this analogy should not be pushed too far. It evokes just what decision “enhancement” must parallel to be effective. The value of a studio rests on its processes. These processes fundamentally involve people and collaboration. The term studio is a needed reminder that while the information technology equivalent of cameras, lights, animation displays and teleprompters – hardware, software, PCs, etc. – are essential to the production, they are not the production; they enable it. Nor is the director the star. Nor is the “talent” the executive producer. And so on. Studios rest on processes of co-design and collaboration that balance flexibility and discipline. So do DE services.
Governance architecture is the blueprint for embedding studios and integrating suites within the enterprise decision-making activities at all levels. This is analogous to a city’s infrastructure, planning codes, ordinances, and services. Some of these are regulations, some are guides and some are shared facilities. For DE to be most effective, it needs a comparable framework. Without it, the models, systems and mini-studios it develops are very much “standalone” and special-purpose. In too many instances, they lead to “one-off” pilots that do not scale up to handle more users and uses. They do not link across the organization and with other organizations. They lack a clear set of targets for investment, payoff metrics, and priorities for application. They quickly reach limits of coordination and hence of impact. This is one of the main reasons why useful and usable systems so often do not get used outside their initial community and decision setting. The components of an effective governance architecture for DE begin with landscaping. This is our term for the assessment of the organization’s business vision, time horizon, partnership strategies and imperatives – the “must do”s on the immediate business agenda. It is key because it defines what the decisions that matter for the organization will be. Landscaping answers such questions as: how far out are we looking – what is our time horizon? How broadly are we looking – just at our enterprise needs and plans, or across our supply chain, customer and partnership “value” networks? What are our operational priorities? Financial priorities? The organizational priority for the business architecture is to define policies with teeth. By this, we mean policies that go beyond vision statements and guidelines and that have the force of organizational law. They include rules for governance of decision-making, including definitions of processes for ensuring that the many decision traps we discuss in Chapter 4, “What We Really Know About Effective Decision-Making”, are avoided, and that there are clear criteria for identifying decisions that matter and for involving relevant stakeholders in them. The policies also include blueprints for ensuring that the suite components of a DE studio are integrated with each other. This largely requires technology standards and development methods. We describe the airport planning case and context in the following sequence:
1. Landscaping the decision context: issues, time horizon, priorities. Airport master planning involves a time scope of 10-20 years ahead. It increasingly must deal with uncertainties and dynamics outside the airports’ own control.22
control.22 The DE issue is what the appropriate landscaping perspective is in terms of time horizon and breadth of view.
22 Kohse, U. “Supporting Adaptive and Robust Airport Master Planning and Design in an Uncertain Environment”, TU Delft, 2003.
2. Governance: stakeholders, networks of actors. Here are just some and decidedly not all of the stakeholders: the airport operator, passengers, government authorities, airlines, competing airports, domestic regulators, international regulators, local residents, security agencies, freight carriers, non-aviation businesses... The list goes on. Obviously, it is impossible to include representatives from every interest group in the master planning process, but excluding their viewpoints can be a guarantee of later problems in mobilizing to implement the plan. We later review the famous Denver International Airport decision-making and implementation fiasco, which was almost guaranteed through inattention to decision governance.
3. Blueprints: formal procedures and requirements, technology resources. Blueprints are the frameworks and guidelines for making the decision problem manageable and meaningful. “Go and get us a master plan” is not at all the same as saying “Here is what we are looking for in the decision process that will move us towards an effective master plan” and “here are the policies, resources and systems that will make sure that all the technology used in the process fits together.”
4. The studio: design and process principles. The studio is the core of the decision process. It is the result of many years of work in both airport applications and with simulation tools that lacked adequate studios and were dominated by “systems.” They offered decision tools, not decision processes. The studio has also emerged from process-centered Group DSS tools that enabled collaboration and scenario-building even though they did not address the substantive and complex analytic and information elements of the decision context. Fusing these two approaches naturally results in a DE services studio.
5. The suite: toolkits, integration methods. The Airport Business Suite (ABS) is the foundation for meshing
technology and process.23 It is a set of discrete building blocks that are designed for use in airport studios that have included DE services to Schiphol in the Netherlands, John F. Kennedy in New York, and several German airports. The logic of the building blocks is that they support recipes – repeatable, reproducible studios. They are domain-specific and center on the tasks, entities and information common to airport planning. The decision context establishes the building blocks and the technology suites add customized links, routines, models and information. Making ABS useful, useable and used posed very complex technical challenges that are being solved through careful attention to the design of the user interface and methods for linking to other systems, including databases, spreadsheets, other simulations and processing systems, in order to ensure flexibility, adaptability, rapid updating of the suite, and a variety of visual, numeric and multimedia output displays.
6. Lessons for turning the studio into a DE recipe. Progress, problems, results and implications for practice.
The case is a composite of completed and ongoing DE initiatives. We have chosen to disguise the identity of the organizations and decision-makers involved in the studio, for a variety of reasons. These include protecting their privacy, being able to make candid comments and assessments, and gaining the opportunity to tailor the discussion to our topic, rather than producing a pedagogic “case.” We vouch for the accuracy of the basic facts, with minor changes to the real situation made in order to preserve anonymity.
23 Walker, W.E., N.A. Lang, J. Keur, H.G. Visser, R.A.A. Wijnen, U. Kohse, J. Veldhuis, A.R.C. De Haan (2003). An Organizational Decision Support System for Airport Strategic Exploration, in Tung Bui, Henryk Sroka, Stanislaw Stanek, and Jerzy Goluchowski (eds.), DSS in the Uncertainty of the Internet Age, pages 435-452, Publisher of the Karol Adamiecki University of Economics in Katowice, Katowice, Poland.
Landscaping the Decision Context and Situation
We begin our analysis by placing the Hartsmartin airport literally in its landscape, using the Netherlands as our illustration. Our composite case includes suite components developed for and applied in a number of projects in the Netherlands, including Schiphol Airport, the Port of Rotterdam, the Dutch Flower Auction market (the biggest agricultural auction in the world) and the Dutch Police. It also includes applications and experiences from airport development planning suites and studios in the U.S. and Europe. We use the Netherlands as illustration to capture just how critical, urgent and consequential this decision is. As with our vignette examples in Chapter 1, the specific decision-making activity can too easily be seen as detached from its landscape. The decision situation can take on a life of its own and lose sight of the bigger picture. In the insurance firm vignette that we showed in Chapter 2, for instance, risk management was just a bureaucratic burden to many of the participants in the process. In the wider landscape overshadowed by Enron, it is, however, a key element of enterprise governance. The participants in the risk management meetings largely lost sight of this context and concentrated instead on the meeting agenda.
The suites and studios developed and applied in the Hartsmartin airport case equally are not just a simulation and meetings. The decision process they aim to enhance is highly consequential for a nation, region, state or province, or city in terms of economic development, social impacts and politics. Let us take the case of the Netherlands. While the country is small, its decision arenas are huge. As a trading nation, the Netherlands must strive to maintain its position as a transportation and cargo hub and ensure that its road, rail, airport and port infrastructures are world class. The stakes are high and the stakeholders are many. Schiphol airport, the Port of Rotterdam, the Dutch flower auction market and transportation links are critical for economic development.
The decision context is intrinsically multi-actor and multi-stakeholder in nature. In the airport example, the actors in the decision process include airport operators, airline planners, architects, facility designers, construction firms, local, regional, national and international agencies, road transportation planners, and many specialists in environmental control, financing, security, etc. Many stakeholders are not directly involved in the details of planning, but their views and priorities must be included. Examples are local residents, security agencies, passenger associations, and import-export businesses. What is also important to take into account is the landscape in which specific airport infrastructure decisions of the nature and scale of the Hartsmartin studio fit: urban and regional development, national economic policy and globalization. Decisions that matter almost invariably matter because of their context. That is what makes them so often wicked. In our view, wickedness is one of the reasons that top-level
decision makers – senior executives and government officials – become very wary of simulation models, consulting reports, forecasts and detailed financial analyses. The more the context expands, the more the values, preferences and influence of more and more stakeholders can invalidate or weaken the impact of airport planners’ assumptions and recommendations. One major reason for embedding technical decision aids such as forecasting tools and simulation models in a DE studio is to ensure that the context is not lost in the details of the technical work.
If we zoom in on Amsterdam (Figure 3.2), what comes into view is perhaps the key single location in the Netherlands in terms of its development of infrastructures for competing in the broader context of world trade, cargo and air travel: Schiphol airport. Schiphol is the Dutch hub. It competes with Heathrow, Frankfurt, Charles de Gaulle, and many smaller regional airports. The photo below is disarmingly simple, just a bunch of planes at gates, with the countryside in the far background.
Figure 3.2: Zooming in on Schiphol Airport
The simplicity of the picture hides the complexity of the decision arena for Schiphol: whether or not to build a fifth runway, consideration of building an artificial island extending out to the North Sea, and deciding how to respond to major airline developments, including the acquisition by Air France, operating out of its Charles de Gaulle hub in Paris, of KLM, the Dutch flagship airline.
The decision-making process for airport planning can take decades and billions of dollars – Euros in this case. It will involve many policy studies, detailed forecasts, public hearings, simulation models, political debates, and forecasts of traffic, cost and economic, social and environmental impacts. Can DE really contribute here? Each of these components of the decision process ranges widely in terms of method, priority, information and many other factors, to an extent that altogether it seems unlikely that they can be easily integrated. It is the DE focus on the multi-stakeholder and multi-actor nature of decision arenas like the Hartsmartin domain that provides our answer to this question: “Yes, Decision Enhancement is making a distinctive contribution already.” We can give a flavor of DE simulation suites through the images below; the website associated with Decision Enhancement Services shows the simulations in action: the flow of traffic through the airport, the arrival and departure of planes, and even the animated impact of changes in procedures for passengers to board airplanes. All the representations come from the very same simulation models; the displays only look different because the suite is designed to respond to the different needs of stakeholders for different types and levels of representations. The first example walks the viewer through a proposed configuration of the airport. It is an animation derived from the simulation, not a separate, special-purpose video. As the model changes and new scenarios are defined and evaluated, the animation is automatically updated.
Figure 3.3: A walk through a simulated airport (extract from an animated model)
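To make concrete what it can mean for an animation to be “automatically updated” as the model changes, here is a minimal, hypothetical sketch in Python – our own illustration, not the Hartsmartin suite’s code – in which display components subscribe to a scenario model and are refreshed whenever a “what if?” change is applied. All names and numbers are invented.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ScenarioModel:
    annual_passengers_m: float = 40.0   # million passengers per year (illustrative)
    runways: int = 4
    _subscribers: List[Callable[["ScenarioModel"], None]] = field(default_factory=list, repr=False)

    def subscribe(self, render: Callable[["ScenarioModel"], None]) -> None:
        self._subscribers.append(render)

    def update(self, **changes) -> None:
        # Apply the "what if?" change, then refresh every attached display.
        for name, value in changes.items():
            setattr(self, name, value)
        for render in self._subscribers:
            render(self)

def walkthrough_animation(m: ScenarioModel) -> None:
    print(f"[walkthrough] redrawing terminal layout for {m.runways} runways")

def traffic_flow_view(m: ScenarioModel) -> None:
    print(f"[traffic view] simulating ~{m.annual_passengers_m / 365:.2f} million passengers per day")

model = ScenarioModel()
model.subscribe(walkthrough_animation)
model.subscribe(traffic_flow_view)
model.update(runways=5, annual_passengers_m=48.0)   # both displays refresh automatically

The point of the sketch is only that the displays are driven by the model, not produced as separate, special-purpose artifacts.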
This visualization obviously is of distinct value in helping some stakeholders get a sense of the dynamics of the airport, but not others. Passengers, architects and perhaps executives in airport shopping firms will find this type of display evocative, informative and helpful. Airport operators need a different representation of the flow of airport traffic, such as the one shown below (again, this is a static image; the Decision Enhancement Services CD that accompanies our book shows the dynamic simulation output).
Figure 3.4: Simulation Model of a Gate24
Policy makers need an entirely different visualization that captures their own perspective, role and contribution to the decision process. The image below shows the noise pollution impacts of one “what if?” scenario. Decision makers can quickly use this “noise contour” animated display to address such questions as “what will be the impact of changing the routes of airplanes as they approach or take off in order to reduce noise levels in particular neighborhoods and cities?” and “if we were to impose a fixed policy on noise contours, how can we optimize other variables (cost, time, congestion, etc.) in our operations?” At a more operational level, the decision issue might be “Can we change the routes of aircraft to reduce noise in sensitive locations, such as cities?”
24 Source: Sol, H. “Decision Support for e-Governance”, 2002, University of Nebraska, IAADS Summit symposium
Figure 3.5: Noise contour images around Schiphol25
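As a deliberately toy illustration of the kind of “what if?” question behind such noise contour displays – and in no way the modeling actually used for Schiphol – the following Python sketch compares two hypothetical departure routes by estimating peak noise at a few locations with a simple distance-decay rule. The source level, routes and locations are all invented.

import math

SOURCE_LEVEL_DB = 100.0   # assumed level at the 1 km reference distance (purely illustrative)

def peak_level_db(route_km, location_km):
    # Peak noise at a location: loudest point along the route, distance decay only.
    loudest = -math.inf
    for x, y in route_km:
        distance = max(math.hypot(x - location_km[0], y - location_km[1]), 0.1)
        loudest = max(loudest, SOURCE_LEVEL_DB - 20.0 * math.log10(distance))
    return loudest

neighbourhoods = {"city centre": (5.0, 6.0), "village east": (8.0, 3.0)}
route_current = [(x, 0.5 * x) for x in range(11)]    # climbs towards the built-up area
route_shifted = [(x, -0.2 * x) for x in range(11)]   # hypothetical route moved away from it

for name, place in neighbourhoods.items():
    before = peak_level_db(route_current, place)
    after = peak_level_db(route_shifted, place)
    print(f"{name:12s}: {before:5.1f} dB -> {after:5.1f} dB ({after - before:+.1f})")

Real noise modeling is far richer, but even this toy version shows how a routing scenario can be turned into a per-neighborhood comparison that stakeholders can argue about.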
One of our ambitions for applying Decision Enhancement services is to help organizations link the simulation level of a studio/suite combination up through the many levels of decision-making, from operations to planning to policy to the wider landscaping of innovation. The simulation rehearses an airport future. It should be part of rehearsing a regional and national future. This is the challenge for decision process governance and architecture.
25 Source: Sol, H. “Decision Support for e-Governance”, 2002, University of Nebraska, IAADS Summit symposium
Hartsmartin Airport
Hartsmartin is a medium-sized international airport hub. It handles half a million flights a year, with forty million passengers. Its home market is small but it links to over eighty continental and intercontinental markets. Forty percent of its passengers are transfers who fly on to another destination. Its cargo traffic is fairly small. Its economic success rests on its remaining a hub of choice for both airlines and passengers. Hartsmartin faces growing pressures. Recent high growth rates in traffic, mainly generated by the increase in flights operated by low-cost airlines,
have put its quality – and reputation – under strain. Congestion and delays continue to increase, to the degree that Hartsmartin now has to allocate airport slots; that rationing in itself creates new complexity and new stakeholder pressures. (It is fairly routine for national and international carriers to lobby through government trade representatives for “fair” treatment in getting slot landing rights in foreign airports and for local ones to try to protect their existing preferred status.) Industry volatility and stress add issues and actors. Major hub carriers are demanding fee reductions; a number of them face cutbacks and even bankruptcy in the unprecedented turmoil of the airline industry worldwide. The national government is discussing privatization of airports and is reducing subsidies to them. Consequently, Hartsmartin has had to make many cost cuts over recent years to respond to this situation. Meanwhile, government has increased regulation of environmental impacts, including pollution control and limiting of noise levels.
Management knows it must expand the airport, but the Executive Board is divided on what the priorities should be for this. Investment decisions for airport infrastructures involve multi-million and even multi-billion dollar and Euro commitments (even the currency exchange rate between the dollar and Euro is a factor that affects air transportation planning), are intensely political, generate controversy and can take well over a decade to get agreement and commitment. The central problem in airport planning is the cost and time scale and the many “X” risks involved. X here stands for the unknown – you can substitute many words here: financial risk, forecasting risk, traffic congestion risk, environmental risk, social risk, etc. Infrastructure decisions concerning new runways, expanded terminals, cargo facilities, and transportation involve complex trade-offs; the complexity is compounded by the difficulty of forecasting; deregulation, privatization, globalization and competition are all factors that are dynamic, volatile, uncertain and interactive. Furthermore, the rules of the game are changing along the route: tomorrow’s allowed noise levels are, for instance, different from today’s, while decisions on expansion and infrastructure that will produce a certain noise level were taken long ago. Those decisions may have to be revisited and unplanned changes in policy imposed. Similar dynamic problems hold for safety, quality, security and financial incentives. These time-lags and dynamics make airport planning and decision making a highly complex job.
Hartsmartin’s airport managers know that they must make a decision on expansion within the coming two years. The airport DE studio was built to help them in the process. The participants are mainly middle and senior managers involved in airport operations, various specialized functions and sectors such as cargo and business charter services, and
representatives from key stakeholders, including the hub’s main carriers and local, regional and national government. Their overall decision focus is on how to operate and develop the airport to handle both current and future traffic cost-efficiently while at the same time meeting social and environmental requirements. The main trade-offs here are between growth and sustainability. There is a mass of decisions to be addressed within this overall meta-decision. Just a few of the questions under constant active discussion are:
• What parts of the airport will have to be rebuilt, extended or altered if the traffic mix changes? A switch in the ratio of aircraft types – new types of planes, jumbo jets, short-haul airplanes, commuter planes, etc. – means different airport facilities are required, including check-in procedures, baggage-handling, maintenance, etc. A switch in airline industry structures – major carriers merging, reducing flights, or altering their hubbing strategies, and low-cost airlines flooding the market – impacts every aspect of the airport’s operations and affects every stakeholder: passengers, local residents, airlines, government regulators, airport operators and many other parties.
• Given that the most likely long-term scenario is a continuation of high growth in passenger traffic, should Hartsmartin invest in a new runway? Are there alternatives to this costly and lengthy process? What would be the environmental impacts? (Airport expansion planning processes are highly regulated and occur in a very public arena. The decision process to build a fifth terminal at Heathrow in the UK has taken over a decade of public hearings and remains unresolved even as it is being built, with large-scale anti-expansion demonstrations the headlines of mid-2004, as they were in 1994.)
• Should Hartsmartin concentrate on meeting the wishes of its national home carrier or try to attract additional traffic? Options here include boosting low-cost, short-haul carriers catering to the leisure market, expanding Hartsmartin’s cargo capacity and services, and adding non-aviation business to reduce its dependency on its major airline customers. (Schiphol in the Netherlands made cargo its most profitable service. Heathrow is in many ways as much a shopping mall as an airport.)
Answering these questions rests on one single, central problem: forecasting traffic demand. Typically, the decision process is dominated by an assessment of demand based on a single forecast of the most likely future, derived by extrapolating from past trends. This “master model” forms the blueprint for every other element of planning. The difficulty here is that there is no single most likely future. In the old days of largely regulated traffic, handled by a small number of large “major” carriers – United, British Airways, Air France, Lufthansa, KLM, American Airlines, etc. – linear extrapolations were reasonably reliable. An airport could assume, say, an annual 2.6 percent increase in passengers and phase its investments accordingly. In the new dynamic, uncertain and unstable environment, the future is impossible to predict. There is no master model any more.
More consequential for decision-making is the reality that the authors of probably the leading book on the topic summarize as “The forecast is always wrong” – even in more stable times.26 They review the five-year forecasts of U.S. traffic made annually by the Federal Aviation Administration (FAA) over a forty-year period. The difference between the forecast and actual level of traffic was as high as 80 percent and half the errors were over 20 percent. They were under 10 percent in less than a third of the forecasts. Estimates are as likely to be too low as too high. One contribution of the Hartsmartin DE studio to airport planning centers on resolving this problem of basing modeling on forecasts that are very unlikely to be accurate and whose degree of inaccuracy cannot be reliably assessed. Rather than trying to make predictions, the studio models scenarios – rehearses potential futures – and shows the decision makers their impacts on costs, capacity, environment, etc. Instead of the traditional modeling strategy of looking to find the most likely forecast and then applying it to such planning issues as costs and capacity needs – assessments of which are very sensitive to the accuracy of the master model – this “what if?” approach is to ask: if we assume Forecast X, what would be the impacts on cost and capacity? Does it significantly change our decision options? How likely is this scenario?
26 de Neufville, R. and Odoni, A. “Airport Systems: Planning, Design, and Management”, 2002, McGraw-Hill Professional. This book is 883 pages long, with no verbiage. It shows the immense complexity of airport planning and operations.
Hartsmartin’s airport planning department commissioned the studio. Its managers saw the opportunity to move away from reliance on a master forecasting model whose results were inevitably contentious and also almost meaningless to many key stakeholders. The standard airport planning process uses many experts from inside the organization and from universities and consulting firms to generate forecasts and
analyses. The reports are voluminous and packed with numbers. The models are “one-offs” that take a long time to build and a long time to change. Their results are presentations, a largely one-way expert-to-decision-maker monolog followed by question-and-answer sessions. The main goal for the DE studio was not to extend the scale and scope of expert modeling and use of information but to make this an interactive process of expert-plus-decision makers.
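A minimal sketch, in Python, of the shift described above from a single extrapolated “master” forecast to rehearsing several demand scenarios and their consequences. The growth rates, capacity figure and cost rule are invented for illustration; a real suite would draw them from its integrated models and database.

CURRENT_PASSENGERS_M = 40.0    # million passengers per year today (invented)
TERMINAL_CAPACITY_M = 52.0     # assumed capacity of the existing facilities
NEW_RUNWAY_COST_BN = 3.0       # assumed cost of the expansion option being tested

scenarios = {                  # scenario name -> assumed annual growth rate over ten years
    "extrapolated trend": 0.026,
    "low-cost boom": 0.060,
    "hub carrier retreat": -0.010,
    "stagnation": 0.005,
}

print(f"{'scenario':20s} {'pax in 10y (m)':>15s} {'over capacity?':>15s} {'cost (bn)':>10s}")
for name, growth in scenarios.items():
    passengers = CURRENT_PASSENGERS_M * (1 + growth) ** 10
    over_capacity = passengers > TERMINAL_CAPACITY_M
    cost = NEW_RUNWAY_COST_BN if over_capacity else 0.0
    print(f"{name:20s} {passengers:15.1f} {str(over_capacity):>15s} {cost:10.1f}")

The question asked of each row is the one in the text: does this scenario significantly change the decision options, and how likely is it?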
Governance: Meaningfully Involving Stakeholders
A major focus in the Hartsmartin DE initiative was also the most challenging: ensuring effective decision governance. This meant going beyond a “closed door” group of planners and executives and designing the studio and suite to help bring into the process the wider communities of stakeholders whose inputs, priorities, values, opinions and influence all affect the “decision” not as a recommendation but as a commitment that can lead to effective action and implementation. The easiest way to reach a recommendation is to exclude parties. That facilitates agreement, avoids unnecessary conflict and moves matters forward more quickly. The justification is typically that the group can reach a consensus and present a case to the stakeholders that will be convincing and compelling. It can be followed up by “town meetings” where parties have their chance to express views and – often – dissent. This is the classic process for public sector decision-making, with public “hearings.” Its corporate equivalent is the presentation to the Executive Committee or Board of Directors.
The DE services approach strongly challenges this tradition. A core principle is to design the decision process to help include all relevant stakeholders early and often. This is why the central element of the Hartsmartin studio is the use of real-time, visual simulation models, ones that quickly and simply display the results of projections and scenarios at the level of detail most appropriate to the moment and the participants. We emphasize this point, which is one of the core tenets for Decision Enhancement and one that differs significantly from traditional management science and strategic planning methods. These typically rely on a report that summarizes a large number of data analyses, simulation runs, and graphical displays of results, presented at a single level of detail and through a fixed “objective” viewpoint. In many instances, these master plans freeze the decision process in that it is very difficult to bring in new ideas, test scenarios and options on the fly, and let stakeholders choose the level of detail that they need to participate effectively.
Airport operators, for instance, need to move quickly from a policy view of a potential option, such as the cost of an extra immigration and customs checkpoint, to a detailed examination of the impact of the change on passenger movements through the airport at peak times, the average waiting times at baggage claim, and the effects on taxi traffic. That level of detail is irrelevant, irritating and meaningless for other stakeholders in the decision process, such as the airline marketing executive who wants to get a sense of what this means for average and peak flight turnaround times, or the customs and immigration service officials who need to know the implications for staffing and their own costs. The old adage in production and retailing about the end of the “one size fits all” philosophy and the move to mass customization has a DE equivalent: the end of “one view fits all” – one model, one level of detail, one set of information tools.
This aspect of the studio and suite changes the governance options for a decision process. By ensuring that the “views” of the planning, forecasting, analysis and information evaluation of a highly complex decision problem can be (1) shared, (2) tailored to the situation and the people, (3) updated on the spot (the real-time element of the suite) and (4) viewed at multiple levels of display, the studio facilitates an inclusive approach to decision governance, instead of an exclusive one. We showed examples of the displays in our discussion of the landscape context of airport planning earlier in this chapter, using the Netherlands as illustration.
Hartsmartin’s stakeholders fall into six main areas of concerns, issues and values that often conflict with each other:
• Airport operations: cost- and efficiency-centered. The parties include air traffic control, ground operations, airport authorities, food services and shops, security units, internal transportation services, freight and cargo operators, etc.
• Passengers: convenience-, time- and price-focused: business, leisure, corporate, and transfer passengers – forty million of them a year.
• Environment: control-focused and largely anti-growth: local residents, public interest groups, conservation groups, competing airports, competing and feeding transport modes, such as rail.
• Government: economic- and socially-focused: city councils, customs and immigration, regional development
agencies, domestic and international regulators, Ministry of Finance, Aviation, Environment and many other agencies.
• Airlines: company- and industry-focused: hub carriers, national carriers, local services, international carriers, cargo and freight handlers, charter flights, low-cost carriers, commercial aviation, etc.
• Investors: revenue- and growth-focused: shareholders, banks, bond agents.
All of these stakeholder interests get strongly expressed. In the UK, in mid-May 2003, the National Trust, a leading conservation organization, announced that it would mobilize its three million members to oppose any addition of new runways at any of the country’s southern airports. In Japan, riot police routinely had to be brought in during the many years of development of the new Tokyo airport to break up often violent demonstrations. The financial community expresses its interests by rating the bonds that fund the expansion. Airlines pull out of hubs if their interests are not taken into account.
Blueprints and Configurations
Decision processes need mandates, structure, discipline and support if they are to be more than brainstorming and free-form opinions. We term these the blueprints and configurations that direct the decision process. In the next chapter we review a famous decision-making fiasco in airport planning, the notorious Denver International Airport (DIA) case. This DIA example was built on a blueprint of a mandate of “We will build a new airport. Design and implement a plan.” This is a blueprint for problem-solving. The list of decision topics falls naturally out of it. In the Denver case, the decision mandates were to handle the financing, meet capacity forecasts, and design the airport to pre-specified requirements about the number of runways, their North-South orientation, and handling of wind direction.
The Hartsmartin decision mandate was one that demanded problem-framing rather than problem-solving. The airport Executive Board did not pre-bound the decision space. They knew that expansion was essential but did not zero in on a single option. Instead, they instructed the planning team to identify options given a variety of criteria that would require trade-offs, many of them highly subjective and value-based. The blueprint for the studio was built around two factors of Process and Content with six “C”s:
• Content: Cost, Demand, Capacity and Constraints (environmental, social, regulatory)
• Process: Consistency, Cooperation, and Consensus and trade-offs.
Figure 3.7 below shows the “balanced trade-off” approach that constitutes the framing for the decision process. It sets a direction for problem-exploration rather than problem-solving.
Figure 3.7: Balanced Trade-off of Different Stakeholder Objectives in an Integrated Business Planning Process
Trade-offs are made by people, not computer models or planning methods. The studio is designed to bring as many people into the process as is practical. Figure 3.8 shows the “actor network” for which the studio is in effect an invitation to participate and the suite the modeling, information and display tools that make that participation easy and effective. The roles are very varied:
• Airport management is focused on decision-finding – options and opportunities – and on the best trade-off between capacity, cost, revenues and constraints. They are the core team in that they must take the broadest view of the decision situation.
• Airport planners take more of an implementation view: their mandate is to plan, design and realize. The trade-offs they address are the balance between function, form, costs, flexibility and adaptability of facilities.
• Investors are revenue-focused.
• Government takes the role of approval, with its focus being on the trade-off between economic effects and social and environmental goals.
Figure 3.8: Actor Network – Airport Planning
• Airlines pay for airport services and run their operations at them. Their priority is lower operational costs and improved opportunities for service.
• Passengers use the airport and want streamlined handling processes and good access.
• Air traffic control places a non-negotiable priority on safety of en-route traffic and a less strict emphasis on efficiency.
• Non-aviation businesses, such as freight forwarders, package handlers, and food services and shops, depend on the airport for revenues. That is their central priority.
• Local residents are the parties in the actor network most likely to oppose expansion and also the ones most likely to organize to do so. Their focus is on minimizing environmental impact and being reimbursed for the impacts of expansion, which literally may change their lives.
A simulation model or planning document can never adequately and dynamically include the perceptions and priorities of all these parties. Nor can any mathematical formula easily incorporate their trade-off judgments. For example, the views of government and residents are very different from those of air traffic control and non-aviation businesses. The relative influence of the actors also varies widely and often unpredictably; a change in government, for instance, may lead to a substantial redefinition of the framing of the decision context. How can the decision process accommodate all these considerations? The question is rhetorical, of course, in that it rarely does. The design of the Hartsmartin suite of integrated tools ensures that no one viewpoint is automatically blocked because the model or data is set at a given level of detail and perspective.
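To suggest, in deliberately simplified form, how the same scenario can be laid side by side against different actors’ priorities rather than collapsed into one “objective” number, here is a hypothetical Python sketch; the weights and impact estimates are invented placeholders that, in a real studio, would come from the stakeholders themselves.

scenario_impacts = {         # estimated effects of one option, e.g. "build a fifth runway"
    "capacity_gain": 0.8,    # all values 0..1, invented for illustration
    "cost": 0.7,
    "noise_increase": 0.6,
    "revenue_growth": 0.7,
}

stakeholder_weights = {      # what each actor cares about; a negative weight means "less is better"
    "airport management": {"capacity_gain": 0.4, "revenue_growth": 0.4, "cost": -0.2},
    "investors": {"revenue_growth": 0.7, "cost": -0.3},
    "local residents": {"noise_increase": -0.9, "capacity_gain": 0.1},
    "airlines": {"capacity_gain": 0.5, "cost": -0.5},
}

for actor, weights in stakeholder_weights.items():
    score = sum(weight * scenario_impacts[factor] for factor, weight in weights.items())
    print(f"{actor:20s} {score:+.2f}")

The output is deliberately not summed into one figure: the trade-off remains a judgment made by people, with the tabulation only making the disagreement visible.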
The Studio
The focus of production for the Hartsmartin studio was the use of an integrated set of models. It was driven by a set of decision agendas, topics to explore and questions to answer. The facilitator in this context coordinated the interplay between the participants and the tool suite. This required substantial understanding of the model structures and how to employ the supporting tools. The studio has to establish the guiding framework for handling the process: the ground rules for participation and the process rules to handle phasing, ensure scheduling, and avoid either getting stuck in a loop or moving on too early and leaving some issue unresolved. The studio also requires careful design of the setting. The room used for this studio is housed at the TU Delft campus and – of course – is set up for high-speed telecommunications. It includes a wireless local area network to which participants connect laptops. It needs a large screen display, and even the choice of table and lighting matters. Comfort, an invitational setting, ease of use and encouragement of communication are all key variables here. Again, the analogy to a TV studio is a direct one.
This studio was designed around “what if?” evaluation of scenarios. This may be thought of as its script. This was tight enough to ensure discipline and relevance and loose enough to enable exploration and creativity. Figure 3.9 below shows the organizing theme, which is the airport’s annual report. There were several reasons for choosing this. First, it is the best way to incorporate all the revenue, cost, and operational performance requirements in the search for a “balanced” trade-off among all the network actors’ values and priorities. It literally frames the discussion, providing a common reference point. The scenarios explored within the strategic agenda cover all the values, issues and stakeholders. The choice of the annual report for year 2007 as the common reference point gives the analyses a shared base for comparison.
Figure 3.9: ‘What If’ Analysis with Different Scenarios to Estimate the Efficiency/Profitability of Decisions
The route to making such comparisons and interpretations is via visualization. For the Hartsmartin studio, the choice of visual displays to support visual thinking was simple: layered animation views of the airport supported by analytic modeling, planning and simulation tools. For any scenario, the decision-maker participants can see the airport as it will be. At the strategic planning level, the view is of the airport in its geographic context of roads, residents, traffic flows and transportation systems. The “what if?” analysis here shows the aggregate impacts of a policy choice. The second level of analysis is at the tactical level and sharpens the focus on the airport structure itself; it provides a
representation of what the airport would be if this scenario were to define the master plan. The third view is far more detailed and at the project planning level; it shows the multiple subsystems of the terminals: gates, check-in facilities and immigration and customs points, for instance. The lowest view is at the operational/logistic planning level, and provides the detail needed for process evaluation and design. Figure 3.10 summarizes the layers and their role within the decision process – rough estimation, analysis, and simulation/animation.
Figure 3.10: Tool Usage at Different Airport Planning Levels
The ability to move smoothly and quickly between these layers is core to involving stakeholders. It is architecture. The word “architecture” is somewhat loosely used in the information technology field. We employ it here in its original sense of the design and construction of buildings: a set of layered frameworks for moving from function to form. The top level addresses function; the architect works with the client to get an idea of the appearance of the house, how it fits into the surrounding land, its scale, etc. This process is highly visual, relying on sketches and, in the era of CAD/CAM tools, three-dimensional representations that can be rotated, enlarged and quickly modified. At some stage in the client/architect collaboration, the process shifts from function to form, with ever-more detailed blueprints and diagrams. The builder cannot work from the top-level sketches. The client cannot make much sense of the plans that show electrical circuit connections
and wall supports. Similarly, with the DE visual architecture, there is a layer of view relevant to each stakeholder group. The government and investors want to see the aggregate, top-level view; if we change this and add this, what is the impact on noise and traffic congestion? By contrast, the airport planners, like house builders, must see this aggregate approximation translated into details. In general, simulation models and information systems do not layer their views dynamically and in real-time. A key principle in the development of the DE suite was to separate the simulation from its displays. This means that the same scenario can be viewed on demand at any – and all – levels of aggregation. If we think of visualization as a form of abstraction, then this is essential. Forcing a type of display on decision-makers, as many models do, imposes on their thinking processes. For the airport project, the visualization was supported by many other tools, with each appropriate to the layer. These included access to ERP systems, databases and legacy systems and links to spreadsheet tools.
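A minimal, hypothetical sketch of this principle of separating the simulation from its displays: one (invented) set of simulated passenger flows, summarized on demand at a policy, tactical or operational level. It is an illustration of the layering idea only, not the DE suite’s implementation.

from collections import defaultdict

# One invented simulation result: hourly passenger counts per terminal area.
simulated_flows = [
    ("Terminal A", "check-in", 7, 3200), ("Terminal A", "security", 7, 2900),
    ("Terminal A", "check-in", 8, 4100), ("Terminal B", "check-in", 7, 1800),
    ("Terminal B", "security", 8, 2200), ("Terminal B", "check-in", 8, 2500),
]

def policy_view(flows):
    # Top layer: one aggregate figure for executives, investors and government.
    return sum(count for *_, count in flows)

def tactical_view(flows):
    # Middle layer: totals per terminal for airport planners.
    totals = defaultdict(int)
    for terminal, _area, _hour, count in flows:
        totals[terminal] += count
    return dict(totals)

def operational_view(flows, terminal):
    # Detailed layer: per-area, per-hour counts for operations staff.
    return [(area, hour, count) for t, area, hour, count in flows if t == terminal]

print("policy     :", policy_view(simulated_flows))
print("tactical   :", tactical_view(simulated_flows))
print("operational:", operational_view(simulated_flows, "Terminal A"))

Because every view reads from the same simulated result, switching layers never requires re-running the model; the display is an on-demand abstraction, not a separate artifact.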
The suite
The suite of IT tools must correspond to the key individual business processes of the airport. It must also be an integrated set of models that links the processes. So, for example, flights affect real estate, which affects revenues, which affects investments. Equally, market scenarios affect desired flight capacity, which affects actual flights. There is no master model or system that captures all the factors and business processes. Instead, the Hartsmartin suite provides four sets of models linked to a common database:
• Demand model set: market scenarios, freight, passengers, desired schedule
• Capacity models: “airside”, “landside” and “noise” capacity scenarios, regulatory constraints and impacts, capacity/quality
• Policy models to match supply and demand
• Financial models: revenue, cost, investment
The requirements for choice of tools for the suite were very demanding and it is only within the past few years that the technology architectures needed to build and link them have been available. Here are the essential needs, with brief comments added:
• The models must be integrated and work on a consistent database, with communication between toolkit components and decision-makers via well-defined interfaces. Historically, decision support tends to have been built on stand-alone systems and tools that do not directly, automatically and simply link to each other. The situational and often ad hoc nature of “what if?” analyses and planning often requires access to multiple sources of data. Integration across data structures has until recently been the single most complex problem of information technology management, especially when much of that data is in old, disparate and incompatible legacy systems.
• The models and information services must be accessed through a common, interactive, intuitive and sophisticated interface. To the user, the interface is the system. Multimedia visual interface design is one of the most promising areas of development in the information technology field and, alas, the quality of most IT systems interfaces is well below that of the typical mobile phone, computer game and digital camera. Far too many websites are poorly designed, in terms of ease of use, freedom from clutter on the screen and navigation.
• All the tools must be transparent to the non-technical user. Users should need to know only as much about the suite as they do about electricity when using electrical tools, and be able to naturally and simply handle inputs, processing and outputs. They should not be suddenly made aware of some technical element of the design of the suite which makes them have to adapt to the system instead of the other way round. Decision Enhancement services are discretionary to decision-makers and studios are invitational environments where they determine the level of collaboration and commitment they will make to the process. They tune out and often walk out of the studio when non-transparency reminds them that this is a “computer” and “technology.”
• The range of specific tools should be consistent with the layering of the levels of planning, analysis and detail described earlier in this chapter. No one tool can meet the very varied needs, styles and interests of a very varied network of decision actors. An effective
studio and suite needs to be able to draw on analytic tools, simulation tools, CAD (computer-aided design) tools, spreadsheets, graphs, animation, video, database management, etc. These must be integrated (the first requirement on our list), must be accessed through an appropriate interface, and must be transparent. The suite that meets these requirements combines usefulness (analytic tools, simulation models, information search and analysis services), usability (interface design, transparency) and usage (the studio environment, facilitation).
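As a minimal, hypothetical sketch of the kind of linkage described above – four small model sets coupled through one shared data store so that a change in demand assumptions flows through capacity, policy and financial results – consider the following Python fragment. Every relationship and number in it is invented for illustration.

store = {"annual_flights": 500_000, "growth_rate": 0.03, "fee_per_flight": 900.0}

def demand_model(db):
    db["projected_flights"] = db["annual_flights"] * (1 + db["growth_rate"]) ** 10

def capacity_model(db):
    db["runways_needed"] = -(-int(db["projected_flights"]) // 180_000)   # ceiling division

def policy_model(db):
    # Flag, rather than hide, an assumed noise-driven cap of four runways.
    db["noise_cap_exceeded"] = db["runways_needed"] > 4

def financial_model(db):
    db["aeronautical_revenue"] = db["projected_flights"] * db["fee_per_flight"]

for model in (demand_model, capacity_model, policy_model, financial_model):
    model(store)

print({key: store[key] for key in
       ("projected_flights", "runways_needed", "noise_cap_exceeded", "aeronautical_revenue")})

The design point is the consistent database: because each model reads and writes the same store, no stakeholder’s view is computed from stale or private numbers.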
Lessons from the Hartsmartin DES initiative
The Hartsmartin project is ongoing. It is ambitious and large in scope. It poses complex technical issues. Most of those are resolvable through architecture, building blocks and the development team’s skills. What is far more difficult to address are issues of decision process. Some stakeholders remain on the sidelines. Many bad organizational habits that lead to poor processes – and poor likely outcomes – remain. There is the constant problem of validation – technical validation of the model itself and of the reliability of the data it uses, plus psychological validation. This term, introduced in the very early stages of the DSS movement,27 refers to trust. Coming back to the points we made in Chapter 1 about how little interactive IT capability is used in decisions that matter, much of this reflects decision-makers’ discomfort in letting go and accepting both the suite and its models and the studio and its processes. That trust is emerging as all elements of the DE service come together – as they must: the quality, flexibility, responsiveness and communication of the suite plus the facilitative methods of the studio plus depth of understanding of the decision domain. It is this last element that is the critical one now: understanding effective decision-making. That is the topic of our next chapter.
Before we get to it, we must admit that most of the management literature and case examples that we have reviewed in our studies suggest that decision-making in organizations for decisions that matter is marked more by frequency of failure than of success. To some degree, this may be because screw-ups are interesting to report on and draw lessons from. They are eventful, with crises, blatant mistakes and “personality”
27 Keen, P.G.W. and Scott-Morton, M.S. “Decision Support Systems: An Organizational Perspective”, Pearson Addison Wesley, 1978.
whereas effective decision-making is almost boring: sensible people do sensible things and it all works out. Can you recall a single recent newsworthy story about big screw-ups in Dell, Wal-Mart or Southwest Airlines, for example? Their decision disciplines keep them well on track – and boring.28 All in all, the evidence is that decision-making is very difficult, very error-prone and very inefficient in most organizations. That makes effective decision processes a source of sustainable competitive edge.
The main contribution that the Hartsmartin studio is making to airport development decision-making is to shift the nature of the process, adding key elements of agility beyond speed: a new flexibility, a high degree of coordination, opportunities for collaboration and innovation in thinking about the decision arena and how to best achieve results. The facilitative components of this enhancement could be provided through many standard Group DSS environments and methods. What the suite capabilities add is well beyond the reach of most such studios: the comprehensive, integrated real-time technology base for meeting the information, analysis and communication needs of any group of actors, with zero latency. By themselves, the simulation models would be just another, albeit powerful, “system.” By itself, the studio would be just another change management exercise, albeit skillfully facilitated. Without the focus on landscaping and governance, both would ignore and omit stakeholders and exclude many key issues in this multi-actor decision arena.
28 That said, even a Wal-Mart may suddenly need to mobilize to address new decisions that matter. Its labor practices have come under attack, with a discrimination class action suit being brought on behalf of female staff in mid-2004. It has taken many new decisions as a result.
PART II: DECISION-MAKING
Chapter 4
What We Really Know About Effective Decision-Making
In this chapter of Decision Enhancement Services, we step back from application to concept. We ask and give our own answers to four core questions that need academic guidance but that are not at all academic in their implications for practice:
• What is a decision?
• What are the main perspectives on effective decision-making?
• Why do so many decisions fail?
• Are there reliable guidelines for building effective decision processes?
Before discussing Decision Enhancement services in any detail, it is obviously essential to pin down just what a decision is. Otherwise, it is meaningless to talk about “improving” decision-making. That turns out to be less simple than commonsense might suggest. One of the main surprises that we, the authors of Decision Enhancement Services, veterans of the DSS field, consultants to senior managers and university professors, have encountered in the work that led to this book is just how little discussion of decision-making there is in the fields of business strategy, decision technologies and organizational change.
To a large degree, decision-making is viewed on an axiomatic basis, with two clear extremes along a spectrum of descriptive perspectives and prescriptive ones – the “is” and “ought to be” lenses. Axioms begin with “Well, of course.” The descriptive axiom is: “Well of course decision-making is a process of strategic planning followed by execution”; it is social in nature. The prescriptive one is: “Well of course decision-making is the search for optimal solutions to a problem situation”; it is a rational
process. One party’s axioms – articles of faith – are another’s heresies, and the people-computer, organizational-analytic, political-rational distinctions will continue to divide views of just how decisions are made or should be made. Perhaps they should continue to do so. Descriptive realism plus prescriptive discipline can make for a powerful combination. That is our own focus. We see far too many strongly-held positions that dismiss the opportunity: the attacks in the social sciences on the computer “mentality” and the neat but too often unrealistic logical schemes in the analytic professions. DE tries to provide a fusion of these, with studios being designed mainly around the descriptive side of decision-making axioms – “this is how decisions really get made, thus support the process” – and suites around the prescriptive one – “there really is a better way of handling this, so let’s try and enhance the process”. To meet in the middle, studios have to address the issue of analytic and intellectual rigor that moves the facilitative meetings from “warm and fuzzy” participation that leads to inconclusive outcomes to a more systematic process. Equally, as we showed in the Hartsmartin case example, suites have to mesh into the individual, group, organizational and stakeholder dynamics of the process. In both instances, the move is via process. One of our main conclusions in reviewing a wide range of theoretical, conceptual, and applied work in a range of contexts is simple: effective decision-making comes from the design of processes, not the search for “the” solution.
To achieve any synthesis between the descriptive and prescriptive conceptions of decision processes, it is obviously important to understand both viewpoints. We do not see much of such an understanding in the professional fields that all must contribute to DES for it to be effective in its impacts. For this reason, we step back in this chapter and provide an overview of what we really know about decision-making. Our main aim is to stimulate thinking and discussion.
We begin our effort to provide practical and useful answers to our core question of “what we really know about decision-making” by showing the impact of process on decision outcomes. We do so with a cautionary tale of a famous decision disaster in airport planning that contrasts in just about every way with the approach we described in the Hartsmartin case. The story of the building of the Denver International Airport is well-documented and provides a direct and useful contrast to the DE studio approach. In the brief summary that follows, we want to draw attention to the fact that the disasters were not caused by stupidity, lack of effort or laziness. Many very capable and experienced individuals and organizations worked on the venture. The failure was one of the decision processes.
The Denver Disaster
The decision made in the mid-1980s to build a new airport at Denver was prompted by very much the same issues as at Hartsmartin: deregulation, competition, and capacity constraints at Stapleton, the existing hub. Stapleton was the fifth busiest U.S. airport at the time and ranked in 1988 as one of the ten worst bottlenecks in the air transportation system. Harold Kerzner29 provides a succinct case summary of the DIA project. He begins his analysis with the question: “How does one convert a $1.2 billion project into a $5 billion project? It’s easy. Just build a new airport in Denver.” Here is the chronology, with our own short comments added.
In 1985 Mayor Federico Pena and city officials agreed to build a replacement to Stapleton airport, Denver International Airport (DIA), at a projected cost of $1.2 billion. The decision was closely identified with Pena’s leadership and advocacy. A leading consulting firm produced a feasibility study in 1986 that included traffic forecasts. United and Continental publicly announced their objection to the idea of building a new airport, fearing the added cost burden. Opponents to the proposal contended that expansion of Front Range Airport, located just 2.5 miles southeast of DIA, would be a far better choice. It was clear that Front Range would be a direct competitor to DIA. In 1987 Denver voters passed a referendum approving the new airport at a cost of $1.7 billion, with Pena’s reputation and commitment a key factor in this initial “success.”
Just about every rule of governance that we suggest for DES was ignored here. A small group pre-decided the initiative; there was no search for additional options, outside parties’ interests were discounted, and a standard consulting study – and single forecast – established the boundaries for the design and implementation of DIA. This command and control approach meant that very basic issues key to the effectiveness of the decision had not been addressed. Those issues moved to the fore when the decision process moved on to select an architect for the design of DIA. Goal conflicts that were previously hidden came immediately into play. Airport personnel looked for an “easy-to-clean” airport and persuaded the city planners to hire a New Orleans firm. The city wanted “a thing of beauty.” The New Orleans firm complained that the directions given to them were in conflict –
29 Harold Kerzner, Project Management: A Systems Approach to Planning, Scheduling, and Controlling, Wiley, 2001, pp. 638-670.
operational versus aesthetic issues. The design was, however, approved by the mayor and city council on the recommendation of a blue-ribbon commission. Financial analysis showed that the roof design proposed by the firm would push the project off schedule and add costs of $48 million. A second architecture firm was brought in with responsibility to take the venture through to completion. Meanwhile, the effectiveness of the entire project team was being questioned. It had failed to sort out the differences between the maintenance orientation of the airport operators and the city’s aspirations. Nor did it detect cost and compatibility issues in the first design “even though the PMT (Project Management Team) had vast in-house expertise.”30 The burden of decision responsibility was falling on the architects.
Throughout the design battle, no one had heard much from the airlines. United and Continental controlled 80 percent of flights in and out of Stapleton. They refused to participate in the process, probably hoping that the project would be cancelled. The city ordered the design team to go forward with bids without any formal input from the carriers. To entice the airlines, the city then agreed to a “stunning” range of major changes, including moving the gates for international flights from the main terminal to Terminal A and building an unplanned bridge between them. United demanded a new baggage handling system and an expansion of the width of the concourse to permit two moving walkways. Neither Continental nor United signed a firm agreement with the city, which meant that the interest expense on the bonds to pay for DIA was higher than anticipated, as rating agencies added a risk premium for a project that was now way behind schedule. “From a project management perspective, there was no question that disaster was on the horizon.”
No one knew what to do about the baggage-handling system, which is now legendary in the information technology Fiasco Hall of Fame. “No one realized the complexity of the system, especially the software requirements... The contract for DCV [the baggage-handling system] hadn’t been awarded yet, and terminal construction was already under way.” Car rental companies got into the act, objecting to the fees for their terminal location and demanding and obtaining a separate campus. That meant that passengers would now have to take a ride on a shuttle bus. Construction began in 1991, still with no signed contract with United and Continental. Standard and Poor’s lowered the rating on the bonds to BBB-, just above junk status. Continental filed for bankruptcy. United demanded $200 million from the state in order to keep its hub in Denver
30 de Neufville, R. and Odoni, A. “Airport Systems: Planning, Design, and Management”, 2002, McGraw-Hill Professional, page 647.
and to build a maintenance facility (which in the end it located in Indianapolis). Hotel developers put new construction on hold; DIA was 26 miles outside the city. UPS and FedEx started a “dog fight” with the airport planners over their move to locate their operations at Front Range, where the cost of renting was one-third that for DIA. BAE, the software arm of one of the world’s leading aerospace firms, was contracted in 1992 to design and build the baggage-handling system, agreeing to complete what had been estimated as eight years of work in just two years, in order to meet the October 1993 opening date. Things got worse. Much worse! There were growing personnel stresses and complaints, software problems with DCV, and another lowering of the bond rating by S&P to “non-investment grade.” A merchants’ association was formed by concessionaires to fight for financial compensation for the lost revenues that the four-time delayed opening had cost food operators and shops. Federal grand juries began investigations into faulty workmanship and accusations of falsified records at DIA. BAE and the city sued each other, as the baggage-handling non-system became national news. When the airport finally opened, in early 1995, airline costs per passenger were estimated to be three times those for the old Stapleton. The original $1.2 billion investment had ballooned to over $5 billion – and many parts of the airport were already falling apart, with rain seeping in through the ceiling.
The DIA decision process was a fiasco. The Hartsmartin process will not be a fiasco. This is not in any way because it uses simulation models that explore scenarios – that is the suite of tools, not the studio process. The DIA project management team, airport planners and the consultants who were brought in all made use of simulations and information systems. The difference is in how, where and why simulation modeling is used to improve the decision process. The expert and command-and-control approach largely uses them to model a “problem” and reach a “conclusion.” The DES perspective is to explore the opportunity space and facilitate a shared understanding that leads to collaboration that leads to a conclusion. With the DIA project, the New Orleans architecture firm took over responsibility without any of this having happened. There were surprises everywhere. The airlines played no role in the process until far too late. Paul Nutt, who cites the DIA case as a gigantic disaster in his book Why Decisions Fail, highlights the critical importance of encouraging active and meaningful stakeholder involvement right at the start, rather than achieving apparent early consensus and only later attempting to convince and co-opt potential naysayers.
This is not that easy to achieve. In our own work at Hartsmartin and several other airports, we found a natural tendency for a small group to take over the studio and for others to waver in their participation. For example, the public is as yet weakly represented. In some instances, key parties avoid entering into a collaboration that may lessen their own bargaining power or signal a commitment to the decision that they are unwilling to make. In the DIA instance, United and Continental were kept outside or chose to keep themselves outside the planning and design. The most obvious and far-reaching failure was to exclude key stakeholders. This led to a limited search for options and information, with no modeling for insight and understanding. It generated an "edict" approach to decision-making that at best leads to indifference and at worst later stirs up many social and political forces. The emphasis on expert planning turns the process into a persuasion approach that has little effect on people with something to lose. When smart people produce bad results, change the process.
Bridging the Gap: Prescription and Description
People have opinions; spreadsheets don't. People want to make sure their opinions and ideas are included in the process at all times. They limit their use of information, for good and bad reasons. They will not easily subordinate their judgment to normative models and methods. They have partial knowledge and need collaboration with others who have additional knowledge. They are very sensitive to the time demands of decision-making. They have personal interests. They operate in a "political" sphere. And so on. The good reasons for doing this are their confidence in their judgment. The bad reasons are their confidence in their judgment. There are many experts in organizational decision-making who argue that these process realities largely inhibit the value of analytic methods and IT-based tools. Here is one such view, from a very influential social scientist, Thomas Sowell, whose book title, Knowledge and Decisions, would on the surface appear to endorse the usefulness of information-based methods. Instead, he basically dismisses the entire analytic tradition: "Highly rational intellectual "models" of human behavior suffer from an air of unreality whenever these hypothetical, computer-like incremental adjustments by coolly calculating decision-makers are compared to the flesh-and-blood reality of decision by inertia, whim, panic, or rule of thumb. In reality, rational principles themselves
suggest a limit to how much rational calculation to invest in."31 (Our emphasis added.) Russo and Schoemaker, the two authors of Winning Decisions, provide a counter to Sowell's viewpoint and his focus on personality, intuition, brainstorming, and interpersonal trust in decision-making. They argue for a "cognitive" view. What makes their views highly relevant to DES is that their work is based on the Shell scenario planning studios that were in many ways the first sustained and large-scale decision-making studios used in handling Shell's decisions that matter.32 They point to the many studies that highlight the limitations of human information-processing: intuition, bias, "recency" (overweighing some information on the basis of experience and familiarity), overconfidence, and the well-documented weaknesses of humans in probabilistic reasoning. (Studies consistently show that experts in such fields as weather forecasting and probation assessments perform no better than tossing a coin to make a pick.33) They comment that when intuition works, it is great but that "to those that study decision-making, the most striking feature of intuitive judgment is not its occasional brilliance but its rampant mediocrity." An air of unreality and rampant mediocrity amount to a bleak prospect for improving organizational effectiveness. Decision Enhancement must bridge the two extremes of description and prescription. The starting point is to decide first what a "decision" is.
What Exactly are "Decisions" and "Decision Making"?
What exactly is effective decision-making? One of the most difficult aspects in pinning down an answer to this question is the necessary distinction between a successful decision process and a successful outcome. At most, process and outcome are correlated, not directly causal, in that the more effective the process, the more likely it is that the resulting decision will produce success, but there are many instances of a terrible process resulting in a lucky outcome, or a very skilled process producing a well-thought-out decision that fails for any number
31 Sowell, T. "Knowledge and Decisions", Basic Books, 1996, page 100.
32 Schoemaker, P.J.H., and Russo, J.E. "Winning Decisions: Getting it right the first time", Doubleday, 2002, p. 116.
33 A simple example of how weak human judgment can be in making estimates is the following test: what is the approximate result of the calculation 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8 x 9 x 10 x 11 x 12? The typical answer is around 4,000. Shown the figures in the reverse order – 12 x 11 x 10 and so on – the estimate is closer to 40,000. The real answer is almost 500 million (479,001,600).
of reasons – some unanticipated change in the environment, ineffectual implementation, or a competitor's moves, for example. Given that 'decisions that matter' are, by definition, dominated by uncertainty, complexity, risk, and incomplete information, there will inevitably be many failures in outcome. Consider, for instance, the U.S. Federal Reserve Bank's decisions on setting interest rates or a European government's budget decisions. These are recurrent, draw on massive amounts of information, forecasts, analysis and expert opinion, and are clearly decisions that matter for the entire economy. Only the clichéd Monday Morning Quarterback is in a position to judge their effective results, and then often well after the event. At the time the decision is made, no one knows the outcome and many people will disagree with it. From the perspective of Decision Enhancement, the starting point for effectiveness is to view decision-making as a process and to move new generation decision-making and new generation technology to new generation decision processes. A key step here is to distinguish between decisions and many related but different outputs from planning, collaboration and communication. Decision-making is not any of the following; although each of them may be a component of effective decision processes, they do not in and of themselves achieve the end point – a decision choice and commitment that can be implemented:
• Not a strategy: strategy sets up the governance and directional frameworks for decision-making in regard to priorities, markets, products, organization and finance and highlights decision domain targets for action, but it is not in itself a decision or set of decisions. By "domain" we mean a functional area such as marketing or human resources or a specific area of business concern, such as response to new legislation, capital investment planning, or launching a new change management initiative. Much of the organizational inertia summarized as "analysis-paralysis" reflects the difference between strategy and decision-making. A cynical colleague once explained to one of the authors of Decision Enhancement Services why he refused to add implementation work to his firm's strategic consulting services: "Strategies never fail. Execution does. So we never fail." Strategy is the front end of the decision process, not a complete decision process. The hard work of choice, commitment, and execution follows on from strategy. Effective decision-making sets up effective execution.
• Not a consensual agreement: Such agreements are effective only if they are a commitment to action. Otherwise, they are merely the equivalent of the "frank meeting of minds" and protocol mouthed by politicians on the White House steps or outside Parliament. They sound good – sometimes very good. They are often the well-formulated outputs from task forces that in essence state that "Here is our joint view on the best path to take." They rarely answer the obvious next question: "So who does what, when and how to turn the consensus into action?"
• Not a vision: Visioning is a central element of leadership: "The dream or vision is the force that invents the future."34 But visions are emphatically not decisions: they are direction-setting and establish a context for ambitious and innovative decision-making. They motivate decisions, enable decisions and shift priorities for decisions. But they are not decisions. Visions that do not facilitate these movements to commitment and action are fog and fantasy.
• Not a forecast or scenario: Decision-making addresses the future and decisions that matter must deal with a highly uncertain future. In that context, scenarios of the future play an important role in orienting the decision process and in establishing assumptions and constraints on options. Indeed, a core element of decision support has always been "what if?" analyses – What if we cut price by 5% and increase advertising by $1.2 million? What sales increase would we need to profit from these actions? One of the main reasons the spreadsheet has become such a routine tool in financial planning is that it changed the marginal economics of effort by reducing the additional time, money, information-gathering and analysis in trying out new scenarios. This encouraged exploration and creativity that was blocked by the previous time and workload involved. Scenarios, simulation and what-if tests equally improve the economics of effort and are integral to rehearsing the future but they are not in any way
34 Kouzes, J.M. and Posner, B.Z. "The Leadership Challenge", 1995, Jossey-Bass (a Wiley company).
decisions. Decisions come from the use of the scenarios in the decision process.35
• Not a recommendation: Recommendations and "optimal" solutions are an outcome of the analytic side of decision processes but, as the literature on leadership and change management makes very clear, the information-gathering, analysis and collaboration that leads to a recommendation is not enough. They must be complemented by mobilization for action. A fairly notorious syndrome in large organizations and government agencies is the use of planning groups and consulting teams to come up with recommendations that are in effect a pass-the-buck or CYB – cover your backside – tactic. The recommendations are noted, passed on to senior management or in some instances filed away and ignored. What should have been a significant contribution to making a decision is then wasted because the project was used to avoid a decision or as political ammunition.
What is missing from all these perspectives is the link between thinking things up and getting things done. Our own conception of decision-making is that it is a process that leads to a commitment to action. Commitment is key; we talk about making a decision, coming to a decision, being decisive and the like. Positioning for action is equally key. The challenge for strategists, leaders and teams is to mobilize the organization, gain support, reach agreements, make trade-offs and negotiate so that the decision can be implemented. A "good" decision process has two distinctive attributes: the analytic elements of information-gathering, modeling, reporting and evaluation, all of which benefit from formal methods and tools, and the social elements of collaboration, leadership – surely essential to decision-making for innovation – consensus-building, communication and mobilization. Decision-making is a process targeted to commitment and to enabling follow-on action. DE services aim at helping the move to commitment. This presupposes several conditions that do not always apply in decision situations and we need to be explicit about them. They are (1) that key stakeholders are open to collaboration and willing to share their
35 Lessons from Shell, the best-known innovator in scenarios. Over the past decade, the global scenario-analysis community has begun to combine the primarily qualitative and narrative-based scenario analyses undertaken by Royal Dutch/Shell and other companies. Sources: Schwartz, P., 1992: The Art of the Long View. London, Century Business; Wack, P., 1985a: Scenarios: uncharted waters ahead. Harvard Business Review, 5 (Sept./Oct.), 72–89; Wack, P., 1985b: Scenarios: shooting the rapids. Harvard Business Review, 6 (Nov./Dec.), 139–150.
information and be candid about their views and values, (2) that there is at least a broad agreement on both the goals of the decision process and some measure of an effective outcome and (3) that the results of the process will be actionable in that there are available resources and management authority to follow up on the decision. In most areas of organizational decision-making, these are reasonably realistic assumptions. They are less so in intensely “political” environments. These are the ones that Thomas Sowell works in, so that his cautions about the “unreality” of highly analytic intellectual models are worth keeping in mind. A key to effective Decision Enhancement is to first establish an environment where the three requirements we listed above apply: readiness of stakeholders to collaborate, clarity of overall goals and actionability.
Perspectives on Decision-making – or Lack of Perspective
Many highly influential modern business books ignore the subject of decision-making entirely and focus on thinking things up, rather than getting them done. For example, common sense says that for "strategy" to be effective, it requires decisions – decisions that very much matter when they involve innovation, shifts in competitive positioning, major capital investments, and organizational change. Yet decisions, decision processes and decision-making do not even appear in the indexes of the most influential books on business strategy (including Michael Porter's and Gary Hamel's).36 A leading textbook on "Economics, Organization and Management" first mentions decision-making two hundred pages into the analysis and, again, does not state what a decision actually is. The number of references to decisions, decision-making and decision processes in leading books with titles that very much point to the need for action, commitment and decisiveness is typically close to zero and in many instances zero: examples that could be multiplied by hundreds are The Wisdom of Teams: Creating the High-Performance Organization, The Nature of the Firm, and Competing For The Future: Breakthrough Strategies for Seizing Control of Your Industry and Creating the Markets of Tomorrow. The same is true – rather surprisingly – in the many recent books on leadership, which now constitute by far the largest single subject on any
36 Porter, M.E. "Competitive Advantage", 2004, Free Press; Hamel, G. and Prahalad, C.K. "Competing for the Future", 1996, Harvard Business School Press.
bookstore's business shelves: a complete absence of references to the topic in The Leadership Engine: How Winning Companies Build Leaders at Every Level, for instance.37 A compendium of extracts from books and articles on leadership that includes just about every major writer on the topic has just one discussion and that is about decision-making as a "defining moment" and character building for leaders. The Decision Enhancement lens extends these perspectives on organization effectiveness and in no way contradicts, competes with or substitutes for them. It simply suggests that decision commitment is integral to what they recommend and needs to be brought into sharper focus. This absence of formal attention to decision-making in business research and commentary is both recent and puzzling. It stands in marked contrast to the management literature of previous decades, where decision-making was at the very center of focus. Management thought was very much driven by what may be termed the "executive decision" model, as exemplified by the work of Chester Barnard, whose 1938 book The Functions of the Executive underlies what was for decades the most influential conception of management and organization.38 Herbert Simon and Peter Drucker, who strongly influenced both theory and practice, built on Barnard's perspective and almost axiomatically saw decision-making as the very foundation of management. Here are a few quotes from Barnard, Simon and Drucker that add up to the management mainstream from the 1960s through the 1980s: "The fine art of executive decision-making… The act of decision is part of the organization itself." (Barnard, page 77) "Administrative Behavior was written [1947] on the assumption that decision-making processes hold the key to understanding organizations… If any "theory" is involved, it is that decision-making is the heart of administration, and that the vocabulary of administrative theory must be derived from the logic and psychology of human choice." (Simon, new edition, 1997, pp. ix and xi)39
37 Tichy, N.M. and Cohen, E.B. "The Leadership Engine: How Winning Companies Build Leaders at Every Level", HarperCollins Publishers, 1st ed., 1997.
38 The preface of a book on a symposium of leading economists that celebrated the fiftieth anniversary of Barnard's work speaks of it as "timeless" and a core of modern organizational theory (Oliver Williamson (editor), "Organization Theory: From Chester Barnard to the Present and Beyond", Oxford University Press, 1995).
39 It is noteworthy that the very first page of Administrative Behavior points to a "general neglect of decision-making."
"The central theme around which the analysis has been developed is that organizational behavior is a complex network of decisional processes… The anatomy of the organization is to be found in the distribution and allocation of decision-making functions. The physiology of the organization is to be found in the processes whereby the organization influences the decisions of each of its members – supplying these decisions with their premises." (Simon, page 305) "Rational Decision-Making in Business Organizations" (the title of Simon's Nobel Memorial Lecture, 1978) "Decision-making is only one of the tasks of an executive… But to make the important decisions is the specific executive task. Only an executive makes such decisions. An effective executive makes these decisions as a systematic process with clearly defined elements and in a distinct sequence of steps. Indeed, to be expected (by virtue of position or knowledge) to make decisions that have significant and positive impact on the entire organization, its performance, and its results characterize the effective executive." (Drucker, On the Profession of Management, 1998, page 32) One final quote comes from one of the most influential books on organization, A Behavioral Theory of the Firm, written by two of Simon's colleagues with whom he co-authored a number of other books: "We start with the simple conception that an organizational decision is the execution of a choice made in terms of objectives from among a set of alternatives on the basis of available information. This leads to the examination of how organizational objectives are formed, how strategies are evolved, and how decisions are reached within those strategies." (Cyert and March, page 19). A strength of the Barnard-Simon-Drucker schools of thought was that they stressed management choice. A decision was a real decision. The decision-as-choice conception remains the common-sense one in many fields outside business. We routinely talk about medical, political and legal decisions in terms that highlight the characteristics of what we term Decisions That Matter – "tough" decisions, dilemmas, "being on the spot", "the buck stops here", and the like. Simon and his colleagues, collectively known as the Carnegie School, emphasize the search for
appropriate rationality in decision-making: clear objectives, decision structures, information, and discipline. This demands knowledge and application of not so much computers as mathematics. There is a wealth of math-based, computer-enabled models and methods for Decision Enhancement to draw on here in building on the logic of decision-making as a rational choice; these include analytic models, optimization, multi-criteria decision-making, statistics, econometrics, forecasting, expert systems (an area where Simon and his colleagues made many early major breakthroughs), game theory and probability theory. The title of Barnard's book, which remains one of the most influential in the business literature, captures its core logic: "The Functions of the Executive" (our emphasis added). Why did the executive decision model lose influence? Why is leadership – with no clear decision focus – so much the new mainstream of the literature on business research and practice? The explanation could, of course, be that it is such a routine and obvious activity that there is no need to define it except in fundamental theoretical research; people use the word "organization" very much in this way. But the oversight easily leads to a discounting of the nature of decision-making. Here is an instance from a first-rate book with the subtitle "A Seven-Point Strategy for Transforming Organizations", which devotes just four pages to a discussion of decision-making, with one comment on the role of participation: "One of the basic assumptions about participation is that better decisions will be reached if those who have knowledge about the issue being considered take part in the decision-making process."40 What are "better decisions" in this context? An agreement, a recommendation, a strategy, a plan, a consensus? If the outcome of a decision made in conditions of crisis, uncertainty and stress but with superb participation turns out to have poor results, was that a bad decision? What if the pressures to participate lead to avoidance of conflict and unwillingness to argue against the consensus, so that the decision is really the lowest common denominator of agreement? These questions are not in any way intended to dismiss the importance of participation. Indeed, Decision Enhancement shares the same basic assumptions as the book quoted above and the studios it provides are explicitly collaborative. But the nagging question remains: what is a decision?
40 Nevis, E.C., Lancourt, J. and Vassallo, H.C. "Intentional Revolutions: A Seven-Point Strategy for Transforming Organizations", Jossey-Bass, 1996, page 107.
The participation view sees it mainly as an agreement. Interestingly, a well-documented counter to the claim that participation leads to better decisions is the Bay of Pigs fiasco in the early days of the Kennedy Administration. Janis's analysis highlights "Groupthink" as the explanation for the literally disastrous decision arrived at in the highly participative process.41 The decision-as-participation model is at best incomplete. In the analytic fields – management science (from which traditional DSS evolved), economics, financial theory, and the areas of computer science that focus on simulation modeling – the conception is very different and in many instances pushes away from participation towards detached expert analysis: it almost axiomatically assumes that a decision is basically an intellectual solution to a problem. It is the best choice that a group of skilled people backed up with powerful analytic tools, computer systems and plenty of information can produce. That makes it a recommendation or proposal. Here there is a detachment between recommendation and action. At its weakest, the result is a report from the best and the brightest whose Executive Summary the real decision-makers skim before filing the report away in a drawer. The analytic tradition tends to ignore the social context of decision-making. The social sciences, by contrast, emphasize the context and process of decision-making, often highlighting as a virtue the very opposites of the analytic perspective: the importance of compromise, value trade-offs, mobilization of support even at the expense of analytic rigor, and the relative subordination of information to process. One influential book on decision-making in the political and social arenas that we quoted from earlier in this chapter, Knowledge and Decisions, claims that "Among the ways various decision-making processes differ is in the extent to which they are institutionally capable of making incremental trade-offs, rather than attempting "categorical" solutions."42 Sowell summarizes this as the difference between a car buyer's and an appellate court's decision process. He sees effective decision-making as resting on "crucial" feedback mechanisms in "a world where no individual or manageably-sized group has sufficient knowledge to be right the first time." He downplays the intellectual approach to decision-making: "Ideas are everywhere but knowledge is rare… Everyone is ignorant, only on different subjects" and "The knowledge-transmitting capacity of social processes and institutions is not how much information but how it is used."
41 Janis, I.L. "Groupthink: Psychological Studies of Policy Decisions and Fiascoes", 1972, Houghton Mifflin Co.
42 Sowell, op. cit., page xii.
We have highlighted the merits of the rational choice/executive decision model because it should remain a central element of Decision Enhancement. That decades of effort to embed its "academic" methods, tools, models and disciplines in the decision processes of managers facing decisions that matter have produced relatively few results should, if anything, encourage broader efforts to extend their usefulness through more usable tools and their use through more attention to both the analytic and social components of decision processes. An encouraging signal here is the history of the spreadsheet. In the late 1970s, what is now a mundane piece of PC software, with Microsoft's Excel the most widely used core of larger suites (access to databases, links to presentation tools, etc.), was an expensive and specialized area of financial modeling. It took real effort and many years to convince business managers that it was worth their while to trust these new tools and to make hands-on use of computers. Now, most managers literally could not do their jobs without them. The useful became usable and is used. One of the core goals of DE is to similarly help embed real-time simulation modeling in everyday management decision-making. The simulation software, mathematical formulae and technology base of the airport studio and suite presented in Chapter 3 of Decision Enhancement Services illustrate both its value and how to make it more usable. All the tools used in the airport studio have been widely available with many individual technical implementations, research papers and "proof of concept" applications. What distinguishes their Decision Enhancement extension is (1) their assembly into a suite that adds interactive and visualization capabilities, and (2) the facilitative, collaborative studio. Our overall view of simulation and modeling parallels that of Donella Meadows, a leader in the field of using modeling tools to influence the public debate on areas of environmental protection, energy policy and economic development: "Computer models do not make the job of decision-making any easier. They make it harder. They enforce rigorous thinking and expose fallacies in mental models we have always been proud of. We think it is worth it. We think it pushes our mental models to be a bit closer to reflecting the world as it is."43
43 Donella Meadows, "Groping in the Dark: The First Decade of Global Modeling", quoted in www.robbert.ca/gs/quotes.html, March 2003.
Decision Enhancement adds to Meadows's statement that models do not make the job of decision-making any easier but harder: therefore, make sure that they are embedded in studios that do make it easier.
Why Decisions Fail
One of the most comprehensive studies of decision-making in action is Paul Nutt's analysis of over four hundred decision situations.44 He identifies three common "blunders" and seven "traps." His database includes some major and well-known blunders, such as the new airport at Denver – where the decision process contrasts very significantly with the airport DE services approach described in Chapter 3 – the ill-fated Euro Disney, the public relations fiasco of Shell Oil's disposal of a North Sea oil platform and the Ford Pinto recall, or rather non-recall, when its fuel tank was shown to be prone to explosion. The study is well-grounded and highly detailed. It provides instructive principles for DE. In particular, Nutt focuses on what the executive decision model tends to downplay: the link from idea to action and execution. He comments that "the same factors are apparent in all the decision debacles": rushing to judgment, misusing resources, and applying "failure-prone" tactics – ones that were used in two out of every three of the decisions that he studied. He found that "idea-driven" processes are highly prone to failure, whereas "discovery-driven" ones are fifty percent more likely to succeed. By idea-driven, Nutt means that there is a constant danger of pre-determined mindsets, beliefs and processes resulting in limited search, as "zealots" defend and talk up a pet idea and block out "irrelevant" other ideas. We note that the analytic tradition of modeling, simulation, economics and computer science is naturally idea-driven. At its extreme, this runs the risk of force-fitting decisions into the a priori perspective of either the sponsor of the decision-making initiative or the experts brought into the process. One criticism of the tendency of OR/MS specialists to see their own area of practice as the solution before even knowing the problem was captured in the epigram "Have model, will travel." The authors of a book, Winning Decisions45, built on both their own research and their roles in Shell's famous scenario-based planning team, call the tendency to rush to pre-judgment "mis-framing." They make the comment that "The perspectives through which we view the world limit the decision-making options we can see and influence how effectively we can "communicate" and sell those options to others."
44 Idem [41].
45 J. Edward Russo and Paul J.H. Schoemaker, "Winning Decisions", 2001.
In addition to pre-judgment and mis-framing is what might be termed pre-solving. It is almost instinctive for experts in specific methods and computer tools to apply them in just about every situation. So, for example, our own bias is towards real-time visual simulation models. Some of our colleagues concentrate on optimization modeling, statistical forecasting, multi-criteria decision-making, or group facilitation. Such expertise tends to be focused and specialized. This can create a substantial problem for decision enhancement: bringing the solution before the problem is known. Idea-driven processes, mis-framing, pre-judgment and pre-solving are barriers to the effectiveness of the rational executive decision model in that they undermine it right up front. Nutt identifies how easily and often the leaders of a decision-making initiative fail to "set the stage" for the process. They exclude relevant stakeholders, provide ambiguous directions, and tend to establish the venture as a problem to be fixed rather than as a direction to explore. According to his detailed review of the literature on decision-making, including business, political science and social science studies, there is a general consensus that there are five main stages in the effective decision process as shown in Figure 4.1.
Figure 4.1: Five Stages of Effective Decision Process
Many organizational and technical tools contribute to one or more of these but neglect or even fail in others. Technology specialists rarely address Stage 5; organizational processes limit Stages 1-3; political processes often block Stage 3 and prejudge options. Nutt discusses the difference between establishing the direction for a decision-making process and viewing it as a problem to be fixed through a choice of action. By "direction" he means the target results from the process: ensuring that all relevant stakeholders and their
"claims" – a priori positions, values, priorities and objectives – are included from the start, uncovering and respecting self-interest and ethical issues, and relying on exploration and discovery rather than the typical idea-driven and often idea-imposition process he sees as widespread. His concluding comment is about the futility of persuasion. Much of information technology is used to persuade: results from models presented as expert opinion and recommendations, spreadsheets presented as truths and multimedia used to dazzle and convince. Nutt's comments here are a strong warning for Decision Enhancement: involve, not persuade. The sequence of stages that Nutt identifies as characteristic of the effective decision process is logical, rigorous and sequential: collect information, establish an agreed-on direction, make a systematic search for ideas and options, evaluate the ideas, and manage the social and political barriers. Many systems builders would argue that this is exactly what they do. The tradition of the executive as decision-maker embodied in the Carnegie School offers similarly disciplined sequences.46 Where Nutt's analysis differs from the rational model is in stressing how much has to occur before the analytic steps begin. In particular, the social process must bring in potential dissenters, not exclude them. This requirement in itself points to the difference between a system and a studio. The studio embodies both the information, search, evaluation and choice rationale – the "script" – and the scene-setting. The challenge here is to balance the intellectual and the emotive. Russo and Schoemaker, the two authors of Winning Decisions, acknowledge the many organizational barriers to the use of analytic methods and scenarios/simulations. The benefits of experience-based intuition and established group processes are easy to recognize and produce fast results. Decision-makers are often skeptical about the value of shifting how they work and think and have trouble seeing where and how more sophisticated and formal methods can help. Too often, models take on an identity of their own and become almost a competitor rather than a support to their natural modes of decision-making. Russo and Schoemaker point towards the DE services approach when they recommend presenting models and tools as "an encapsulation of the best of the decision-makers' own considerable experience" and emphasizing that the model cannot run without their inputs and may at any point in the process be overridden by them. They bring out the importance of managing conflicts of values, priorities and interests: "It is how well this is managed that determines how well decision-making groups emerge… The right kind of conflict can drive innovation."
46 Simon, H.A. "Administrative Behavior", Free Press, 4th Edition, 1997; March, J.G. "Primer on Decision Making: How Decisions Happen", Free Press, 1994.
Sowell makes exactly the same point about how knowledge is effectively mobilized in group decision-making. Whereas professionals in the analytic and technical fields tend to think of information as facts, data, objective truths and the like and seek ways of building, accessing and interpreting information, Sowell and many others in the social and political sciences focus on how to make decisions in the context of "one of the most severe constraints facing human beings in all societies and throughout history – inadequate knowledge for making all the decisions that each individual and every organization has to make." He asks, "How does an ignorant world perform intricate functions requiring enormous knowledge?" He argues in favor of processes that focus on bringing in divergent stakeholders and facilitating incremental trade-offs among alternatives. He contrasts this with the search for "solutions." "In a world where people are preoccupied with arguing about what decision should be made on a sweeping range of issues, this book argues that the most fundamental issue is not what decision to make but who is to make it – through what processes and under what incentives and constraints, and with what feedback mechanisms to correct the decision if it turns out to be wrong." While Sowell takes a very different view of the nature, purpose and context of decision-making from the other authors, he parallels many of their recommendations. Here are examples:
• Russo and Schoemaker: Take time to decide how to decide: a good initial assessment is crucial. Create new shared frames; invite dissenters into the process. Do not take sides too soon. Reduce the pressure to conform. Establish norms supporting conflict and creativity.
• Nutt: Reconcile stakeholder claims: interests, positions, and concerns. Get people to disclose their interests. Establish a direction that points to the needed course of the process, not to the nature of a solution. Exploring involves discovery: effective decision-making involves collecting information that reveals possibilities.
• Sowell: Begin with the production of knowledge – with the process by which ideas are filtered and transformed into recognized knowledge, having the force to guide decisions.
What is intriguing about the analyses and normative recommendations of all these books is how well they fit into the simple concept of DE studios once the focus on systems is shifted to a focus on the process within which the systems fit.
A Road Map for Effective Decision-making
Shown below (Figure 4.2) is the overall picture of effective decision processes that emerges from the literature and in-depth studies of individual cases and that provides a road map for DE practice. If we can balance the solution strengths of the executive management/analytic models and the interaction strengths of the scenario/stakeholder models, we can move decision support from system to process. This is quite a challenge, of course.
Figure 4.2: Roadmap of Decision-making
The analogy of a road map is useful in that it summons up images of a journey. The journey begins with the decision to decide. It ends with the arrival at a base camp that is the jumping-off point for
taking action to implement a decision. The analogy also summons up literal images: visual maps for orientation, navigation, route selection and progress-checking. A system is not a road map. A process requires a map. The final value of the journey metaphor is that it helps place systems, models and methods in their most appropriate context. They are the equipment and navigation aids. Here in summary are the road map principles for designing an effective decision process with succinct comments about their implications for effective decision support.
Orientation and getting started
Principle 1: Define the direction of the decision process journey, not the specific solution needed.
DE implications: Decision Enhancement has to be linked to the leadership processes that establish vision, direction, attention, strategy and priority. Decisions, especially decisions that matter, are not made in technical and intellectual isolation. Of all the challenges for DE, this is probably the most important and difficult. Vision and leadership shape the direction for addressing decisions that matter. The routine and recurrent ones that are more procedures than decisions are temptingly easy opportunities to apply interactive information technology tools – systems; the navigation path is well-signposted and the steps well-understood. They are the domain not of Decision Enhancement but of standard and straightforward uses of PC software, the web, CRM, ERP, workflow automation and the like. If the road map is already marked out, Decision Enhancement services are not needed. If the journey is uncertain, complex, risky, new, exploratory, and into unknown territory, then DE must focus its methods and tools on direction-setting.
Principle 2: Bring into the process all relevant stakeholders, not just the inner circle of executives and/or the expert planners and advisers. There is evidence that a key to group effectiveness in decision-making is diversity and independence of viewpoints. Groupthink is a commonplace result of the lack of these. The mis-framing and rush to judgment that we referenced earlier as prime reasons why decisions fail are mitigated by diversity. James Surowiecki makes a compelling case for "the wisdom of crowds" when this is the case.47 The studio is an invitational forum for diversity.
47 Surowiecki, J. "The Wisdom of Crowds", Doubleday, 2004.
DE implications: The studio must be designed right from the start to facilitate sharing and collaboration among diverse parties, not participation among ones that bring an "assumed consensus" to the process. Generally, it is far faster and easier to get agreement by excluding parties, and that is sometimes appropriate, but in general this has later costs and drawbacks. One of the strengths of Decision Enhancement research and practice has been to exploit tools that facilitate brainstorming, maintain privacy and anonymity where that helps people come forward instead of staying in hiding, and provide simple and fast ways for groups to propose ideas, vote, prioritize and comment on alternatives.
Principle 3: Spend time and resources on orientation and broad searching for information that provides new guidance and helps move in the right direction.
DE implications: "What if" analysis remains a core of decision support. Visual simulation tools help people explore the opportunity and information space in their own style. Decision Enhancement services cannot be based on "have model, will travel" or on "here is the tool; now let's find the solution." The studio is a space for discovery, thinking things up and experimentation.
Principle 4: Encourage open argument and explore differences of priority, interests, values and interpretations of information.
DE implications: The DE studio is a social space as well as an intellectual one. Models and systems are not enough to ensure that this is so.
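As a minimal, hypothetical illustration of the "what if" arithmetic that Principle 3 points to – the pricing question raised earlier of cutting price by 5% and adding $1.2 million of advertising – the following Python sketch shows the kind of marginal-economics check that a DE suite makes cheap to repeat. All figures and function names here are our own illustrative assumptions, not data or tools from any studio.

# Hypothetical "what if" check: a 5% price cut plus $1.2 million of extra
# advertising. All numbers are illustrative assumptions, not data from the text.

def breakeven_volume(price, unit_cost, base_units, price_cut, extra_spend):
    # Units needed so that the price cut and the extra spend leave the
    # product's contribution to profit unchanged.
    base_contribution = (price - unit_cost) * base_units
    new_margin = price * (1 - price_cut) - unit_cost
    return (base_contribution + extra_spend) / new_margin

required = breakeven_volume(price=100.0, unit_cost=60.0, base_units=200_000,
                            price_cut=0.05, extra_spend=1_200_000)
print(f"Required volume: {required:,.0f} units "
      f"({required / 200_000 - 1:.1%} increase just to break even)")

Changing any of the assumptions and re-running the check is exactly the kind of exploration that the spreadsheet made routine and that DE studios aim to make collaborative and visual.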
Getting moving
Principle 5: Shift from the orientation mode of exploration and individual perspectives to the discipline of analysis, evaluation and choice. The effective decision process begins with diversity of viewpoints and independence of spirit – i.e., participants are not afraid to speak up because their boss might be upset, or too timid to offer an unusual idea. This is the divergence stage. One of the key skills in facilitating group work is to ensure divergence first and then to manage the shift to convergence.
DE implications: Use the very best analytic methods, models and computer tools. Recognize that people prefer, enjoy and cooperate in the open-ended brainstorming and interactions of a studio built around orientation and scenarios and often, perhaps even
generally, resist moving on to the tough work of analysis and evaluation described in the quotation earlier that “Computer models do not make the job of decision-making any easier. They make it harder. They enforce rigorous thinking and expose fallacies in mental models we have always been proud of.” Ensure both usefulness and usability of DE simulations, information tools and collaboration aids. In particular, exploit the opportunities of new generation multimedia software development and interfaces.
Getting to the end point
Principle 6: Build every element of the decision-making process to maximize the likelihood of getting a commitment from all the stakeholders involved.
DE implications: Recognize that usability dominates usefulness in the decision closure stages and use dominates usability. If there is any lack of appropriate involvement, no shared understanding, unresolved conflicts, no confidence in the process and tools, or incomplete orientation, evaluation and willingness to commit, then there will be no real decision; the result of the process will be a strategy, consensual agreement, vision, forecast, scenario or recommendation, but it won't be a decision.
Conclusion
Decision Enhancement as a field of research and practice is only as strong as its understanding of decision-making from both a descriptive viewpoint – this is how people actually make decisions – and a normative view – here's a better way for them to make decisions. It has to be stated fairly bluntly here that the analytic and technical fields too often have highly limited, rigid and often questionable normative views and are indifferent, incurious and naïve in their understanding of the descriptive perspective. Equally bluntly, we need to state our view that the fields that are strong in their understanding of how decisions get made are equally often indifferent, incurious and naïve about the opportunity and value of technology and analytic tools. An old adage about the difference between managers and planners is relevant here. It states that planners are people who think before they act, if they ever do act, and managers are people who act before they think, if they ever do think. Decision Enhancement is the discipline of combining thought to help action and action to guide thought.
Our survey of what we really know about decisions and decision-making is of necessity cursory. We suggest that, in the context of decisions that matter, DES simply must be built on a sound understanding of both the descriptive and normative perspectives on decision processes. We add as an appendix to this chapter two more case vignettes, one that shows the strength of the analytic approach to decision-making and another that describes the fusion of the facilitative and analytic in a studio.
Vignette 4: Fact-based decision-making
Baseball may not seem to be a topic relevant to business decision-making and Decision Enhancement, but a 2003 best-seller, Moneyball: The Art of Winning an Unfair Game48, throws immense light on the extent to which human judgment, experience and culture can impede, not enhance, effective decisions. This is a theme that runs through many articles and books about the limits of human rationality. This book is well worth reading. It has also changed the management of baseball. Moneyball tells the story of how a new general manager for the Oakland Athletics used statistics to move his team to a continually strong position in the league while being constrained to having one of the lowest payrolls in baseball by lack of funds, revenues and market size. In 1999, the team ranked eleventh out of fourteen in payroll and fifth in wins. In 2000 and 2001, it moved up to second place in wins and twelfth in payroll. In 2002, it stayed twelfth in payroll but led both leagues in wins. In 2003, his team won its division and competed in the playoffs against two teams that had each spent a fortune to build their staff – more than twice the As' total payroll. Baseball is obviously a talent-based business and stars are eagerly sought out and rewarded with massive deals; the biggest spender, the New York Yankees, took on one player's contract of over $100 million. The As could not afford these. Billy Beane, the hero of the book, achieved the As' success by ignoring and even defying just about every element of conventional wisdom about rating players, team strategy, drafting new players and contracting with free agents. The draft gives teams the right to select high school and college players in reverse order of the teams' ranking in wins the previous season. Free agents auction their services when their contracts with their current teams expire. Teams trade players between each other. Even at the high school level, "phenoms" are in the national headlines and every team has a cadre of scouts – usually ex-players – traveling constantly to locate and track hidden talent. But they largely get it wrong and Billy Beane gets it right. He relies on statistics to show scientifically the likely contribution a player will make to team performance. In many instances, the facts dispel long-held beliefs. Many of those beliefs are subjective; Beane dismisses them and insists on looking at
48 Michael Lewis, "Moneyball: The Art of Winning an Unfair Game", W.W. Norton & Company, 2003.
the facts. He found that players drafted out of high school, however outstanding their record, rarely performed as well as ones four years older drafted from university. So, he never picks them. Many of the figures that are tracked on a daily basis about individual player performance turn out to be misleading; the star with the highest batting average, for instance, or the ones that drive in most runs may well be hurting, not helping, their team. (Read the book to find out why.) Beane was heavily influenced in his innovations by Bill James, a maverick whom baseball management had dismissed for years, but whose "sabermetrics" – his term for the massive data he collected and interpreted – had made him a cult figure among baseball enthusiasts. "But baseball's executives treated James's work as irrelevant. He had no effect on what they did." Beane built on James's work and also recruited a young economist with no experience in baseball and who basically lived on his computer. This young man was consistently able to predict players' performance without ever seeing them in action. As a result, Beane was able to draft players that other teams rated low in talent, make trades where he got bargains, and spot young talent that other teams were ready to discard, and do so again and again and again. "The verdict? Statistical measures outperform experts. It's not even close… Why do professional baseball executives, many of whom have spent their lives in the game, make so many mistakes? They are paid well, and they are specialists. They have every incentive to evaluate talent correctly. So why do they blunder?" In an intriguing passage, Lewis offers three clues: first, those who played the game seem to overgeneralize from personal experience ("People always thought their own experience was typical when it wasn't"); second, the professionals were unduly affected by how a player had performed most recently, even though recent performance is not always a good guide; third, people were biased by what they saw, or thought they saw, with their own eyes. The New Republic book review of Moneyball makes a direct connection here to business decision-making.49 It cites the work of two cognitive psychologists that shows how people repeatedly make errors in statistical reasoning. They show predictable biases, rely on simple rules of thumb, habit, cultural context and what other experts seem to believe. "Even when the stakes are high, rational behavior does not always emerge."
49 Richard Thaler and Cass Sunstein, "Who's on First", The New Republic, September 1, 2003, pp. 27-30.
The book is not a paean to statistics. Indeed, much of Bill James's and the As' management work attacks "bad" statistics, such as the "save", where a closing pitcher gets credit – and often a big new contract the next year – for coming in towards the end of a game where his team is ahead and preserving the lead. Not only is this a dumb statistic but it has greatly influenced how teams have used their pitching staff. James demolishes standard practice and points to far more effective decision choices. Beane also exploited this mis-statistic to artificially boost the apparent performance of a "closer" and then sell him off. Much of Moneyball addresses the difficulties that Beane and his CEO faced in pushing organizational change down through the ranks. It also emphasizes the decision edge the team has created. Here is a marketplace where the data used to evaluate players is about as plentiful, detailed and objective as in any field. The labor market should be among the most efficient in the world. It isn't, and Beane has exploited this "asymmetry" not of information but of the use of information to create a sustained competitive edge. His statistical toolkit is a DE service in spirit and impact, if not in specific labeling. One of the disciples of the fact-based approach to baseball was the 28-year-old Theo Epstein, who became the youngest general manager of a professional team in the league's 150-year history. The team is the Boston Red Sox, which after 86 years of never winning the championship finally ended the "curse" on it attributed to selling the greatest hitter of all time to the Red Sox's rival – which the team's owner described as "The Evil Empire" – in 1919. It is generally agreed that Theo made all the right decisions on personnel – because he had the facts to guide him and the guts to make the commitments.
Vignette 5: A group studio
Group Support Systems were one of the major initiatives in the 1980s and early 1990s to exploit technology for collaboration. GSS – the later term for GDSS – "emerged as a rich, flexible toolbox to facilitators to help move a group towards a goal." (In passing, we point to the dropping of the "D", a signal of what we see as a limitation in the impact of group support – the loss of a decision focus.) "Research shows that groups using GSS can be far more productive than teams using other means to accomplish their tasks. However, experience in the field suggests that organizations do not tend to become self-sustaining with GSS until they incorporate the technology into their daily work practices in support of mission critical tasks that are conducted over and over again by the practitioners themselves, rather than under the guidance of an outside facilitator. This suggests a new, perhaps higher role for GSS facilitators: to create and leave behind well-crafted, well-tested repeatable processes for others to execute on their own."50 WinWin embodies such an approach. It was designed as a GSS service, a set of principles, practices and tools, to enable a set of interdependent stakeholders to come to a mutually satisfactory win-win set of shared commitments. WinWin means that people do not get all that they want but can be reasonably assured they will get what was agreed to; it rests on effective negotiations that lead to commitments. Win-lose does not work in this context, though it marks many negotiation processes. If the solution is a quick, cheap and sloppy product, the developer and ARPA customer "win" but the end-user loses. Adding bells and whistles is a win only for the customer and driving too hard a bargain hurts the developer. In the end, every party loses. Sloppy products hurt the developer's reputation and lead to rework which adds to the customer's cost and the user's frustration. Bells and whistles lead to complexity, missed deadlines and budget-exhausting cost overruns. Low-ball bids translate to poor products.
50 Gert-Jan de Vreede and Robert Briggs, "Group Support Systems Patterns: ThinkLets and Methodologies", Proceedings of the 36th Hawaii International Conference on System Sciences, 2003.
The WinWin requirements approach was used in the U.S. Defense Department’s $100 million STARS program.51 This handled negotiations and decisions for software systems requirements. It depended heavily on monthly meetings among many stakeholders: three prime contractors and their three commercial counterparts, the user representatives from the Navy, Army and Air Force, DARPA customers, and several research and support contractors. “Each meeting would end up with an agreement that felt like three steps forward. However, by the next meeting, we would take two steps back, as the distributed stakeholders independently “re-interpreted” the agreements.” As a result, it took six months to achieve a shared vision documented by the prime contractors’ success plans. Analysis suggested that this could have been cut to 1-2 months. The original WinWin methodology was manual and facilitative. The guiding principle was that stakeholders express their goals as win conditions. If all parties concur, the win condition becomes an agreement. When they disagree with each other, they register their conflicts as issues. Stakeholders then invent options for mutual gain and explore their trade-offs. This is an iterative process that ends when all stakeholders concur. A domain taxonomy – basically a map – is created together with a glossary of terms to organize the discussion and ensure shared understanding. A new Groupware system evolved over four “generations.” The first was a typical prototype, which “was useful enough for demonstrations and for an initial experiment.” It involved role-playing negotiations about the requirements for a better future WinWin. Generation 2 was a software system that systematized the negotiations and tracked progress and discussion. It was used “experimentally” but not by many of the stakeholders. It is described as strong in vision, but not-so-strong in technical architecture. “We had underestimated how much detailed software engineering was needed to get from a shared groupware vision to a group support system.” This lesson led to the third generation system being built on a “muscle-bound” architecture. It included a formal negotiation model, uniform “look and feel” graphical user interface, and firm rules for using the negotiation model. It provided facilities for voting and for attaching documents and analytic tools such as spreadsheets, plus aids to “big-picture” visualization and navigation. It was less than a major success: too inflexible to adapt to different negotiation situations, not robust and overly strict in driving the process.
51 Barry Boehm, Paul Grunbacher and Robert Briggs, Lessons Learned from Four Generations of Groupware for Requirements Negotiation, IEEE Software, May/June 2001, pp. 46-55.
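The vocabulary of this cycle – win conditions, issues, options and agreements – can be sketched as a small data model. The following is our own minimal illustration in Python, not the WinWin or EasyWinWin software itself; the class names and the invent_options hook are assumptions made for the example.

from dataclasses import dataclass, field
from typing import Callable, List, Set

@dataclass
class WinCondition:
    stakeholder: str                     # who states the goal
    statement: str                       # the goal, in the stakeholder's own words
    concurrences: Set[str] = field(default_factory=set)

@dataclass
class Issue:
    win_condition: WinCondition
    dissenters: List[str]
    options: List[str] = field(default_factory=list)   # candidate options for mutual gain

def winwin_round(conditions: List[WinCondition],
                 stakeholders: List[str],
                 invent_options: Callable[[Issue], List[str]]):
    """One iteration: concurrence turns a win condition into an agreement, conflict into an issue."""
    agreements, issues = [], []
    for wc in conditions:
        dissenters = [s for s in stakeholders
                      if s != wc.stakeholder and s not in wc.concurrences]
        if not dissenters:
            agreements.append(wc)        # all parties concur
        else:
            issue = Issue(wc, dissenters)
            issue.options = invent_options(issue)   # stakeholders explore trade-offs
            issues.append(issue)
    # the cycle repeats with revised win conditions until no issues remain
    return agreements, issues

A facilitator-supplied invent_options function stands in here for the studio discussion; in the actual process that step is a structured, largely anonymous group activity rather than a function call.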
Finally, a fourth generation was built on an established commercial product that balances structure and flexibility, and on lessons learnt from a wide range of groupware experiences. It is highly process-centered, aiming at moving stakeholders through five basic “patterns” of thinking:

- Diverge: Move from having few ideas to more ideas. Move from having ideas expressed in detail to their being expressed as fewer, more general concepts. The facilitator encourages exploration of the terms and their implications. In this first stage of the process, stakeholders jointly refine and customize an outline of negotiation topics, adding comments as they go.

- Converge: Move from many ideas to focusing on just the few that are worthy of further attention. Move from having ideas expressed in less detail to more detail. Stakeholders propose and agree on initial definitions of terms to be used in the negotiations. The facilitator reviews the comments together with the group and modifies the outline itself. That moves the group back into divergent brainstorming and anonymous discussion. The group then converges on key win conditions.

- Organize: Move from less understanding to more understanding of the relationships among ideas. The next step is a structured discussion and categorization of win conditions into negotiation topics.

- Evaluate: Move from less to more understanding of the value of concepts for accomplishing the task at hand. The group rates win conditions using two criteria: business importance for each stakeholder party and ease of realization, including perceived technical and economic constraints.

- Build consensus: Move from less to more understanding of the diverse interests of the group, and from less to more agreement about possible courses of action. The group surfaces and discusses issues and analyzes a “prioritization poll” to reveal conflicts, constraints and differences in perception. The end point is a set of negotiated agreements.

EasyWinWin, the resulting fourth-generation studio, had been used in about thirty real-world projects as of 2002. The developers, who have about as long and strong a track record as researchers, facilitators and
systems builders as any in the world, present three main types of lesson from the evolution of EasyWinWin: methodology, groupware and project lessons.

The first methodology lesson is that it is imperative to define a repeatable negotiation process. The first three generations of studio did not provide a strategy for how to carry out a concrete negotiation. “We found that a detailed process guide reduces variance in the quality of deliverables and helps lower skilled or less experienced facilitators to accomplish more than would be possible with straight stand-up facilitation.” Each step in the process needs to involve stakeholders in assimilating each others’ views in progressive steps. The challenge for the facilitator is to balance the rhythms of divergence and convergence, organization and evaluation in moving to agreement and commitment.

The second methodology lesson was that the process and tools must emphasize group dynamics. “The first three generations of WinWin environments emphasized modeling constraints over group dynamics and collaboration support.” This blocked the emergence of “desired patterns” of group interaction. An example here is the role of anonymity in making contributions and voting on ballots. “People with power differentials can lay it on the line without threat to jobs, status, relationships, and political position. Increased openness also helps to get to root issues in a hurry. People up and down the hierarchy can find out what is really going on, thus avoiding the Abilene Paradox (people agree to an unattractive option because they erroneously believe it will make the option-proposer happy).”52

The groupware lessons were that tools must be unobtrusive and flexible; the first three generations of WinWin were “systems” that were increasingly strict about enforcing modeling conventions at the expense of group dynamics. The software got in the way of the human interactions that are critical to negotiation. Ease of use, robust infrastructure and interoperability between systems and information resources were key issues, as they always have been and always will be in human-computer interaction.

The final conclusion from the lessons is that “Developers of group support systems should not expect to get them right the first time – or even the second time…. Even now, we are involved in a fifth iteration to provide better support to less-experienced facilitators. However, the payoffs are worth it; we have experienced about a factor of 4 improvement in multi-stakeholder requirements negotiation time when going from manual negotiation to the 2G-3G WinWin system; and another factor of 5 in going from 2G-3G WinWin to the 4G EasyWinWin system.” That is very impressive.
52 This paradox is the subject of a best-selling management book by Jerry Harvey, The Abilene Paradox and Other Thoughts on Management, Jossey-Bass, 1996.
Chapter 5
Visual Thinking: If You Can’t See It, You Can’t Get It

Overview: Visual Technology is the Means, Not the End

Our purpose in this chapter is very much the same as for Chapter 4: to step back and ask and answer a basic question. In this instance it is: What do we really know about the link between visual thinking and visual tools in effective decision-making?

The question is a key one for the effective design of interactive DE suites, for the obvious reason that if the first wave of personal computers was the era of numbers, and that of the Internet the decade of images – web pages that look like the pages in a magazine – then the rapidly emerging future is the visual age: any and every type of multimedia. The question is also a key one for Decision Enhancement for a not so obvious reason: the evidence is strong that visualization is at the very core of human thinking. This means that it can be made the core of effective decision processes. In particular, it is the base for rehearsing futures through powerful simulations, shared understanding and dynamic representations of complex phenomena.

We begin our analysis with a review of visual technology from a Decision Enhancement perspective, present some long-standing findings about the psychology of visualization, and then provide some principles for DE services design. We end the chapter with two more case vignettes: one that meshes dynamic visual simulation modeling of the flow of automated ground vehicles at the Dutch Flower Auction – the conceptual picture – with physical operations – the actual picture – and another that reviews trends in the application of visual tools to handling masses of data to make it meaningful in decision situations.

A distinction we make in our evaluation is between apprehension – getting it in the sense of, literally, “Ah yes, now I see what this means” – and comprehension – getting it in the sense of “I really understand this
now.” Very, very roughly, visualization enhances apprehension and numeric data enhances comprehension. One should not come at the expense of the other. The challenge and opportunity for DE services is to offer suites that permit a constant movement between the two.
Visual Technology

The technology base for visual DE services is moving at an ever-accelerating pace. The most striking innovations in information technology of the last decade have been in multimedia, enabled through the combination of massive computing power literally at the fingertips, dramatically improved screen displays, and the ingenuities and efficiencies of consumer electronics design and manufacturing. Innovation cycle times in new multimedia products have been cut to months, as anyone who has recently bought an MP3 music device, digital camera or video game knows and appreciates.

Perhaps the most far-reaching of all developments has been the transformation of films, where the distinction between fantasy and realism has disappeared in Lord of the Rings, Harry Potter, the Matrix series, House of Flying Daggers and The Incredibles. This transformation is built on exactly the same technology base as the core of DE suites, albeit with far larger budgets. “In one week, we can produce the number of sequences for a movie that typically used to take us a year and with the same or better quality….. Two factors…. are contributing to the new ease and lower costs of screen animation: first, commodity hardware of unprecedented power and second, the Linux operating system – the open source software that customers increasingly favor over proprietary systems.”53 This quotation is as applicable to DE as to movies.

At a more everyday and less expensive level, the Internet – which is a core element of DE suites – is now very much a multimedia machine struggling to burst out of the bounds of relatively slow telecommunications links – networked multimedia demands far more bandwidth than is available for most consumers and small businesses. This is why the main impacts of multimedia have been on localized devices, such as game machines. However, every improvement in the speed of Internet access links opens up new multimedia opportunities that are quickly filled. The Napster phenomenon, whereby until the music recording industry struck back and imposed legal restrictions, songs were downloaded for free, showed
53 Alan Cane, Lights. Action. Animate, Financial Times IT Review, July 7, 2004, page 3.
the scale of demand. Highly sophisticated MMOGs (massive multiplayer online role-playing games) have added networked interactivity among people to the standalone game machine or personal computer interaction with a single player.

This progress is illustrated in the evolution of the beautiful, mysterious (and nonviolent) game Myst, which was launched in 1994 and “changed the way people think about games” (Wired, June 2003).54 The game design company exploited the increased speed and power of standalone PCs to add realtime capabilities in realMyst. In 2003, it announced Uru: Ages Beyond Myst: “The 3-D game taps into the Internet’s virtual community…… Players can venture in alone or meet up with friends online to solve puzzles, play chess, or go on scavenger hunts, communicating in realtime via instant messaging via voice over IP.” (IP stands for Internet Protocol, the core of Web-based technology. VoIP is IP-based phone calls.)

The quotes from Wired could be tailored to the technology side of Decision Enhancement: from PC to network, from the single or face-to-face decision-making group to networks of actors, and from monochrome numeric displays to 3-D, real-time and all the rest of the best of dynamic visualization tools. In a sense, Myst is a decision-making tool in its game context and, in another sense, decision-making studios are a gaming environment; Defense Department war gaming is Myst with consequences. Both rehearse a future, whether a fantasy one or a military one; the tools can just as easily rehearse a business one. Conceptually, there is no difference between designing and using SimCity and designing the Hartsmartin airport of the future discussed in Chapter 3. Multimedia is the enabler of simulation and interaction in both instances.

Ergo, DE services are multimedia in nature; while, for instance, many suites will include spreadsheets, probabilistic and statistical analyses, and extractions from databases, the interface between the user and such resources will – must – be multimedia, and many of the displays of results will be multimedia in nature; animation is the obvious example. The main value of numbers is to be able to “drill down” from a broad and synthesizing visual display to the details that underlie them.

But there is a huge catch in applying multimedia to Decision Enhancement, and DE services cannot be directly equated with multimedia. Decision-making is paced by thinking – information-gathering, analysis, evaluation – and talking – brainstorming, questioning, communicating, collaborating. Displays in and of themselves do not necessarily help thinking. The multimedia of Myst or SimCity subtly and substantially constrains thinking – that is what
54 Suzanne Ashe, Exploring Myst’s Brave New World, Wired, June 2003, page 61.
“games” are all about. They take over thinking in the sense that the powerful visualizations quite literally create a world of their own. Display then dominates; visual impact becomes the goal. As Edward Tufte, one of the leading authorities on the visual display of quantitative data, constantly states, most designers of tools that rely on visual displays, including Web pages, ignore content and concentrate on presentation – usability versus usefulness. “Marketing people insisting on including the oversized, animated corporate logo, Web designers packing in as many tables and figures as possible, programmers ignoring everyone else and implementing their own obscure interface.” Most Web pages, according to Tufte, reflect this political power struggle, with the content relegated to a tiny percentage of the screen.55

The need for effective DES studios is not to produce Web sites or make simulation models flashy but to provide what Ben Shneiderman, one of the best-known experts in user-system interface design, ascribes to Leonardo da Vinci: “He integrated intuitive thinking with careful analysis.”56

So, Decision Enhancement faces a very complex challenge. It must move towards exploiting visualization tools and the use of multimedia but must not do so by losing the focus on “content.” Content rests on understanding the thinking part of decision processes.
Visual Thinking, not just Visual Displays

The rest of this chapter of Decision Enhancement Services reviews what we know about effective visual thinking. It is an “academic” chapter in that it mainly focuses on research and practice that are independent of technology and its applications but that define criteria for the design of useful and usable tools for visualization. There is absolutely no question that all DE suites should be built on such multimedia resources, which can make models and information resources far more usable and increase the likelihood of them being used by a huge factor over the previous generation of systems that mostly displayed monochrome graphs, numbers, and text. We do not need to discuss these tools in detail in Decision Enhancement Services, since there is a wealth of experience and expertise available in every area of multimedia, but it is essential to ensure that DE is firmly grounded in proven findings from psychology and the arts about the relationship between visualization and effective understanding, learning and thinking.
55 Eugene Eric Kim, Tufte on Visualizing Information, originally appearing as an “Op Ed” feature on the Doctor Dobbs magazine Web site, August 1997.
56 Patrick Dunn, Leonardo’s Laptop, Frontwheeldrive.com, 2002 (no specific date provided).
The key question to answer first is “What is the evidence that tools built for visualization improve the effectiveness of decision-making?” The fact that a Decision Enhancement suite uses multimedia – animation, 3-D, color graphics, video and audio – instead of the more widespread numeric displays and static images says little about its value. Indeed, many commentators stress the limitations of multimedia displays that are distracting, misleading, overwhelming and more cute than informative. The screen becomes a kaleidoscope instead of a decision aid. A succinct summary captures this; we add our own emphasis to the quotation: “Visual decision support tools communicate data on the basis of visual perception and human interaction. No matter how rich the database and how ingenious the technology under the surface, poor visualization can confuse the viewer, result in false conclusions, and ultimately lead to erroneous decisions.”57 The authors move on to make the positive point that information graphics “unleash” the brain’s visual analytic capabilities.

There is scattered but consistent evidence that when these capabilities are appropriately unleashed, multimedia-based interactive tools do have a significant impact on aspects of decision-making that are highly relevant to DES studios: learning and understanding. A review of several hundred case studies by one of the authors of Decision Enhancement Services found that multimedia halves learning time and increases retention by twenty to forty percent.58 Below are some examples; while they are not all directly related to decision-making processes, they are all highly relevant to them, in that so many aspects of effective decision-making require learning, experimentation, building of confidence, and sharing of understanding.

Arie de Geus, the head of Shell Oil Company’s legendary scenario planning studios, emphasizes this: “Learning assumes not knowing the answer, which makes people feel vulnerable – that they are taking a risk. Trust within the group is the foundation of inventive teamwork……. Maximize intellectual output by establishing a learning process. The best learning takes play. Play has been defined as experimenting with a representation of reality that you have created yourself – rather than experimenting with reality itself. A team at play results in a satisfactory end result in measurably about one half of the time that it would take in a reality-based situation.”59
57 Mandana Samii, Making the Truth Visible – Designing Graphics for Visual Reporting Tools, www.sapdesignguidl.org, August 6, 2003.
58 Keen, P.G.W., Business Multimedia Explained.
59 Stuart Silverstone, Visual Thinking with Arie de Geus, Knowledge Management (destinationkm.com), September 1, 1999.
That claim is well-supported by case evidence of the effectiveness of multimedia:60

- Design decision-making: Caterpillar uses virtual reality for workers to operate the controls of a simulated forkloader, cutting the time to build a physical prototype from anywhere between six months and a year down to a week. Here, the simulation enables decision-making teams to get actively involved in role-playing – a “gaming” situation.

- Trouble-shooting decision-making: companies such as AT&T, Xerox and Nortel report that multimedia tools help technicians walk through problem diagnosis. These provide them with schematic diagrams, scripts, photos, and video and audio clips on what to do and when. Alyeska, an oil pipeline company, similarly uses a multimedia system to help staff deal with emergency fire and leakage situations, guiding them in deciding which valves to open and close and which alarms to set off.

- Court case decisions: “Forensic animation” is a well-established tool in more and more court cases, to the degree that it has become fairly contentious because of its effectiveness. It puts jurors almost literally in the same picture but at a risk: “Full-motion video simulations confer pseudomemories on jurors, giving them the feeling that they witnessed the event.”

- Medical decisions: A multimedia tool for patients and their families to evaluate alternative prostate cancer treatments reduced the number choosing surgery by 30 percent. Follow-on studies of comparable decision tools for women considering chemotherapy for breast cancer, carried out in the U.S., Canada and the UK, concluded that adding this suite to the doctor’s consultation and for hands-on use by patients increased their satisfaction with the decision-making process very significantly, as measured by confidence levels, reduced “decisional conflict” and anxiety scores, and reported higher levels of involvement in the process.61
60 The data here is drawn from Peter G.W. Keen, Business Multimedia Explained, Harvard Business School Press, 1997, passim.
61 Elizabeth Murray et al, Randomized Controlled Trial of an Interactive Multimedia Decision Aid on Benign Prostatic Hypertrophy in Primary Care, British Medical Journal, September 1, 2001, page 493.
These examples show the substantive impacts of interactive multimedia in decision-related or decision-relevant contexts. The DE services challenge in building studios and suites is to exploit the vivid and emotive simulations of futures that multimedia enable by linking them to decision processes.
Apprehension versus Comprehension

There are currently two extremes of approach to the design of interactive decision tools that reflect the distinction between usefulness and usability. Decision Enhancement needs them both. The usability tradition sees multimedia as a key resource for supporting what we term apprehension. The usefulness tradition focuses on comprehension. The difference is apparent in television news versus a published report on the same topic.

TV news sets the standard that DE service designers should aim for in the suites that they provide to studios. Consider how much information is communicated, and how simply, by, say, CNN or the BBC. Their studios use background graphics to support what the newsreader is describing, animation to communicate trends and dynamics, and video and audio. This means that they can get across in typically two minutes a wealth of information and grab attention and interest. The problem is that this information is too often not really processed by the listener. Consider a TV documentary. It is not at all infrequent for a viewer to enthuse about it and tell friends how interesting it was. Ask him or her what it was about and typically the viewer will remember just a few broad ideas, some specific examples, and that will be it. The very power of multimedia in helping people apprehend can be a blockage to comprehension. It can lead to a widely reported overconfidence among decision-makers who feel that they have a better handle on the situation and better understanding than they really do. This finding of the “illusion of control” applies to decision support systems, expert systems, information access systems, spreadsheets and simulation models.62

Contrast this with a report on the same topic as the TV program, say, on the Hartsman airport expansion that we discussed in Chapter 3. Assume that the report has been produced for a hotel chain that needs to decide whether to extend, relocate or reduce its presence at the airport. Obviously, the report will include many figures, detailed analyses, and graphics. It will be designed for comprehension, with
62 For a summary of the literature on decision confidence and its correlation with decision quality (mis)calibration, see George Kasper, A Theory of Decision Support System Design for User Calibration, Information Systems Research Journal, Volume 7, No. 2, June 1996, pp. 215-232.
appendices, footnotes, tables and the like. It will not provide the same big-picture viewpoint that the TV program offers, just as TV will limit the level of detail and volume of information the report contains. It will, though, provide plenty of “content” as contrasted to the “context” focus of multimedia design.

Apprehension and comprehension need not be in conflict, though in practice most decision tools focus on one or the other extreme. One of the principles for DE suite design is to provide for multiple levels of information presentation and interaction. In the Hartsman simulation, for instance, seeing it and getting it are very different for policy makers versus operations managers, or for airline financial planners versus financial investors. The early studios built by the TU Delft teams for airport planning lacked the layering of displays needed to meet the preferences of these diverse parties. In general, operations managers and financial planners found animations very helpful in orienting the discussions and in making sure all the parties had the same understanding. But they wanted to move on to the “numbers.” They needed to be confident that the animations were based on accurate, reliable and precise data. The simulation modeling tools available to the suite builders all operated at one level of detail; to add higher-level visual displays or lower-level spreadsheet and database access capabilities meant building new software features, at a cost of time, money and loss of interaction between decision makers, suite builders and studio facilitators.

Most of the problems in meshing the usable and the useful, apprehension and comprehension, and visual and numeric reflect the Madison Avenue school of design that grew up around the Web; the culture of interactive multimedia visual design is almost orthogonal to that of decision technologies. This is not a flip comment or a disdainful one. The original Internet was mainly used by technical and analytic professionals, most obviously engineers, scientific researchers and computer scientists. The visualizations they used were highly specialized and targeted to providing them with the maximum information, not the easiest displays to apprehend. Seismic data in oil exploration and X-rays are examples. The now somewhat archaic term “The Information Superhighway” was highly applicable to this first era of the Internet. The Web was built on its information movement capabilities – the Hypertext Transfer Protocol that remains the prefix before the www of a website address: “http:”. In other words, the Web is built on top of HTTP, which is a service within the Internet.

The rapid growth of Web sites and traffic made the information metaphor less and less meaningful. The battle was for “eyeballs” and “hits.” Web sites had to grab attention in just a few seconds; consumers
would move on to another if there were any delays. The use of multimedia on the Web was at first very limited because of the low speed of dial-up connections and the limited capacity and power of PCs. The Madison Avenue analogy highlights the degree to which Web site designers used every trick of the development trade to pack as much visual impact as they could onto the display. Visual impact is not at all the same as visual thinking. The Web has had relatively little impact to date on DE or on the longer-established DSS field. Computer science, management science, statistical methods and expert systems have had an equally limited impact on web usage in decision processes. For the visual simulations that are the core of DES to be made useful and usable, and to be actually used, we need a new generation of designers.

The distinction between visual display and visual thinking can be illustrated by a “thematic map” created by a French engineer, Charles Joseph Minard, in 1861. It is probably the most famous statistical graphic ever produced. It shows the sequence of destruction of Napoleon’s army in his invasion of Russia in 1812; he started out with an army of 422,000; 100,000 reached Moscow and only 10,000 returned to Poland. Minard’s schematic was rightly described by one commentator as seeming to defy the pen of the historian by its “brutal eloquence.”63 Here is Edward Tufte’s summary: “Minard’s graphic tells a rich, coherent story with its multivariate data, far more enlightening than just a single number bouncing along over time. Six variables are plotted: the size of the army, its location on a two-dimensional surface, direction of the army’s movement, and temperatures on various dates during the retreat from Moscow. It may well be the best statistical graphic ever drawn.”

It is shown below, with no comment or explanation on our part. Even though it is in French, in black and white, and literally sketchy, it is self-explanatory and aids both apprehension and comprehension. More relevant to visual thinking is that it is not in any way a visual rendering of reality – it does not “copy” anything. It is not a “simulation” in the way in which, say, an animation of a traffic accident used in a court case is. It is designed to help perceive what happened in those terrible months, give life to the numbers, and provide a frame of reference to the time and geography. See it, then think about it. Minard’s graphic gives the viewer so much to see and think about. So must DE. Tufte uses Minard to illustrate two general principles: “Good design has two
63 This discussion and references are taken from Edward Tufte, The Visual Display of Quantitative Information, Graphics Press, 1983, pp. 40-41.
key elements: graphical excellence is often found in simplicity of design and complexity of data.”64
Figure 5.1: Minard’s graphic of Napoleon’s March to Moscow65
The Psychology of Visual Thinking

Useful starting points for discussing visual thinking are provided by two very influential authors whose work excludes technology: Rudolf Arnheim, the author of Visual Thinking, 1969, a professor of art at Harvard and a psychologist, and Edward Tufte, whose The Visual Display of Quantitative Information, 1983, remains the classic on the subject. Tufte’s work is highly respected by many designers of software systems, but as with Arnheim his focus is on visualization, not visualization technology. For both authors, the message is clear. Visual thinking is not a matter of visual images: visualization is the outcome of appropriate images. Appropriate images facilitate categorization, comparison, conceptualization, abstraction and insight into the dynamics of the decision domain. This means that the choice of representations is a key element of DE services design.
64 Edward R. Tufte, The Visual Display of Quantitative Information, Graphics Press, 2001, page 177.
65 Source: public domain image, dating from 1861.
That importance can be illustrated in a business context through two very different conceptions of business value: Michael Porter’s well-known value chain, probably the most dominant modern influence on business strategy, and the more recent emphasis on value networks. The value chain is shown below (Figure 5.2); it is not a “picture” but the embodiment of a way of thinking about business. It is linear, with the company driving the sequence from inbound logistics to marketing and sales and after-sale logistics; note that the customer does not drive marketing and sales in the reverse direction, nor do they drive operations. This surely stands in contrast with the conceptualization of business as increasingly customer-centric that underlies most thinking about electronic commerce. The display is uni-dimensional and very tidy. Basically, the value chain is a command and control model of industry-driven competition.
Figure 5.2: The Value Chain66
Contrast the value chain with the diagram below, taken from an issue of Scientific American in mid-2003.
66 Michael Porter, Competitive Advantage: Creating and Sustaining Superior Performance, New York: Free Press, 1985.
Figure 5.3: Web Traffic Network67
This diagram captures the traffic flowing between a hundred thousand Web sites, a number that is just a tiny fraction of a total of who knows how many hundreds of millions. The larger the “starburst” of a site, the more it is a “power hub.” The article that presents this diagram discusses “scale-free” networks, ones like the Internet – and such companies as Yahoo – that, unlike such “hierarchical” networks as the passenger airport and telephone systems, can grow without constraints of structure. Value networks emerge out of the interactions of the customers who initiate the communications, the many websites that link to and from each other, and the power hubs that become the branded “portals” through which more and more traffic flows.

One of the findings in the article, which is highly theoretical and mathematical, is that the dynamics of scale-free networks lead to the rich getting richer. Power hubs are created by the increase in the flow of traffic that moves through them, reducing the role of minor players to spokes; this process generates a power law, whereby more and more
67 Image credit: Internet Mapping Project of Lumeta Corporation, February 6, 2003 (available under the terms of the GNU Free Documentation License).
connections move via the hubs at the expense of the spokes. The value chain model cannot capture this dynamic that is at the very core of electronic commerce.68

The question for Decision Enhancement is not which representation – value chain or value network – is more correct, but what perceptions it invokes and what visual intelligence it facilitates and encourages. The value chain pushes towards considerations of company strategy and integration of organization and operations. The value network conception highlights the dynamics of alliances, partnerships, customer behavior and the complexity of channels and interactions; note that the value chain never mentions any of these and implies a single company, with fixed channels and relatively docile customers. The value network is decidedly not a command and control framework and is bottom-up, not top-down, in its implications for planning. It does not offer decision-makers much guidance in selecting opportunities and defining strategy. The value chain reflects the world of car makers, retailers and banks; the value network that of such companies as Yahoo!, Amazon and Google. Neither is more accurate than the other and neither is a model of “reality”; each is an abstraction from it. They each deeply, immediately and powerfully influence perceptions and hence thinking and hence decision-making. The point to make here is how different they look. Visual representations should guide thought.

In Arnheim’s phrasing, visual thinking rests on the intelligence of visual perception. He sees the sense of sight as “the most efficient organ of human cognition.”69 That is a contentious point, of course, more so in the business and public policy world of decision-making than in the fields of art and, to a lesser degree, science, which are Arnheim’s own areas of focus. Many skilled experts in business, technical fields, academia and public policy would disagree with it, and obviously, highlighting the importance of visualization is not to eliminate the need for other media, including traditional numeric- and text-based reports and displays. To a large degree business lives in two main media as the core of decision-making: verbal communication in teams, meetings and brainstorming sessions, and numbers in analyses, reports, business plans and budgets. A common adage is that if you can’t measure it, you can’t manage it, and metrics are very much at the core of many companies’ decision
68 One of its correlates is that dot coms which spent fortunes to attract transactions but failed to build the branding and power hub presence inevitably go broke, while spokes that are cost-efficient and specialized in their services can make money through their role in the overall value network.
69 Rudolf Arnheim, Visual Thinking, University of California Press, 1969, page vi.
governance – the oversight and guiding principles for decision processes at all levels of the organization.

We appear to be in a transition period as a society in the balance between these different forms of interaction and organization of “information.” Historically, computer-based information came to mean numbers – “data.” Management information systems processed numbers, databases stored numbers and, more recently, spreadsheets manipulated numbers. Non-computer information meant words: books, reports and verbal interactions. Even today, the everyday world of business decision-making is still dominated by people talking about data, computers producing numbers and analysts writing up results. Video and animation are largely seen as part of entertainment, training or communications (such as for investor relationships), not as core to Decision Enhancement studios.

It will take time to change this; the social side of IT usage has always lagged well behind technology innovation, often because people lack trust in either the tools, the tool builders or their own ability to use them. It took well over a decade for the PC to move from “managers don’t type” to “I have a PDA, laptop, mobile phone and I can’t live without e-mail.” It also took well over a decade for e-mail to reach critical mass, with many systems in use in the early 1980s but with little sustained institutionalization and routinization until the mid-1990s, largely fueled by the Internet.70 Less than ten years ago, most executives were also very uncomfortable and awkward in using videoconferencing or appearing on video. All that has changed and the same will be the case for the visualization tools that support visual thinking. But the move will not be automatic; it will need very skilled design of studios and equally skilled design of suites. Alas, most suites do not help in this regard.

Arnheim’s insights may not in themselves offer much guidance to DE suite designers as to how to build tools, but they do offer pointers to studio designers on the criteria for effective suite design. His core point is that to “see” means to see in relations; relationships depend on structure. The appearance of any item in the visual field depends on its place and function in the total view. If a visual item is isolated from its context, it becomes a less comprehensible object. Here are a few of his comments on visual intelligence, with our own brief statements about their implications for DE suites and studios:
70 The first commercial e-mail services, most noticeably MCI Mail, appeared in the early 1980s, with many incompatible “proprietary” vendor products like IBM’s Profs, and in-house corporate systems. Many firms had successful e-mail pilots but within two years, the system fell into disuse. In general, technology change moves faster than we expect but the accompanying social change is much slower.
- Humans search for patterns, analogies and similarities; these create a “unifying power”; the cognitive psychologist Jerome Bruner comments that “All perceptual experience is necessarily the end product of a categorization procedure.” The issue for DE studio and suite design is not what simulations or information resources to provide but what the most effective representations are that these should be built around.

The spreadsheet representation of business, for instance, focuses visual intelligence on tables and resulting graphical displays. It is a very powerful representation and one that just about any manager is familiar with, comfortable in using and able to work with as a communication and collaboration tool. The spreadsheet representation is highly suitable for budgeting, financial planning and “what if?” analysis. It is less effective for exploring assumptions, addressing uncertainties, and including non-numeric considerations. Michael Schrage, whose book Serious Play views simulation as the core of business innovation (and who provided the phrase that is the subtitle of our book – rehearsing the future), discusses how the financial culture became a spreadsheet culture.71 He shows the degree to which the Wall Street merger and acquisitions boom relied on planning and management by spreadsheet. The “model” became the reality. Schrage quotes a European executive: “Our numbers are probably more important than our words.” Another expert told him that dueling spreadsheets were simply surrogates for competing, deeply held assumptions: “Our real battles were over the assumptions, not the models themselves.” Schrage adds his comment that what organizations choose not to model is at least as revealing as what they do choose.

The Group DSS field took a very different view of decision-making from the spreadsheet culture and chose to model assumption-building and assumption-testing processes. Its representations of business rested on collaboration and generation of consensus. Hence, it stressed issues that the spreadsheet culture largely ignores: how to brainstorm options and scenarios, anonymous voting on steps in the decision process itself,
71 Michael Schrage, Serious Play: How the World’s Best Companies Simulate to Innovate, Harvard Business School Press, 1999 (foreword by Tom Peters).
assessment of priorities and selection of targets of evaluation.

Other DE services adopt different fundamental visualizations. The designers of the airport development simulation suites chose as their representation the flows of traffic through the airport. DE studios that focus on business process transformation choose to represent processes as flows of negotiations and commitments. Most BPR and TQM systems represent them as flows of information and work tasks. Unsurprisingly, these visual representations tend to encourage a focus on streamlining the steps in the process. Instead of defining themselves as experts in a particular tool or technology, such as simulation – which tends to mean that they naturally bring the same toolkit and supporting methods, and hence assumptions, to any situation – DE service developers should be thinking in terms of the perceptions they are aiming to enable. The simulation modeler’s question is: how do we capture and represent the decision situation? The DE service designer’s is: how do we find a visual mapping between the decision domain and the decision-makers’ perspective and mental frameworks?

- Recognition – the “aha” factor – assigns an object a place in the system of things; even with a random, meaningless Rorschach inkblot, we respond with a recognition and interpretation. Cartoons are an example of recognition; we immediately spot that the red box with a smiling face and clouds of dust pouring out from under its oversized wheels represents a sports car. We do not, though, view it as a “model” of a car but as an abstraction of certain features of cars and their drivers – noise, speed, flash, acceleration, racing and the like. Similarly, political cartoons exaggerate and abstract features of the cartoonist’s view of, say, President Bush or Clinton. Recognition is essential to visual intelligence in all domains.

- There is an aesthetic element in all visual accounts attempted by humans; we look for “clarity”, “quality”, “structure” and “links.” Edward Tufte’s own work on Graphical Excellence, and that of the many researchers, designers and software experts who have built on and extended it, emphasizes not esthetics in the artistic sense – indeed,
Tufte criticizes over-attention to visual beauty – but in the mesh between function, the purpose, and form, the display. Minard’s graphic of the French invasion of and retreat from Russia is esthetic in that sense. Here is Tufte’s summary:

- Images function as pictures, symbols and signs with meaning: a triangle may be a sign of danger, a picture of a mountain, or a symbol of hierarchy. Here, perhaps, is the single most important DE principle for meshing human thinking and information technology: perceptual images as simulation objects and vice versa; the use of familiar business icons as the main display element of simulation building blocks.

- “Human thinking cannot go beyond the patterns supplied by the human experience.”

- “Language plus imagery creates a fuller range of thought – ‘thought imagery’ that achieves what dreams and paintings do not, because it can combine different and separate levels of abstraction in one sensory situation.”

- Humans are visual, verbal, literate and numerate – in that order.

- Words quickly reach limits of expression and communication.
Arnheim’s book was published in 1969, round about the time that the cursor was first implemented on a computer display, half a decade before color screens, and thirty years before multimedia became ubiquitous, cost-effective and backed up by adequate computer and telecommunications capabilities. He addresses timeless aspects of human thinking. DE suite designers are largely influenced by time-dependent technology developments of tools to support visual thinking. They can draw on a wealth of superb research, technology and case material on the display side of all this. Much of it is brilliant and highly detailed. Examples include work on graphic “anti-aliasing” to increase the apparent resolution of type and graphics on computer screens in order to come closer to the spatial resolution of lined artwork in magazines printed on coated paper; the use of windows, dialog boxes, screen buttons and menus in interface design; typography; depth cueing; the cultural implications across nations of the implicit meaning of colors; symbols and semiotics in the interface; and so on. These are all important elements of the design of DES interfaces, but while the interface is the system, it is not the purpose of the system. Poor interfaces almost
guarantee that a system will not be used voluntarily – and DE studios are very much voluntary for decision-making teams and stakeholders.
Vignette 6: “Grokking” the Infoviz

The spreadsheet allowed people for the first time to “converse with the data.” Now a new kind of software, albeit one with a long history of development behind it, is opening up new types of conversation with more and more data. Termed by geeks “infoviz”, its fundamental aim is to help users visualize immense amounts of data.72 The goal is not to analyze the data; that is a field in itself, particularly in medicine, weather forecasting, geophysics, astronomy and the like, where visual manipulation of huge volumes of data is the best way to identify patterns and clarify meaning. “Most important, the search for further cost reductions is driving firms to use visualization tools. Having automated many of their business processes, companies now collect huge amounts of data that they want to analyze to gain a competitive edge. After rounds of layoffs, companies have fewer people to take complex decisions – a shortage that better tools can help to alleviate.” (Our own emphasis is added here.)

A common example is software that builds cartographic maps to help visualize “shared information spaces”, such as catalogs, inventories and guides. People are comfortable and familiar with maps, so they provide a shared representation comparable to the shared and easy-to-comprehend representation provided by the spreadsheet table of rows and columns. Maps can pack a lot of information into the small space of the computer screen. For instance, the categories of an online directory appear as countries, with their surface showing how many Web sites can be found under each heading. The main sites may be shown as cities – “virtual landmarks.” Users can zoom in to get more detail and also to access numeric data, graphs, and any other data form from which the maps are built.

Other examples are “heat maps”, where value or importance is represented by color – from cold to hot to searing. A Swedish company has built such heat maps for financial institutions to make trading decisions; the heat map tracks share prices, interest rates or exchange rates. The magazine Smart Money provides a heat map on its Web site that lets users track more than 500 shares at the same time. Colored
72 The Economist Technology Review, June 21, 2003, pp. 25-26.
rectangles show a company’s market capitalization. The size shows the relative value, and the colors show whether the price has gone up (green), gone down (red) or is unchanged (black). When a user moves the PC cursor over a rectangle, a small panel containing more detailed information pops up. Users can also drill down into the data, bringing up, say, Web site links to research or trading services.
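The encoding such a heat map uses – area for market capitalization, color for the direction of the price move, a hover panel for the detail – can be sketched in a few lines. The fragment below is our own illustration, not Smart Money’s or any vendor’s software; the ticker symbols and figures are invented.

# Hypothetical ticker data: (market capitalization in $bn, percentage price change today)
stocks = {"AAA": (310.0, +1.2), "BBB": (42.0, -0.4), "CCC": (88.0, 0.0)}

def tile(symbol, cap, change):
    """One heat-map tile: area tracks market cap, color tracks the day's price move."""
    color = "green" if change > 0 else ("red" if change < 0 else "black")
    return {"symbol": symbol, "area": cap, "color": color,
            "detail": f"{symbol}: {change:+.1f}% today"}   # shown when the cursor hovers

tiles = [tile(sym, cap, chg) for sym, (cap, chg) in stocks.items()]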
Figure 5.4: The Inxight interface73
73 Source: http://java.sun.com/products/jfc/tsc/sightings/S04/knowledge/inxight1.gif
Maps are a natural mode of representation for visualizing data. Another is less well suited to everyday comprehension but extends the types of data, use and interpretation that can be superimposed on large and diffuse data resources. This approach is based more on helping users navigate rather than just display. It uses trees or webs. The Visual Thesaurus is perhaps the best-known application. Users key in a word, which pulls up a tree-like display, as shown in Figure 5.4. This clusters related terms together. Again, the user can zoom in, access detailed information, get definitions and access synonyms. Another tree-based tool is “a virtual magnifying glass” that highlights part of a hierarchy, such as an employee’s name. Linked data, such as other employees, is concentrated at the edges so that users can indeed see the forest as well as the trees. Fifty IT firms offer this Inxight interface in their software products.

Other representations are in early use. Grokker, the company that the Economist article refers to (“grok” is a word taken from a well-known science fiction book of the 1960s), displays results of queries as a collection of circles within a circle. A large circle indicates the main category. Clicking on it reveals more circles. Drilling down in this way will eventually get the user to a single document or data item.
Figure 5.5: The Grokker interface74
74 Source: http://java.sun.com/products/jfc/tsc/sightings/S21/grokker/g2_map_only_lg.jpg
All this is exciting and promising – and clearly very useful in many decision domains. It is also built on PC and web-based software that is very usable. However, there is still a wide gap between usability and use. Users have proven slow to adopt new computer interfaces. While they may react enthusiastically when they first see visualization tools in action, they are wary of using them, largely because they must relearn how to do things – and how to see them.
Vignette 7: A Control System for Automated Ground Vehicles

This vignette describes an unusual approach to modeling a complex control system via a combination of a visual simulation suite and a physical laboratory, with a dynamic interplay between the two. Changes to the simulation – rehearsing a new future – immediately show up in the operations of the laboratory, and vice versa. The specific context is automated transport for moving cargo between the Dutch Flower Auction market and Schiphol Airport, but it is representative of many other domains, including transportation systems in general, supply chain management systems, and many manufacturing processes. These have in common the feature that they are all well-defined in their operations and operational requirements, but complex to predict in their interactions.75

The decision challenge for Aalsmeer, the Flower Auction, and Schiphol airport, working collaboratively, is to design a rapid, reliable and cost-effective transportation system using a proposed new rail terminal linking the two locations. Freight cargo handling of flowers is very time-sensitive and the performance of the system is critical to the Dutch economy; this is the nation’s largest export. The new system involved many design decisions: a sustainable and flexible layout, efficient terminals, development of vehicle control systems, safety procedures and, most critical of all for the DE application, the automation of the movement of ground vehicles.

In such contexts, simulation is mainly used in the analysis phase as an input to design. Implementation and run-time operations are detached from the simulation modeling exercise, almost literally; the model gets thrown away once implementation begins. The innovation here was to develop a simulation that can be used across the entire engineering and decision cycle.

AGVs – automated ground vehicles – carry cargo from the Flower Auction to the rail terminal and then on to the airport. Figure 5.6 below
75 Think of the New Jersey Turnpike at rush hour, on Sunday morning and after a New York Giants football game; this is an example of a complex control system.
shows three levels of detail for the system flow, all taken from the simulation suite:
Figure 5.6: Three levels of AGV system flow
Traditional simulations can model this system as a flow, but the more consequential decision issue is to develop a control system for it. The goal is to get as close to real-time automated operations as possible. That means designing the system to capture and control the behaviors of the vehicle complex. New AGVs and control systems to handle them must be developed for the real-world system based on modeling and analysis. So why not combine the abstraction of modeling with the concreteness of physical prototypes?

The DE team was commissioned to build an AGV test site in which small-scale but still quite large AGVs operate. They are linked to and literally driven by a Web-based simulation model. This is used to test concepts, the technical design of the AGV control system, performance characteristics of the AGVs, and software features to be built into the AGVs so that they can operate automatically. This combines two worlds – the totally physical of the test site and the totally abstract of the simulation; virtual becomes real and real becomes virtual at the same time. It leads to some interesting and powerful decision process opportunities, ranging from simulation (test concepts) to real-time control (test the physical system) to emulation (test the AGV PLC – program logic control) to prototyping (run the physical system). In each instance, changes made to the physical system show up in the simulation and vice versa. The model can add, say, a new vehicle and immediately the test site AGVs respond. In the same way, the
simulation may test changes in such parameters as speed and safety mechanisms and the model and AGV system will respond immediately. In a way, the model becomes a design blueprint. The model runs policy and records the results. The model enables “mixed-mode” testing, as shown below.
Figure 5.7: Test Site AGVs
The test site AGVs are not tinker toys, though they are much smaller than the real vehicles; they are close to the size of a compact car. They move through the test site, which is around a hundred meters square, stopping at terminal stations. They use radio sensors to avoid crashing into each other. The PLC – program logic control – is a combination of software and hardware that manages their operations. Much of the design agenda is to develop this control automation capability.

The AGVs are in effect controlled by the model, and vice versa. This permits an interchange of real systems and simulations: the model simulates the designed system and its controls, the simulation also tests the control system on the test site reality, and the new control software in the AGVs tests the simulation.
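The mixed-mode coupling can be pictured as a single loop in which the model and the test site repeatedly exchange state. The sketch below is our own, deliberately simplified illustration of that loop in Python, not the software built for the Aalsmeer-Schiphol project; the class names, the nominal speed and the mode labels are assumptions made for the example.

import time

class SimulationModel:
    """Web-based AGV model: advances vehicle positions one time step at a time."""
    def __init__(self):
        self.vehicles = {}                       # vehicle id -> position along the route (meters)

    def step(self, dt, sensor_readings):
        # real sensor data, when present, overrides the modeled state
        self.vehicles.update(sensor_readings)
        for vid in self.vehicles:
            self.vehicles[vid] += 1.5 * dt       # nominal AGV speed in meters per second
        return dict(self.vehicles)               # setpoints commanded back to the vehicles

class TestSite:
    """Stand-in for the physical test site: reads sensors, sends commands to the AGVs."""
    def read_sensors(self):
        return {}                                # radio-sensor positions would arrive here

    def command(self, setpoints):
        pass                                     # drive the physical AGVs toward the model's setpoints

def mixed_mode_run(model, site, mode="emulate", dt=0.5, steps=100):
    """One co-simulation loop: the model drives the AGVs and the AGVs drive the model."""
    for _ in range(steps):
        readings = site.read_sensors() if mode != "simulate" else {}
        setpoints = model.step(dt, readings)
        if mode != "simulate":                   # pure simulation never touches the hardware
            site.command(setpoints)
        time.sleep(dt)

Switching the mode flag is what moves the same model from testing concepts to emulating the PLC to driving the physical prototype at run time.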
PART III: DECISION ENHANCEMENT SERVICES

Chapter 6
Making Simulation Part of the Executive Culture, Not just “Modeling”

Fundamental to our conception of Decision Enhancement as a professional field of practice that makes a substantive and institutionalized contribution to decisions that matter is the role of simulation modeling suites and studios. By and large, most organizations make effective use of data and basic analytic tools such as CRM and ERP software, mainly through experts who synthesize and present the results. We take as a given that “information” will be used by executives in their decision processes. We see this as decision support, a relatively passive part of the wider decision process. The challenge for Decision Enhancement is to move the more proactive suites of simulation modeling into studios that enable collaboration among diverse stakeholders. This chapter provides guidelines for beginning this innovative shift in the role of technology in decision-making.

Table 6.1: A poll result regarding the question: “Would you trust a computer to make strategic business decisions?”
Response                          (%)
Yes, definitely                     4
Yes, but with reservations         13
Possibly                           35
No, definitely not                 48
Making the shift will not be at all easy. It will demand a cultural change that will take the same time and effort. A poll carried out in early 2004 asked the question:76 “Would you trust a computer to make strategic business decisions?” The results are shown in Table 6.1. This means that, statistically, if five stakeholders were to work together on a decision that matters, there is only about a 4 percent likelihood of all of them being willing to trust a computer model’s choice of solution. Fewer than one in five executives is actively willing to use such tools. A reasonable target for the DE practitioner would be to shift the 35 percent who are in the “possibly” category into the “yes, but…” one.

It is very reasonable for managers to have reservations about the use of Decision Enhancement suites, but those reservations can be reduced through the three main principles we recommend for a strategy to move simulation modeling into the executive suite. The first principle is self-evident and has been the main emphasis of our argument: focus on decisions that matter. The second is the main topic of this chapter: define and implement a technology architecture – part of decision governance – that is based on distributed building blocks. The third is to build and institutionalize DE studios; this is the topic of Chapter 8.

The reason why we see the technology architecture as so critical to making simulation modeling part of the executive culture is reflected in the survey figures. If we break down the original survey question, “Would you trust a computer to make strategic business decisions?”, into three more detailed ones, our “guesstimate” of the likely answers highlights the technology itself as the bottleneck to adoption, and hence the priority for governance. The questions are:

- Do you see a role for computers in contributing to your decision-making in decisions that matter?

- Are you willing to use a trusted facilitator and domain expert (in finance, marketing, industry specialization, etc.) to help guide your decision-making?

- Are you comfortable in relying on computer models and analyses in that context?

We would expect the answers to be far more positive for the first two questions than for the third. Here is a likely profile of responses, based on research findings and our own experience.
[76] E-zine Poll 2004.
Table 6.2: Responses on more detailed questions regarding the role of technology and decision-making
                                                      Yes    Yes, but   Possibly    No
Would you trust a computer to make strategic
business decisions?                                    4%      13%        35%       48%
Do you see a role for computers in contributing
to your decision-making?                              20%      40%        30%       10%
Are you willing to use a trusted facilitator and
domain expert to help guide your decision-making?    30%      40%        35%        5%
Are you comfortable in relying on computer models
and analyses in that context?                          5%      25%        50%       20%
These figures are hypothetical, and in your own organization the answers may be very different (it may be well worthwhile asking the questions, of course, to get a sense of executive openness to the DE opportunity). Our recommendation is to mobilize DE services by focusing on the "yes, but…" and "possibly" respondents.
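As a back-of-the-envelope illustration of the group arithmetic behind the "about 4 percent" figure cited earlier, the short sketch below treats each stakeholder's attitude as an independent draw from the Table 6.1 shares; the function and its name are our own, not part of the poll.

```python
# Back-of-the-envelope reading of the poll figures: treat each stakeholder's
# attitude as an independent draw from the Table 6.1 shares and ask how likely
# it is that nobody in the group answers an outright "no". Names are ours.

POLL_SHARES = {
    "Yes, definitely": 0.04,
    "Yes, but with reservations": 0.13,
    "Possibly": 0.35,
    "No, definitely not": 0.48,
}

def prob_no_outright_refusal(group_size: int, shares: dict) -> float:
    """Chance that none of the independent respondents answers 'No, definitely not'."""
    p_not_no = 1.0 - shares["No, definitely not"]   # 0.52
    return p_not_no ** group_size

if __name__ == "__main__":
    for n in (2, 3, 5, 8):
        print(f"group of {n}: {prob_no_outright_refusal(n, POLL_SHARES):.1%}")
    # A group of five comes out at roughly 4%.
```

Under these assumptions a group of five lands at roughly four percent, and the share falls quickly as the group grows, which is one reason the "yes, but…" and "possibly" categories matter so much to the DE practitioner.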
Simulation: Opportunities and Blockages to Executive Use

If visual and animated simulation models were as easy and fast to build as spreadsheets, they would be used everywhere in business. A spreadsheet is in itself a simulation, of course: a projection, forecast, and scenario-evaluation. It relies on numbers as its inputs and organizes the numeric results as tables, graphs, and statistics. It can feed dynamic simulation models that rely less on numbers than on mathematical algorithms and that display results as animations and video-quality images. The two styles of model work together well; most of the simulations that we have described in previous chapters and case vignettes draw on spreadsheets and databases.

Where DE simulation suites differ from spreadsheets is that, first of all, most executives do not see them in the same light as spreadsheets; they are not part of their daily experience. Second, today they are far more complex to build and require far more skills and involvement of technical specialists. By contrast, any financial analyst can quickly learn to build fairly elaborate spreadsheets, often drawing on pre-defined templates. The very essence of the spreadsheet was that it was the first
major software development application that fit on a PC and did not require a computer programmer to write "custom" code. The programmer has been taken out of the loop here and the "user" is in charge. The same is the case with Web site development. Packaged tools make it easy to develop a simple site and a whole industry has grown up in the past few years to host sites and provide support services. As a result of such everyday innovations, spreadsheets, the Web and other PC capabilities are routine for any manager. Simulation is not – as yet.

It is thus easy for managers to be unaware of just how powerful and communicative simulation modeling has become. Consider, for example, the animations shown in Figure 6.1. They represent the commonplace, not the exception, in simulation displays. They meet our tests of supporting visual thinking. They provide both high realism and abstraction in the images; this is a powerful combination.
Figure 6.1: Sample animations from simulations
Realism and abstraction are the core of simulation tools. The image below is not a photograph but a fairly realistic computer-generated display. The two that follow it are more abstract; both have their value in rehearsing the future. The three come from the very same simulation model.
Figure 6.2: Santa Fe Environmental Simulation Center model Displays
This model was developed as part of a planning initiative for the city of Santa Fe, New Mexico, by the ESC (Environmental Simulation Center).[77] The design philosophy is very close to our own Decision Enhancement studios and suites and is part of a rapidly emerging mainstream of new generation simulations that represents a similar jump as that from computer programming to packaged spreadsheet software: from customized models to building blocks that are assembled in suites and linked to information and communication resources. Building blocks are the core of a DE architecture because they permit the maximum practical flexibility.

[77] http://www.simcenter.org/Projects/Santa_Fe/santa_fe.html

The simulation was sponsored by city planners for stakeholders and focus groups to assess different growth scenarios for the Southwest area of the urban complex "that has been developing with a scale and character at odds with the rest of the community." The Center, like most providers of DE studios and suites, has built specialized capabilities and recipes, in this case focused on environmental planning, in the same way that our own colleagues have specialized in infrastructure planning, supply chain management and group decision services. This facilitates
the development of domain-specific building blocks and accompanying expertise. The studio was in the form of workshops where stakeholders formally voted on their comfort level with a particular design strategy, in the end favoring a higher-density neighborhood center. The building blocks work in real-time and draw on what are termed GIS (Geographic Information Systems), which are databases of images, many of which are gathered by satellites. These systems provide 3D visualization. The ESC building blocks have been used in a range of applications, all of which center on the same general decision-making domain: policy options in urban and regional planning.

Just as spreadsheets are largely focused on financial planning, even though the tools could be applied to just about any "what if?" decision context that is numbers-based (the Oakland A's baseball team uses plenty of spreadsheets to project player performance, for instance), simulation is domain-limited. This means that, as with both the Santa Fe applications and the airport infrastructure recipes, there is no general-purpose simulation studio and suite yet. They are situational and need expertise applicable to the situation being simulated. The result of the Santa Fe simulation initiative was the production of ten development principles for land-use planning, "which, along with the building blocks, were place-specific and responsive to Santa Fe, and could be adapted over time in a variety of simulations."

Simulation models are not as easily generalized as are spreadsheets. Equally, though, the building block innovation path of simulation suite development is cumulative. Building blocks in one model become part of a library for use in other models in the same general domain. In some fields, there is now a rich set of building block resources that is making simulation modeling an everyday tool for engineers, planners and designers. Examples are construction, urban planning, car manufacturing, and transportation. What makes these particularly amenable to Decision Enhancement methods and suites is that they model physical phenomena and systems. That means that the building blocks are well-defined, and visual images are often both widely available and immediately meaningful and useful.
Shifting Simulation Modeling from Solutions to Rehearsal

Decision Enhancement is a "lens", a way of looking at the world of decision-making. Perhaps the single most central component of that
viewpoint is the opportunity to use simulation to "rehearse" futures. This is a somewhat different lens than that of traditional simulation modeling, a long-established field, which tends to focus on the link between a phenomenon to be modeled and the accuracy of the model that simulates its behavior. This perspective is fully appropriate in many contexts. For example, it has transformed the nature of manufacturing design; specifications for, say, a car, printer, computer or airplane component are converted to a model that permits testing the implications of stress, heat or operational conditions. The models are very much analogous to spreadsheets, in themselves a form of simulation; they show the dynamic behavior of the system – a rehearsing of possible futures.

Traditional simulation methods generally aim at using the power of the model to generate an optimal solution. This can be highly effective, as this extract from an article indicates – but it contains yet another factor very likely to discourage executive decision makers from trusting a model: "Through dynamic optimization, companies can quickly find new, successful solutions to operational problems… Southwest Airlines was able to optimize its cargo activities. Some results were counterintuitive [our emphasis added]; under certain conditions, the recommended strategy was to put a package on a flight headed in the 'wrong geographic direction.'"[78]

[78] http://www.simcenter.org/Projects/Santa_Fe/body_santa_fe.html

The catch here is that it is asking a lot of any group of decision makers to place their trust in a model that produces counterintuitive recommendations. In the Southwest example, that trust was probably built by exhaustive testing and experience. The problem is well-defined, the rules and relationships among the variables are well-understood and the success of the model proven. But the article moves on to discuss modeling applications that we would classify as having all the characteristics of decisions that matter: designing a new organizational structure for an oil and gas company and transforming a pharmaceutical firm's entire drug development model. It highlights many of the points made in our discussion of decision processes in Chapter 3, among them that human intuition cannot predict the collective dynamics of systems containing thousands of parameters. The article concludes: "Most people trust the computers that manage an automated teller machine or fly airplanes. My dream is that people will also learn to trust strategic decisions made by programs: eventually they have no other choice." The point is a very valid one, but no one can order people to trust something. They trust people and processes.

DE simulations use many of the same software and development tools as standard simulation modeling. Where they differ from that standard practice reflects the DE lens: the focus on where and how the models can enhance the decision process and fit into the Decision Enhancement studio. Returning to our distinction between usefulness, usability and usage, simulation is proven in its usefulness, but its usability has been limited in many instances to specialists and experts – a simulation model for airplane component design is not intended for non-engineers to interact with, for example. Since simulation models can demand massive computer processing power, they are generally built to run as efficiently as possible and with limited flexibility.

The DE lens on simulation modeling is highly focused. Its main principles are shown below. They have been illustrated in both the Hartsmartin studio and suite and in several of the case vignettes we provided at the end of chapters. The AGV and DoD supply chain management portal are examples. The principles underlying all these varied simulations are:

- To make the studio process effective, decisions that matter need dynamic models that communicate their results easily and quickly with participants. That requirement poses significant challenges in design, implementation and operation and pushes toward the visual tools to enhance visual thinking that we discussed in our previous chapter. Simulation models built for experts are valuable in providing consulting, advice and analysis but they too often remain outside the decision process.

- Dynamic models should drive all the other tools used in a DE suite, including access to databases, forecasts, reports and planning aids. The model brings these to life and gives them context. It is the infrastructure for the suite.

- The DE priority is to help actors and stakeholders in the process get insight into the situation and into the impacts of possible changes. While the effects of such changes are most difficult to assess in decision arenas that reflect our characterization of decisions that matter – urgent, consequential, non-reversible, uncertain and wicked – these are the ones that are most enhanced by appropriate
simulations, for the reason that without them there is no base for systematic collaborative exploration of options and impacts.

- Decisions that matter almost invariably take place in a multi-actor, multi-stakeholder context. (Actors are directly involved in the decision process; stakeholders are affected by or can influence its outcome.) In many instances stakeholders are also actors, but not always; in the Hartsmartin airport development project, to date senior government officials are stakeholders but not actors. When decisions are made, they need to be based on agreed-on facts and explicit models. Otherwise, there is little basis for collaboration and there will inevitably be arguments about whose data is "correct" and disagreements about interpretations of the situation. The performance metrics in terms of which the impacts of changes are evaluated, the visualizations and the levels of detail of analysis can and should vary, since decision-makers each have differing skills, interests and viewpoints. But the scenarios that drive the studio process should be consistent and based on the same core model.

Current simulation practice has delivered many effective and efficient tools for meeting the technical demands of implementing many of these principles, but with a major exception: they do not yet fit the "multi" elements of DES – multi-actor, multi-representation, multi-context, multi-value, etc. The models are largely monolithic and single-actor; an optimization model, for instance, aims at maximizing "the" solution according to a pre-specified "goal function." In addition, the technology base on which the most established simulation tools are built makes it very difficult to assemble models in zero-latency time – fast enough to keep pace with the dynamic and evolving decision process. The DE services contribution to decision agility rests on meeting these challenges.

Progress has been rapid. The first requirement is that simulation tools used in a suite and to drive other technology components must be dynamic, easy to assemble, use, change and reuse. Next, they must be explicitly designed for multi-user, multi-level development and use. They should enable remote access via telecommunications links, including the Web, and link parties wherever they may be located across the world. While the ideal toolset is still some years away from becoming widely available and easily deployable, our colleagues in our DE work have
shown just how much can be achieved. The AGV, Hartsmartin and supply chain management portal are examples. These technical achievements enhance every dimension of decision process agility. They add speed; scenarios can be developed, explored and evaluated, sometimes in hours. The technical architecture on which they are based enables a new degree of flexibility for the suite and the studio and hence for the overall decision process. The very foundation of the simulation design facilitates coordination – of actors, locations, information and studio processes. All these in turn facilitate collaboration. And they encourage innovation – new rehearsals of new futures.
Simulation as a Key to Innovation

Here are two comments that capture the pluses and minuses of simulation as an enabler of innovation and that help shape the message to decision-makers about why they should consider adopting simulations as part of their own decision processes:

"With simulation models we are free to imagine how an existing system might perform if altered or imagine, and explicitly visualize, how a new system might behave before the prototype is even completed. The ability to easily construct and execute models and to generate statistics and animations has always been one of the main attractions of simulation software… Visualization adds impact to simulation output and often enhances the credibility of the model – and not just for a non-technical audience."[79]

[79] Swain, J.A., Power Tools for Visualization and Decision-making, ORMS Today, February 2001, www.lionhrtpub.com.

"Simulation remains, despite some notable exceptions, an occasional activity performed by a single engineer who is not tied into the corporate mainstream of business planning… It seems that few technologies could be as important to business and governments as ones that predict the future. Yet simulation today does basically what it did ten years ago… Perhaps hardware advances haven't provided the juice needed to take simulation to the next level."

Our own comment here is that in those ten years every aspect of computer performance, from
hardware to software to communications and data resources, has improved exponentially, so that it is clear that there is no technical determinism automatically turning advances in building useful and usable simulations into getting them actually used.[80] The author of the second quotation above believes that the problem is to find the "right software and hardware that will turn opportunity into reality." We do not agree with that assertion, but we do share the perspective expressed in both these quotations.

[80] Hicks, D., Simulation Needs a Jump-start, IEEE Solutions, October 1999, page 18.

Simulation is the very core of DE services. Any major decision that meets our stated criteria for decisions that matter – urgent, consequential, non-reversible, packed with uncertainty and "wicked" – surely must draw on appropriate modeling tools for rehearsing possible futures. Just as it makes no sense for a bridge builder not to simulate the impacts of and on designs – materials, weather, traffic loads, etc. – it equally makes no sense to make key decisions on, say, e-commerce strategy without simulations. We choose e-commerce as our example here because so many dot coms relied on analysts' forecasts of the B2B market being $X trillion by 2002, on experts predicting the death of bricks and mortar, and on plenty of spreadsheet budgets and forecasts. But in our own experience with e-commerce firms, there were no simulations, no DE studios. These would have helped executives structure their thinking, turn the floods of information, claims, predictions and business "models" into a systematic process, and let them explore and argue about assumptions and strategies. Why did we ourselves, as advisers to top management and Advisory Board members in many of these companies, not suggest they should use simulation? Because it would have been a waste of time. Simulation is not part of the executive culture, except for spreadsheets. Spreadsheets are not really simulations for exploring possible futures, but extrapolations. Load them with optimistic scenarios and they will produce optimistic results.

The merits of simulation as a way of thinking innovatively about business are well established. Here is a comment that captures the opportunity: "Simulation is the key to predicting the unpredictable. One cannot write an equation or create a formula to solve today's global market phenomena. But one can design a proactive computer model to capture the interactivity of a market in which every aspect of it changes each time a participant makes a move and each change is different from the last one. Simulation captures markets
interactively because it is active, not passive. It's faster, it's cheaper, and it's safer. If the computer model doesn't compute, so what? No one gets hurt. You simply start again. Simulation helps you fail creatively, fail unexpectedly and, possibly the best of all, fail often."[81]

The author of this quotation, Michael Schrage, is a journalist and peripatetic voyager across the worlds of academia, media and high tech. He has written two influential books, both of which are highly relevant to Decision Enhancement, particularly Serious Play (2000), which views simulation as a way of life in business, science and the arts and prototyping as core to innovation and design. Here are a few of his comments, which are supported with many examples:

- "The best demos let us interact with each other… I discovered that the key to successful collaboration was the creation of 'shared space.' The notion that more or better communication was the essential ingredient of collaboration was false; what was needed was a fundamentally different type of communication."

- "I wasn't playing with ideas. I was playing with representations of ideas. The notion of talking about 'ideas' and 'innovations' divorced from the forms that embodied them struck me as absurd."

- Fuzzy mental models become "externalized thoughts" through demos, prototypes and simulations: "In order to have actionable meaning, the fuzzy mental models in top-management minds must ultimately be externalized in representations the organization can grasp… Prototypes engage the organization's thinking in the explicit – they engage the social senses."

- "As technology transforms the economics of modeling media, it will increasingly force firms to reconsider their economic choices and behaviors. As prototypes become ever more powerful and persuasive, they will compel new intensities of introspection… We have opportunities now to model future worlds swiftly, cheaply and creatively – rehearsing the future." (Thank you, Michael.)
[81] Ken Karakotsios, Getting Real With Simulation, Center for Business Innovation, cbi.cgey.com/journal/issue4
Coming back to the survey that we cited at the opening of this chapter, shifting executive opinion as to the relevance of simulation modeling to their own decision processes rests on getting across the difference between modeling as solutions and modeling as rehearsal. And, of course, the DE studio must use the models to this end.
Designing DE Simulation Architectures and Building Blocks

A key reason for designing DE simulation-based suites as a set of building blocks is to make sure that the simulation adapts to the decision process and decision-makers, instead of requiring them to adapt to it. We believe that this need – and the current general lack of modeling tools to meet it – underlies the response of the majority of executives in the survey we discussed earlier, those who fall into the "Yes, but" and "Possibly" categories. Implicitly, for decisions that matter, they want to retain the opportunity to be creative and to innovate; they see the very structure and discipline of a computer model as a blockage to this.

To exploit the opportunity of simulation as a vehicle for innovation and creativity – the two go together – DE designers have thus had to find ways to solve the problem of the monolithic model and modeling base – the software "engines" on which specific simulations are built. What is needed is to break all elements of DE simulations into building blocks that can be rapidly built, assembled, combined and re-assembled. Traditional simulation models do not need this capability; indeed, it is often most efficient for them to retain their solid base of proven software engines and reliable operations. While it then takes time to develop new scenarios, these can be evolved at leisure, since they are part of a "study." By contrast, when they are used in a DE studio, they are dynamic elements in an interactive process where decision process agility rests on comparable agility features in the simulation modeling process.

The rapidly emerging mainstream of developments in information technology facilitates the use of building blocks. The Internet is now far more than a network; it is also a comprehensive and growing set of "Web services". These are tools and standardized interfaces that can connect "objects" and "agents" – data and software routines – that have the following main characteristics.[82]

[82] Fernando J. Barros, Axel Lehmann, Peter Liggesmeyer, Alexander Verbraeck and Bernhard P. Zeigler, Component-Based Modeling and Simulation, Dagstuhl Seminar Proceedings 04041 (http://drops.dagstuhl.de/opus/volltexte/2006/457)

They are:
- Self-contained: They are literally building blocks. They include within them all the information needed for them to carry out their routine once they are activated by a message that provides needed data parameters.

- Interoperable: The blocks fit together "seamlessly" across models and libraries of software and data resources. It is only within the past few years that the technology underlying the Internet has led to an emerging standardization of how individual technology elements link together, so that they can be reused across applications. (The generic term for these capabilities is Web services; they are the fruition of a forty-year effort to make IT modular. XML is as important in this regard as Windows was to PCs.)

- Reusable: This is the key to the building block approach to suite design and use: the ability to build up a resource base of components that are proven, self-contained and reusable. Instead of having to code new software for each situation and go through the process of test, test and test again before it can be released for use, the very nature of a studio and interactive decision process rests on being able to mobilize resources quickly. Ideally, new models should be as easy to build and change as spreadsheets are.

- Replaceable: The logic of building blocks includes substituting, say, one routine that displays results as animation for another that generates static graphs, or replacing a software block that calculates the passenger-by-passenger flow of traffic through an airport with one that uses statistical averages and standard deviations to capture aggregate system behavior and save time and computation – or the reverse.

- Customizable: While building blocks are the base for a suite, the specific model will always need some degree of customization, especially in terms of ensuring appropriate visual displays and ease of use. Very roughly, building blocks are the foundation for the useful tools in a suite; adding usability and generating usage will require constant "tweaking" and in many instances special-purpose adaptation of generic building blocks.
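A minimal sketch of what such a building block interface might look like in code is shown below. It is our own illustration, not the actual toolset used in the Hartsmartin or Schiphol suites; all class and method names are hypothetical.

```python
# Illustrative building-block interface: each block is self-contained, activated
# by a message carrying its parameters, and interchangeable with any other block
# that honours the same interface. All names are hypothetical.
from abc import ABC, abstractmethod
from typing import Any, Dict


class BuildingBlock(ABC):
    """A self-contained simulation component, activated by a parameter message."""

    @abstractmethod
    def run(self, message: Dict[str, Any]) -> Dict[str, Any]:
        """Carry out the block's routine and return results for downstream blocks."""


class AnimatedDisplay(BuildingBlock):
    def run(self, message):
        # Render the passenger flows frame by frame (placeholder logic).
        return {"display": f"animation of {len(message['flows'])} flows"}


class AggregateDisplay(BuildingBlock):
    def run(self, message):
        # Summarise the same flows as an average instead of animating them.
        flows = message["flows"]
        return {"display": f"mean flow = {sum(flows) / len(flows):.1f}"}


def assemble_suite(display: BuildingBlock) -> BuildingBlock:
    """'Replaceable' in practice: the suite accepts whichever display block is supplied."""
    return display


if __name__ == "__main__":
    suite = assemble_suite(AggregateDisplay())   # swap in AnimatedDisplay() instead
    print(suite.run({"flows": [120, 95, 140]}))
```

The point of the sketch is the swap: because both display blocks honour the same small interface, an animation can be replaced by an aggregate view (or the reverse) without touching the rest of the suite.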
Meeting each of these requirements is in itself a design and implementation challenge; fitting all the pieces together pushes the state of best practice. Almost all standard, large-scale simulation software "engines" – the core systems that handle the computation load of the modeling – were of necessity built on very different principles. The solution to this problem is to define distributed information and communication architectures for distributed organizational processes. The key term here is "distributed." The technology is distributed across sites and systems, meaning that it must be assembled as a coordinated set of building blocks. From the DE perspective, the actors are distributed, too, so that the technology suite must be brought to them in their studio locations on the same basis.

Within a distributed architecture, modeling, analysis, design, simulation, and the operation of different aspects of distributed processes are coordinated by using consistent sets of components or building blocks per decision domain. A domain is a well-defined area of application; transportation infrastructure development is one domain, and the evolution of the Hartsmartin studio and suite has seen the development of libraries of building blocks specific to that domain. By distributed architecture, we mean that the building block components may be located anywhere, some on central legacy systems that provide data input for the simulation, some on Web sites, and some on servers across the company and relevant business partners' systems. The architecture is the framework and set of tools for ensuring that all these elements work together automatically and directly, so that when there is a change to the simulation or a need for new data, or when the model is accessed from new locations, no programming is required, with all the costs, complexity and time delays that programming inevitably involves.

If a simulation model is to be used by experts to produce recommendations, then it need not be distributed in its technology. It can be built and operated on a single site. Additional data inputs can be extracted from other systems as needed and links to other systems can be programmed in as needed. This makes the technical side of simulation modeling much simpler. But it also means that updating models is slow, so that there is little interactive opportunity – that is, they won't work in a DE studio of teams collaborating in real-time. It also means that most of the software will be in a single package maintained on a single hardware resource; this makes it easier to secure, maintain and operate. But it also means that large simulation models accessed from multiple locations create bottlenecks in telecommunications throughput. When a new scenario is run, the entire simulation must churn through its operations.
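As a rough illustration of what "no programming required" could mean in practice, the sketch below shows a toy registry that resolves building-block names to wherever the components happen to be hosted, so that a new scenario can be assembled simply by naming the blocks it needs. The registry layout, block names and endpoints are all invented for the example.

```python
# Toy registry for distributed building blocks: each entry records where a
# component lives (a legacy system, a web service, a departmental server), and a
# scenario is assembled by naming blocks rather than writing integration code.
# All block names and endpoints are invented for illustration.

REGISTRY = {
    "passenger_flows": {"host": "https://legacy.example.org/flows",   "kind": "data"},
    "checkin_process": {"host": "https://sim.example.org/checkin",    "kind": "model"},
    "animation_view":  {"host": "https://studio.example.org/display", "kind": "display"},
}

def assemble_scenario(block_names: list[str]) -> list[dict]:
    """Resolve the requested blocks; unknown names fail loudly rather than silently."""
    missing = [name for name in block_names if name not in REGISTRY]
    if missing:
        raise KeyError(f"no registered building block for: {missing}")
    return [{"name": name, **REGISTRY[name]} for name in block_names]

if __name__ == "__main__":
    for block in assemble_scenario(["passenger_flows", "checkin_process", "animation_view"]):
        print(f"{block['name']:16s} -> {block['host']} ({block['kind']})")
```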
To a considerable degree, the progress of DE simulation suites will rest on how to distribute simulation components so that they can be assembled on the fly. That is because interaction is at the core of DE studio processes. Building blocks at the technology level exploit the emerging mainstream of information technology to break large simulation capabilities into a set of small pieces. Perhaps more important for the future of DE, they also make it easier to incorporate organizational processes into the simulation suite design and studio. In most areas of decision-making in decisions that matter, stakeholders and participants are distributed. Therefore, the technology should facilitate corresponding distributed development, access and usage of distributed information resources. The Hartsmartin case in Chapter 3 is typical. The parties who will directly affect or be impacted by the decision are scattered across many organizations. One of the main problems in institutionalizing the use of Group Support Systems – early, face-to-face DE services – is that they are face-to-face but address distributed processes that involve distributed actors. The building block approach to simulation design facilitates a network approach to all elements of DE simulation – distributed suites, distributed actors and distributed processes.

There is a long way to go in this regard and the early pioneers are spending a great deal of money to get there. The result is what may be termed a decision "net," the combination of distributed tools, actors and processes. Later in this chapter we provide a vignette that briefly describes Toyota's commitment of around a billion dollars to create an ability to model "everything" in its design and manufacturing. What is most apparent about its new architecture is that the billion dollars is basically being spent to enable distributed actors, processes and technology tools.
The Building Blocks of the Hartsmartin Suite

The core of the Hartsmartin airport development suite is a common database to which the tools link and through which they continuously exchange their results – it operates as a central record-keeper and integrator. This differs significantly from the conventional approach to modeling. The standard way of using a model is to "run" the simulation or spreadsheet, look at the results, make changes and run it again. This takes time and effort in manually adapting the data and often, for simulations, requires some additional programming.

The basic building block for the simulation is the passenger. Passengers move through airports, interacting with airport processes, which can
also be represented as building blocks. Passengers shop, go through customs, check in, eat and drink, and they do so in many ways that are not predictable; they are free agents in much of their behavior. All these agents and processes can be defined together with the business rules for their interactions – passengers can walk only at an average speed of X, immigration staff can handle up to Y passport inspections an hour, and so on. Building blocks link the simulation software and data – technically termed objects or components – to how decision-makers think about the airport. Here, the choice of representation is the passenger, but for other airport planning decision situations, it might just as easily be an airplane flight, customs and immigration processes or maintenance facilities. These could draw on the passenger building blocks as part of a library of simulation components. The choice of the passenger representation was made to provide shared understanding among the parties in the decision process; it corresponds to how they view the dynamics of airport operations in relation to the decision that matters: capacity. They can "see" the flow of passenger traffic through the airport via animation, and choose among displays of graphs, numbers, animations and statistics.

From a technical viewpoint, building blocks resolve many problems encountered in simulation modeling. For example, a suite can be developed with many of the blocks pre-built to capture processes common to the decision domain of airport traffic flows. They can be distributed across many computer servers and linked together dynamically. This speeds up development and also speeds up operations: the centralized nature of traditional simulation modeling tools creates run-time bottlenecks.

The Hartsmartin simulation is just one of a number of projects underway that apply the building block strategy. The first of these was developed for Schiphol airport. Figure 6.3 below shows the high-level building blocks plus the static display of an animation that is the result of a simulation run. There are four types of building block in the Schiphol simulation: infrastructure, passenger or group, passenger behavior, and control building blocks. Defining the building blocks requires "domain expertise" – a polite way of saying the design team had better understand airports in detail. Here is an example from a paper that reviews the building block approach and its application to Schiphol:
Figure 6.3: Two representations based on the building blocks [83]
"We decided not to model individual passengers, but groups of persons instead. When studying airport processes, it is clear that a family moves in a different way through an airport than a single person. Take check-in for example. The family checks in as a group. They also have the same speed when moving through the area. They only enter the next area if they can all enter. For some individual processes – for example customs or one family member who goes shopping alone – the group might be temporarily split, and recombined later. Please note also that not all persons who are present at an airport are passengers. Crew members and support personnel also use the areas, and can be modeled just like the passengers [our emphasis added]. Passengers are often accompanied by friends or relatives when they are arriving or departing. These persons also use the infrastructure and should be modeled to get a clear indication of infrastructure use."[84]

[83] Taken from a TUD presentation, Application of DSSNG in Airport System Planning by Uta Kohse, 4/7/2003.
[84] Verbraeck, A. and Valentin, E. (2002) Simulation Building Blocks for Airport Terminal Modeling. Simulation Conference, Vol. 2, Issue 8-11 Dec 2002, pp. 1199-1206.

The key issue for DE services here is that the "real world" actors and their interactions are embodied in the simulation as building blocks. Their behavior is defined by "scripts" of processes, some probabilistic – "on average…" – and others highly procedural – "when checking in, the passenger group must…" There is thus a two-way mapping between
the world and language of the decision-makers and that of the suite designers – and also between those of the players and the studio process facilitator. In a later example of DE in action, we discuss the development by some of our colleagues of building blocks for facilitating and structuring collaborative group decision processes. The same principles apply: think Lego blocks.

Creating an architecture that can define and integrate these Lego blocks in a suite is not a simple exercise. That said, every shift in the information technology field facilitates it. Without going into too much technical jargon, we point to some of the developments that convince us that the building block approach to suites and their use in studios is the path for DES to take. Technology architectures are becoming "component-based", "open" and "standardized." They are also moving towards fully "distributed" technology platforms. Languages such as Java, and standards such as SOAP and XML for linking any type of data or document to any other and to building blocks that carry out some process – calculations, displays, simulation activities, etc. – extend the building block opportunities. The Schiphol suite reflects the rapid progress in simulation development that these components – loosely termed Web services – have helped enable. Initial working versions of models are now built in two weeks. Updates can be made in days. New building blocks can be added almost as fast as they can be defined.
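As a highly simplified sketch of the two kinds of "script" just described – probabilistic ("on average…") and procedural (the group only moves on together) – consider the fragment below. It is our own illustration of the idea, not the Schiphol building-block library, and all parameter values are invented.

```python
# Toy passenger-group building block: probabilistic behaviour ("on average...")
# plus a procedural rule (the group only moves on together). Parameter values
# and names are invented for illustration.
import random

AVERAGE_WALK_SPEED = 1.2   # metres per second, a stand-in figure
CHECKIN_MEAN_TIME = 90.0   # seconds per group, a stand-in figure


class PassengerGroup:
    def __init__(self, size: int):
        self.size = size
        self.clock = 0.0       # seconds elapsed for this group

    def walk(self, distance_m: float) -> None:
        # Probabilistic script: actual speed varies around the average.
        speed = random.gauss(AVERAGE_WALK_SPEED, 0.2)
        self.clock += distance_m / max(speed, 0.5)

    def check_in(self) -> None:
        # Procedural script: the whole group checks in as one unit and only
        # proceeds to the next area once every member is done.
        self.clock += random.expovariate(1.0 / CHECKIN_MEAN_TIME)


if __name__ == "__main__":
    random.seed(1)
    group = PassengerGroup(size=4)
    group.walk(250)            # from the entrance to the check-in desks
    group.check_in()
    print(f"group of {group.size} is past check-in after {group.clock / 60:.1f} minutes")
```

Even at this toy scale, the mapping the text describes is visible: the block's vocabulary (groups, walking, checking in) is the decision-makers' vocabulary, while the probability distributions and rules belong to the suite designers.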
Conclusion: Get the Technology Right!

None of the executives, policy makers and operations planners involved in the Hartsmartin decision process need ever be aware of the specific technical elements of the simulation model. That is the point. The technology should adapt to them, not the reverse. Just as the Web has become an interface to an ever-expanding range of information resources that most executives are ready to trust and, depending on their interests, attitudes and modes of work, to use, the building block strategy for developing simulation models provides them with the same natural interface – natural in the sense that it maps into how they think and perceive the decision situation.

In the opening of this chapter we laid out three principles for helping make simulation modeling part of the executive culture. The first, identify and target decisions that matter, is technology-independent but makes appropriate technology relevant. Addressing decisions that matter is part of strategic planning and executive responsibility.
The second principle, make the studio the process base, is also technology-independent. That is, a task force supported by a skilled facilitator can achieve innovation and collaboration; indeed, most companies invest in leadership training programs and change management workshops that encourage innovation and collaboration. The vignettes of Group Support Systems that we have presented in earlier chapters of Decision Enhancement Services show how much can be achieved by a Group Support System with very limited technology, mostly software for managing and tracking team interactions. So, it is the third principle that expands the space of possibilities from decision-making to decision support to decision enhancement: get the technology right.
Vignette 8: Toyota's Decision Modeling Net

Toyota announced in mid-2002 that it was committing $800 million to $1.2 billion to a massive IT system that will model every aspect of car production from concept to operations, from the automobile's look to the parts that make it run, from the sequence in which components are assembled to the design of the factory itself.[85] This requires a very complex suite of tools, many of which remove the need for human decision-making and automate processes and procedures. Examples are line-production scheduling and component ordering. The system suite includes tools for design collaboration, testing designs for "manufacturability", enabling reuse of parts in different models, and production support.

Some of the priorities for the new capabilities emphasize collaboration and coordination, such as bringing suppliers into the design process and "converse engineering." Traditional industry practice has engineers decide on the details of a car before it goes into production, right down to every single part. The design is then sent to production for prototyping. That is a sequential chain. The new process will let other teams of product engineers design parts not key to the car's styling later on; alternators are an example. "Instead of going from concept to design and speaking of downstream processes like manufacturing, Toyota starts with the idea of manufacturing efficiency and works back towards the concept and design." This enables reuse of parts, such as a hood, as well as speeding up time to market. Engineers will be able to search a library of existing hoods and use the software tools to make changes to the exact shape and contours. They can then automatically test the result for manufacturability.

As design and production engineers work on the look of a new model, separate engineering teams will create a plan which specifies the order in which parts are to be installed as the car moves down the production line. That plan will then be used by software to digitally model the entire factory layout, the assembly steps, the number of people to be stationed at each assembly stop and what their tasks are.
[85] Steve Konicki, Revving Up: No. 4 Carmaker Toyota Hopes To Leapfrog Competitors With Digital-Modeling Software That Links Production And Demand, InformationWeek, April 1, 2002.
Time to market is a priority for all car makers. Toyota has cut this for new models of existing cars, such as the Corolla, from 3-4 years to one. The goal for new models is now 10 months. This is essential to respond to the many car buyers who regard a car as a fashion item and look for new styling, concepts and features, "relying on many of the same impulses that guide the way they buy clothes or electronics."

An InformationWeek article emphasizes the automation elements of the integrated system, as do most discussions of such design and manufacturing innovations. It quotes a DaimlerChrysler executive: "It's a good thing to automate the design of a car, but it's a real good thing to automate design and planning of the production process and tooling and equipment."[86] Taken literally, that raises the question: why bother to hire engineers? Let the software make all the decisions. In practice, of course, the skills and collaboration of hundreds of teams and thousands of engineers, planners, executives and business partners are essential, and the system leverages and supports them but does not replace them. Equally obviously, viewed in DE terms, the integrated software takes away much of the burden of their tasks, provides a degree of coordination via a decision net that would be impossible otherwise, and adds simulation, information management and analytic expertise embedded in software.

The article points out that other companies are working towards the same goal as Toyota, out of competitive necessity as much as competitive opportunity, and are adopting and adapting the very same software. Surely, though, the winners will be the ones with the best combination of human culture and technology leverage.
[86] http://www.informationweek.com/story/IWK20020329S0043
Vignette 9: A Simulation for Call Center Management

A 1999 article entitled Decision Support for Call Center Management Using Simulation parallels our own thinking about how simulation modeling should be used to enhance decision process agility. We are not familiar with the author, who works at AT&T Labs, and review his case example for that very reason. He comes to many of the same principles for Decision Enhancement as we do, albeit by a different path. We see an almost natural convergence of methods and tools towards the principles of Decision Enhancement. If you see modeling as a valuable tool, share the core concepts of Decision Support, and want to apply your knowledge and skills to real-world problems, you will end up building a studio for collaboration and a suite for rehearsing futures. You will also identify visual thinking as a key value created by the use of the simulation model.

Mr. Chokshi reports on AT&T's Business Services Division's 1998 Reengineering Initiative, which spanned "numerous" processes and systems across functions and organizations. The goal was to take a futuristic view – five years out – of AT&T's customer care. That time frame reflected when capabilities created today would be fully implemented. The Reengineering Initiative (RI) was broken down into teams, one of which was assigned to "reengineer and design a seamless end-to-end process view" for the customer. That was about as broad a task as anyone in the company could be asked to take on. It had to include sales and ordering, provisioning, maintenance, billing and collections, and customer support. The team was challenged to drive "tremendous systems and capabilities change." The core of the team was Subject Matter Experts from each business area. The goal was to push the boundaries of ideas and challenge the status quo. A major contribution of simulation here is to identify "to be" processes rather than model "as is" ones.

The team worked from a business integration architecture blueprint that laid out the company strategy and vision. The goal was to bring all processes together at a single service oversight and coordination point, the customer call center. Much of customer care is handled by traffic over the Internet, in work centers and in a combination of electronic and manual functions. The process complexity and fragmentation blocked what has long been the
goal – and delayed promise – of telecommunications service providers: real-time provisioning. The path to cutting through the complexity was through a simulation model that provided an organizing vehicle for moving towards a decision on the "to-be" process design. Here are the stated objectives for the simulation:

- Help visualize futuristic processes and use the simulation as a communication tool
- Validate assumptions of processes prior to implementation
- Analyze the end-to-end impact of change
- Predict high-level resource requirements
- Perform what-if analysis
- Estimate cost savings for business case purposes

Each module in the simulation architecture consists of detailed process activities, including estimates for work allocation and rework needs (in telecommunications services, alas, it is very frequent that the technician finds some unexpected wiring problem or other cause of delay and extra scheduling). The main inputs to the model are the estimated patterns of customer requests, based on historical data, the mix of service activities, resource availability (a key focus for decision-making on changing the process) and process times. Obviously, the modules are complex and detailed. A customer request ranges over many services, with all the implications for service time, process design and cost; examples are billing inquiries, sales information requests, maintenance requests and many others. The simulation capability was built very much as a suite, with an emphasis on modular design, so that an assumption can be quickly changed in one module and the "what-if" implications tested out without having to change any of the rest of the model.

The author of the article provides a few specific examples of the use of the simulation; he states that because of "proprietary information laws" he could not go into much detail. Two areas of analysis were electronic care (ECare) and sales automation (SA). "It is intuitive to feel that electronic care would have significant impact on savings. But how much?" That is a question the simulation can answer. In the mode of inquiry that a simulation-based studio encourages, the team were able
to test the impacts of varying levels of electronic care, from 20% of customer requests being handled by Internet access and internal information and processing systems, up to 50%. The savings were significant as the percentage increased. By contrast, increasing sales automation did not point to savings, which was somewhat of a surprise to the team. Such surprises are one of the values that a simulation can provide, in that it does not confirm conventional wisdom but generates exploration and evaluation. Sales automation deployment moved down the prioritization agenda to "probably last on the list."

The simulation led to real results. Mr. Chokshi lists the key "takeaways." The team was extremely happy to have a tool that helped them visualize processes. The results of the model were the key basis for the business case that the group made for investment and action. The scenario analysis helped them understand the implications of potential decisions and design a revised implementation strategy and prioritization. "The Animated model helped as a wonderful communication tool to the senior management and other organizations, especially software development outfits. The main reason why development outfits liked it was because they can better understand information flows." The team has moved on to the implementation stage, with the simulation being used to model subprocesses and lower level systems. "Unleashing the power of simulation as a decision support technique continues to be a source of pride and accomplishment" for the team.

The listing of "takeaways" ends with a comment that subtly points to the difference between a simulation as model and a simulation as part of a studio: "It is very important that the team, stakeholders, and the senior management understand the scientific basis behind the results and the analysis. Otherwise, it is very easy to be known as the psychic reader with the crystal ball who can predict the future. Simulation loses its position as soon as it is being viewed as a crystal ball. It is the simulation team's responsibility to educate other members." [Our own emphasis is added here.] Our own conclusion from this and comparable successful uses of simulation to enhance a decision process, rather than generate expert answers, is that all model builders who want to make an impact through their services should ask themselves the question: "What is the studio design in which our simulation will be deployed?"
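The electronic-care what-if analysis described in this vignette lends itself to a simple illustration. The sketch below is entirely our own, with invented cost figures, and shows only the shape of such an exploration – vary the share of requests handled electronically and compare the resulting handling cost – whereas the model described above worked at the level of detailed process modules.

```python
# Back-of-the-envelope what-if: vary the share of customer requests handled
# electronically and compare total handling cost. All figures are invented;
# the model described in the vignette worked at the level of detailed modules.

REQUESTS_PER_MONTH = 100_000
COST_PER_AGENT_HANDLED = 12.0   # dollars per request, invented
COST_PER_ECARE_HANDLED = 2.5    # dollars per request, invented

def monthly_cost(ecare_share: float) -> float:
    """Total handling cost for a given share of requests going through e-care."""
    ecare_requests = REQUESTS_PER_MONTH * ecare_share
    agent_requests = REQUESTS_PER_MONTH - ecare_requests
    return ecare_requests * COST_PER_ECARE_HANDLED + agent_requests * COST_PER_AGENT_HANDLED

if __name__ == "__main__":
    baseline = monthly_cost(0.20)
    for share in (0.20, 0.30, 0.40, 0.50):
        cost = monthly_cost(share)
        print(f"e-care share {share:.0%}: ${cost:,.0f} per month (saving ${baseline - cost:,.0f})")
```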
Chapter 7
Coming to a Decision: Studios: People plus Methods plus Suites

Overview

Getting the technology right, where "right" means matched to the decision-maker and decision process via an architecture based on distributed building blocks, provides the foundation on which to develop DE studios. Studios are the embodiment of Decision Enhancement services; systems are not enough. Studios rest on facilitative skills and, in most though not all instances, on technology suites. The risk management application discussed in Vignette 2 is very much low-tech, for instance, with software that basically keeps track of and organizes ideas and discussion. By contrast, in many studios a simulation suite is central to building inquiry and exploration of new options. The AT&T customer care vignette that we presented at the end of the last chapter shows the degree to which a simulation model can be made the base for an entirely new way of looking at a decision situation, communicating and reaching commitment.

Many of our colleagues in fields relevant to Decision Enhancement, such as computer science, simulation and multi-criteria decision-making, generally view the systems they create as the service that they offer to decision-makers. They do not pay much attention to the "soft" skills of facilitation. Where facilitation is little more than a forum for consensus-building, their criticisms of its lacking a decision focus have some justification. We argue strongly, though, that the studio really provides the service and that the effectiveness of the system depends on the quality of the studio in which it is housed. That means that studio design is fundamental, not peripheral, to the practice of Decision Enhancement. The general components of studio development are:

- Landscaping: define the decision context, stakeholders and governance rules for the decision process
- Orientation and initiation: ensure a team with the skills, credibility and domain expertise to attract, motivate, coordinate and help the studio participants move to a decision commitment

- Recipes: apply wherever possible proven recipes that include effective scripts

- Suites: ensure that tools are designed and implemented within an overall distributed architecture

- Process: make commitment to a decision the explicit target and agenda.

It goes without saying that most applications of information technology-based "systems" come nowhere near meeting every one of these requirements. That is why so many firms' investments in expensive CRM capabilities have proven so disappointing, why DSS – Group DSS or model-based DSS – so often fail to become institutionalized, and why so many research, consulting, lab and pilot applications of useful analytic methods and software tools have so little real impact on actual decisions that matter. Looking back at hundreds of initiatives, thousands of articles, and a comparable myriad of case studies, pilots, and vendor reports on the design of technology-based aids to decision-making, our firm conclusion is that the studio is the absolutely critical resource for turning suites into payoff. It is also the foundation for shaping the decision process at the very start. The choice of orientation is thus a strategic move that affects the nature of the collaboration, the content of the analysis and the value of the suites.

We see distinct emerging patterns in the recipes for Decision Enhancement studio design as our colleagues learn from each other's experiences. There are three main types of studio, which we term Learning, Inquiry and Participatory. The boundaries between the three are not fixed, but each effective studio has one of these three styles and resulting scripts:

- Learning studios are intended to help participants build a new understanding that leads to a new sense of options and process. This is closely linked to gaming, and many of the suites developed in learning studios become vehicles for training. The Navy supply chain management portal we summarized in Vignette 3 in Chapter 1 is an example; it became a learning game.
- Inquiry studios are more prescriptive in their focus and style. The goal is to foster critical inquiry, challenge assumptions and seek out original directions for the process. The Santa Fe studio for urban planning was centered on this.

- Participatory studios are much more invitational and aim at encouraging the involvement in the process that is most likely to lead to consensus, agreement and commitment. The main goal of the Hartsmartin studio was to foster such an environment.

Words that go with each of these studios are, for Learning, experiment and review; for Inquiry, challenge and debate; and for Participatory, encourage and agree. We end this chapter with three case vignettes, one example of each style of studio.
Vignette 10: A Learning Studio for Coordinating Distributed Work A highly effective Learning studio was built as a simulation game to help neighborhood teams in the 6,000 person Amsterdam police force begin to design new modes of coordination of their distributed work. The four year project used the game simulation for the police to evolve their own decision principles for new modes of tackling what has become a growing problem for them: the conflict between necessary division of functions and expertise (differentiation and specialization) and equally necessary coordination of interdependencies (integration). This is a challenge for just about every large organization and the study87 provides comparisons with other case examples that include hospital clinical care in radiology departments and hospital ward rooms, consulting and after sales service to customers in a software firm, and coordination of projects in banking. In each instance, the work involves more and more distribution of people and skills. The researcher cites figures that show that in over half of all projects surveyed at least one key participant in the decision process is not co-located with the main team. The complexity of the work environment leads to specialized skills and roles, with a constant challenge of how to fit the very different pieces together. In the police force situation, “The whole has become so large, differentiated and complex that it is impossible for one actor or unit to grasp all relevant detail and or all interdependencies.” The simulation game was developed for the distributed actors to experiment with how they accessed expertise in handling “assignments” – emergency problems demanding fast action in their own neighborhood or area of responsibility. The specialized work is organized by task and function. For instance, Youth Annoyance and Firearms are two highly different units. Add to these specialized functions for handling drug crimes, “car criminality”, burglary, traffic, social services and many others, all spread across 32 neighborhood teams of 60-80 policemen and policewomen, in 8 districts and the coordination problems are close to becoming unmanageable. Decentralization and neighborhood localization is a key 87
87 Joeri van Laere, "Coordinating Distributed Work: Exploring Situated Coordination with Gaming-simulation", Delft University of Technology, 2003.
principle of effective urban police work; hence the neighborhood teams as the unit of organization. But criminals do not conveniently confine their work to one neighborhood; a single incident may involve drugs, guns, social services, burglary, etc. The police force tried hard to coordinate all this through standard methods: task forces, meetings, the assignment of six Regional Project Coordinators and new work procedures. Ad hoc initiatives included offering special training programs to help inexperienced lower-level police officers learn how to communicate with one of the immigrant groups that is a frequent source of tension in community relationships; in one district, "Youth rules the street." Even so, police officers and managers just do not attend meetings outside their district. When a simple booking turned into a major gang fight and a "tense weekend", there was a "commotion" and widespread dissatisfaction with how the police force management coordinated the situation.
The study provides a list of coordination problems – which are really decision process problems, in that immediate response is essential to deal with the situation: lack of attendance at scheduled meetings and very little commitment to the proceedings and "decisions", "pigeon-holing" of expertise, unclear expectations and wasted time in formulating agendas, long throughput times, minimal linkages between individual and group actions, poorly structured meetings, laborious communication and many other issues. This list may appear to suggest that this is a poor organization, but in fact it has a good track record and is highly professional. It has a process problem. It needed a process solution.
Enter the simulation team. Instead of "solving" the problem, it empowered the actors – the people who do the work – to learn how to think about solving it. This contrasts with the obvious expert approach, which is to apply information technology to the situation and use the police force's intranet and e-mail services to improve communication, strengthen the role of the Regional Project Coordinators, formalize the meetings and impose policies to ensure that this time District 6 attends, provide more training, and so on. There are some innovative possibilities to explore, such as providing additional wireless communication systems – it is fairly common to see a police officer on patrol in the city on a bicycle that has a laptop strapped on the back. The team leader, who had worked with the Amsterdam Police Force for some years, suggested that a simulation game be developed. He saw the main issue as how teams can search for and make use of know-how outside their own face-to-face environment. This is very much a
personal choice. Police officers may decide to rely on their own expertise, which is very much part of police work across the world, with local knowledge, contacts and experience closely guarded and highly prized. With the simulation game, they instead search for information and help on the police force's intranet, contact a knowledge center, such as the regional project coordinators' offices and service and administrative personnel, go through their own list of known experts and make direct contact with them, or post their problem on a bulletin board via e-mail88. The game allowed members of neighborhood teams to test out the options. The simulation game administrators designed "assignments" – problems requiring decisions and actions – that were sent to the players. The game sessions took 1 to 1½ days, with a preliminary orientation, a first playing cycle followed by a joint review, and a second playing cycle. The game was prototyped in 1999 and implemented nine times in 2000-2001. The average number of assignments solved per cycle was 30, with high variability in both numbers and complexity across districts and types of problem. The game was as distributed as the work is, with several central game locations and small cubicles available in district stations and offices. Jurors evaluated the quality of performance in solving the assignment and in how the work was coordinated. The information was fed back to the players, of course, with plenty of opportunity for discussion. The game was a success and interest was high.
The results were striking and justified the goal of "understand before design." The players were the observers of their own experience. In 66% of the situations, they chose to rely on their own expertise. In 42%, they used the police intranet; they complained of its many limitations, including a weak search engine, but it became a valued tool in the game, which it has not been in daily operations. The help desk and bulletin board features were a bust – used in just 3% of the assignments. The explanation for this mix of choices was the drive for "lowest psychological costs." In the police force culture, there is a risk of looking weak in asking for help – hence the low usage of the bulletin board. Police show their vulnerabilities only to a few trusted colleagues. In terms of quality of solution, the jurors rated use of the intranet as the most effective mode of coordinating access to expertise – 4.7 on a five-point scale. The help desk scored just as high, well ahead of direct tapping into experts (4.04), reliance on one's own expertise (3.67) and use of the bulletin board (3.42).
88 This last option sounded promising.
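Results of this kind lend themselves to simple tallying across game sessions. The sketch below is illustrative only: the log format (which coordination modes a team used per assignment, plus the jurors' quality rating) and the sample data are invented for the example, not taken from the Amsterdam study's records; it simply shows how usage shares and average ratings per mode could be computed.

```python
# Illustrative only: tally how often each coordination mode was used and the
# average jury rating for assignments in which it was used. The record layout
# and sample values below are hypothetical.
from collections import defaultdict
from statistics import mean

session_log = [
    {"assignment": 1, "modes": ["own_expertise"], "jury_score": 3.5},
    {"assignment": 2, "modes": ["intranet", "own_expertise"], "jury_score": 4.5},
    {"assignment": 3, "modes": ["known_expert"], "jury_score": 4.0},
]

usage = defaultdict(int)     # how many assignments used each mode
scores = defaultdict(list)   # jury scores for assignments using each mode

for record in session_log:
    for mode in record["modes"]:
        usage[mode] += 1
        scores[mode].append(record["jury_score"])

total = len(session_log)
for mode in usage:
    share = 100 * usage[mode] / total
    print(f"{mode}: used in {share:.0f}% of assignments, "
          f"average jury score {mean(scores[mode]):.2f}")
```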
The result of this Learning studio has been that the police are now working together and with the outside advisers to design a comprehensive new approach to coordination that includes extended use of the tools provided in the simulation. They offer a sophisticated perspective on how to make all this work organizationally, psychologically and situationally. What is most striking about this studio is that it is hard to come up with a better approach to handling a very complex and long-standing problem. The studio principle here is to let actors test out the possible design through consultative and facilitative guidance. Learn by doing. Do by learning.
Vignette 11: An Inquiry for Urban Infrastructure Planning in Houston
A recurrent theme since the 1970s in the literature on decision-making, one that has strongly influenced thinking about the design of information and decision-making tools, has been the concept of Inquiring Systems.89 These are a way of dealing with wicked problems via a "dialectical" process. That is a complicated way of saying that there are some decision environments that are totally open-ended, lack even a clear definition of the problem, involve answers that are neither true nor false, are dominated by ethics and politics, and never really get resolved. Examples are what to do about Social Security, legalization of drugs, financing public education, tax reform, corporate governance, and business environmental responsibilities. At their extreme, wicked problems are outside the domain of Decision Enhancement, and studios and suites can be little more than weapons in the rhetorical debates among opponents. Proponents of Inquiring Systems argue that it is practical to influence such decision processes by recognizing that they are inherently conflict-laden. Rather than evade that reality, take it on directly and make it the basis for a dialectical process.
Proponents of this approach draw on classical Western philosophy to distinguish between five modes of inquiring systems – we would term them studios rather than systems: Leibnitz, Locke, Kant, Hegel and Singer. Leibnitzian inquiry is seen as highly analytic-deductive and Lockean as inductive-consensual. These are generally seen as "old thinking." Leibnitzian inquiry systems, for example, "create knowledge by using formal logic and mathematical analysis to make inferences about cause-and-effect relationships… decision-making procedures in Leibnitzian organizations exhibit a strict, formal, bureaucratic "by the book" approach." Obviously, one implication of the characterization is that this leads to much of the standard toolkit and planning methods of computer science and analytic disciplines such as economics. The Lockean organization is highly empirical and focused on shared communication and information: it "clearly exemplifies the interpretative knowledge management paradigm" of data warehousing,
89 C. West Churchman, The Design of Inquiring Systems, Journal of Information Systems, March 2004.
groupware tools and e-mail. This is viewed as very much the world of Decision Support Systems. The Hegelian perspective introduces dialectics: the most effective way to create knowledge is to observe a debate between two diametrically opposed viewpoints, with a thesis – a passionately held position – being argued and followed by an equally intense antithesis. A third party, the "objective observer", analyzes the debate and generates a synthesis. This approach is captured in the standard description of Jesuit debates in theology. It has been found to be an effective way of surfacing assumptions in strategic planning.90 The Kantian inquiry system adds more viewpoints, arguing that effective knowledge-building relies on developing multiple models based on multiple viewpoints on the same data and empirical situation. The Singerian system, which derives from more modern pragmatic schools of philosophy, is the one viewed as most appropriate for handling wicked problems; it centers on ethical judgments and the taking into account of environmental and human issues. It stresses "connectedness."
The characterizations are somewhat artificial and the choice of philosophers selective. (Where, for instance, are Plato, Descartes, Heidegger, Foucault, or Habermas? If the most influential of the postmodernist philosophers were added to the typologies, there would be some truly weird recommendations for information systems – or non-information systems – design and decision-making processes.) But the value of the basic framework is to highlight, first, the link between knowledge and decision, and then to address the nature of the learning organization. It is very suggestive in its implications for the choice of studio style, as the labels placed on the five modes illustrate: the Analyst (Leibnitz), Realist (Locke), Idealist (Kant), Synthesist (Hegel) and Pragmatist (Singer).
This line of thinking is the base for recipes and studios proposed as a new paradigm for decision support. It is illustrated by a studio developed for urban infrastructure decision-making in the City of Houston. The focus was on the physical facilities for transportation, communication and utilities needed to keep pace with Houston's sprawling growth.91 Of immediate concern was the supply of fresh water, wastewater and storm water processing, and maintenance of roads and streets.
90 Mason, R. and Mitroff, I. (1973) A Program of Research for Management Information Systems. Management Science, 19(5): 475-487.
91 James Courtney, et al., A Dialectic Methodology for Decision Support Systems Design, HICSS 2002.
The studio was based on Dialectic Theory, which “begins with the Hegelian assumption that organizational entities exist in a pluralistic world of colliding events, forces or contradictory values that compete with each other for dominance and control.” Given that assumption, the infrastructure planning studio treats the decision situation as a wicked problem, and its suites and tools emphasize personal perspectives, and “ethics, aesthetics and enlightenment.” The dialectic process gathers data and builds models to capture conflicting worldviews (thesis and antithesis). A goal is to update stakeholders’ mental models and move through a series of iterations that continues until there are no conflicting assumptions. Here is how this applies in practice. The starting point for the Houston initiative was to identify all relevant stakeholders, ranging from citizens, businesses, the mayor’s office, contractors, the media, funding agencies, city departments, and many others. Semi-structured interviews based on questionnaires and guides were initiated (the process is a long one and involves a hundred interviews, of which 37 were complete at the time the article describing the project was written.) The transcriptions were used to identify the multiple perspectives of the parties. The analysis highlighted the many sources of “arguments” and “fights” on almost every single project. It clearly showed that the core issue in understanding the dynamics of the decision process was, according to the stakeholders, fundamentally a matter of “Follow the money”, not about policy or infrastructure design options. A dialectical analysis showed the depth of conflict in relationships and perceptions of other stakeholder groups. When elected officials report that they view city departments as “scavengers” of money and as padding their budget to get as many dollars as they can, then the participatory studio design that we and our colleagues chose to adopt in our own work in infrastructure planning (illustrated in the Hartsmartin case) would not be effective; our studio design assumes a collaborative environment rather than a conflicted one. The Houston design identifies the main decision factors influencing each set of stakeholders. It showed, for instance, that city departments and contractors tended to have compatible viewpoints, as do citizens and the media. There are conflicts elsewhere along all the main dimensions of decision factors: Economic, Need, Environmental, Political, Quality of Life and Ethical issues. The findings are shaping the development of a suite that is relatively standard in design: a group decision support capability that contains a model of the decision process as-is (thesis) and a countermodel of as-could-be. Certain lines of investigation are being downplayed. The implementation team, for example, feels that
elected officials, who know how to use the current decision process to their advantage, are likely to avoid or even attack analysis and recommendations based on the "rationality" of simulation models and related suite components. Accordingly, the preliminary design of the capability is a dialog management service, and the studio process will explicitly help stakeholders learn about the key assumptions underlying the process at hand and the multiple perspectives driving it. Conflicting assumptions will be "isolated" – and a dialog built and observed. That dialog will center on the thesis model and the antithesis countermodel. The goal is to formulate the synthesis.
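Read as a loop, the dialectic process described here has a simple skeleton: capture the assumptions behind the thesis model and the antithesis countermodel, isolate the conflicts, run a dialog on each, and iterate until no conflicting assumptions remain, at which point a synthesis can be formulated. The sketch below is an illustrative reading only; the function names, the dictionary representation of assumptions and the idea of a `run_dialog` callback are invented for the example, since in the actual studio the dialog is a facilitated human conversation, not an automated rule.

```python
# Illustrative only: a minimal skeleton of the thesis/antithesis iteration
# described above, with invented names and a stand-in for facilitated dialog.
def dialectic_process(thesis, antithesis, run_dialog, max_rounds=10):
    """thesis / antithesis: dicts mapping an assumption to a stakeholder position."""
    for _ in range(max_rounds):
        # isolate the assumptions on which the two models conflict
        conflicts = [a for a in thesis
                     if a in antithesis and thesis[a] != antithesis[a]]
        if not conflicts:            # no conflicting assumptions remain
            break
        for assumption in conflicts:
            # the dialog may produce a shared position, or leave the conflict open
            agreed = run_dialog(assumption, thesis[assumption], antithesis[assumption])
            if agreed is not None:
                thesis[assumption] = antithesis[assumption] = agreed
    # the synthesis: whatever assumptions the two models now hold in common
    return {a: v for a, v in thesis.items() if antithesis.get(a) == v}
```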
Vignette 12: A Participatory Studio for Resolving the Group Support System Paradox
A major thrust in the extension of Decision Support Systems has long been the development of what we term Participatory studios, where the aim is to get people involved in the decision process. The tradition of inquiring systems theory would see this as a somewhat limited approach that excludes effective enhancement of decision processes for addressing wicked problems. The Participatory studio begins where the Houston infrastructure studio would leave off: it assumes a mutual willingness to collaborate, and while there will always be some degree of conflict in any decision that matters – values, viewpoints, priorities, etc. – the differences are not blockages to moving as a group towards a decision commitment. Group DSS are thus very much the mainstream of practice in targeting interactive tools and simulation suites to decision processes. The distinction we make between useful, usable and used tools is highly applicable to Group DSS, groupware systems and GSS – the terms are largely interchangeable. For close to thirty years, many skilled designers, process facilitators and strategic planners have developed decision rooms – equivalents to DE studios – in which groups meet to employ GDSS to help them in their decision-making. The paradox is that while there is plenty of evidence that these aids improve the process (useful) and are valued by the participants (usable), they do not take hold and become embedded in the organization (used). They are too often one-off cases, experiments, pilots, laboratories, and prototypes.
Three leaders in the Group Support Systems field acknowledge the paradox and have tried to resolve it through an approach to studio design that is very close to that used to develop the airport and supply chain management simulations described in earlier chapters of Decision Enhancement Services. "ThinkLets" are reusable building blocks for group support studios that help facilitators move towards systematic "collaboration engineering" via a codified packet of scripted prompts, choices and actions rather than the generally ad hoc routines that only a relatively small number of very skilled facilitators and change management specialists have at their command. We described their logic and deployment in Vignette 2 of Chapter 1.
ThinkLets are described in an article, Collaboration Engineering with ThinkLets to Pursue Sustained Success with Group Support Systems, by Robert Briggs, Gert-Jan de Vreede and Jay Nunamaker.92 It begins with the paradox. An analysis of 54 case studies shows that in 86.5 percent of the applications, the organizations and groups using a GSS reported an improvement in performance. Similarly, laboratory studies confirm their value. A survey with over 150 respondents showed an average annual payoff of $1.7 million on typical investments of up to $150,000 in start-up costs (facilities, equipment and network) and $100-150,000 a year in operating costs. A comparative case study of four organizations found a consistent increase in productivity: 50-70 percent per-person time savings, 30-90 percent calendar time savings, and high levels of participant satisfaction. Yet very few successful GSS get extended across the organization. "Field observations at dozens of government, military and industrial sites reveal that GSS facilities often establish a small beachhead in an organization, but then do not diffuse beyond the initial installation. It also frequently happens that, after several years of active, sustained, and documented success, GSS facilities are abandoned by an organization… positive research findings do not translate into technology transfer…" GSS installations become "self-extinguishing." Other commentators have made the very same point.
The authors' explanation for this is that many GSS users reported high perceived user-friendliness in their experiences with the tools when the studio is run by a skilled facilitator, but at the same time they found it difficult to understand what the system was supposed to do for them and for their group when they were asked to run the system for themselves. "GSSs tend to be rich, flexible, full-featured platforms with hundreds of configurable features that can be used to guide groups through many different kinds of collaborative efforts. However, there is nothing on the screen of a GSS to make it obvious what will happen when one or another of the features is activated. Each configuration can have a subtle, yet powerful, effect on the patterns of interaction among group members." An example of this is how, in the brainstorming stages of a decision process, the group handles ranking alternatives, voting on next steps and making and responding to comments. The system decides for them, including whether or not to make the interactions anonymous. A skilled facilitator intervenes and adapts the system to the group and vice versa. Facilitators who run GSS are either hired as external consultants or assigned as internal service providers. By definition, the external
92 Journal of Management Information Systems, Spring 2003, Vol. 19, No. 4, pp. 31-64.
experts are not a self-sustaining community of GSS users, because, when the consultant leaves, "nobody remains to run the technology on behalf of teams." Because GSS facilities operate across organizational boundaries and are used at best intermittently and only occasionally by any one group, internal facilitators lack a solid political and organizational base and are easy targets for budget cuts. "Even a superb business case for a GSS facility may not be sufficient to prevent its closure during a budget crunch. In a typical example, one GSS facility that we observed in a high-technology, capital-intensive industry reported a carefully documented ROI of more than 600 percent. Yet it was closed in less than two years. The business case can be insufficient for two reasons. First, the financial savings on GSS tend to be in terms of cost savings and cost avoidance. Thus, at the end of any one year, there is no line item on any budget that quantifies and draws attention to the value the GSS created for the organization… Second, because GSS tend to be used to support ad hoc special events and non-routine problem-solving, facilitators cannot build a case for structural savings they might cause in the future."
Adding to this is yet another factor forcing self-extinguishment: the very special skills needed to be an effective facilitator mean that such individuals either get promoted out of the job without that role being institutionalized, or the role becomes a career dead end. The authors of the article cite an instance of a company that installed four GSS facilities and trained over a hundred people to be facilitators. A year later, only one room was in use and only a handful of them continued to practice as facilitators.
This is all discouraging news. The authors do not end with that. They move on to explore the factors that make for self-sustaining GSS user communities. The most dominant one appears to be that the GSS was routinely used to support day-to-day execution of mission-critical tasks – not routine tasks but ones that address decisions that matter. An example is an avionics manufacturing plant that must operate to very high production standards. Several times a month, a quality problem emerged on the line. The line workers routinely moved to the GSS facility to locate the cause of the problem and reach a decision on how to solve it. They went through the same five steps each time they met. Therefore they had no need to learn or understand all the possibilities offered by their GSS software. They only needed to learn how to run their five steps. Another example cited in the article is a group of military intelligence officers with whom the authors worked for several years. They used a GSS capability, sometimes all day and every day, in their situation assessment processes. Again, they applied a streamlined and simple
process, so that they needed only a fraction of the GSS features, which they learned in about an hour "rather than learning general GSS facilitation skills in a yearlong apprenticeship. After the training, they had no further need for the facilitator who had designed the process for them." The authors developed two general principles from their research and practical experience:
¾Do not focus solely on "general purpose" facilities. Rather, concentrate on creating solutions for mission-critical collaborative efforts that recur frequently.
¾Do not rely on skilled general-purpose facilitators for long-term GSS success. Rather, have these people design effective collaborative processes, and train "practitioners" to run them.
These principles lead to a recipe for collaboration engineering (CE) via thinkLets. CE relies on "packaging" facilitation skills. A thinkLet is defined as the smallest unit of intellectual capital required to create one repeatable, predictable pattern of collaboration among people working towards a goal. It is everything a facilitator needs to know, in terms of process, tools, techniques and scripts, to predictably create a studio for collaboration. ThinkLets have a name, point to a GSS tool, identify a configuration, and include a script that prescribes the sequence of events and instructions given to a group to produce the collaborative step.
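Those four elements – name, tool, configuration and script – amount to a small, packageable data structure. The sketch below is illustrative only: the class layout and the example instance are invented (loosely echoing the five-step quality review described above), not one of the thinkLets published by Briggs, de Vreede and Nunamaker.

```python
# Illustrative only: the four elements ascribed to a thinkLet, captured as a
# simple record. The example instance is hypothetical.
from dataclasses import dataclass, field

@dataclass
class ThinkLet:
    name: str                     # memorable label facilitators pass around
    tool: str                     # the GSS tool the thinkLet points to
    configuration: dict           # how that tool is set up for this pattern
    script: list = field(default_factory=list)   # prompts and instructions, in order

quality_review = ThinkLet(
    name="FiveStepQualityReview",
    tool="shared categorizer",
    configuration={"anonymous": True, "categories": ["cause", "fix", "owner"]},
    script=[
        "State the quality problem in one sentence.",
        "Brainstorm possible causes anonymously.",
        "Vote to select the three most likely causes.",
        "Agree corrective actions and owners.",
        "Record the decision and schedule the follow-up check.",
    ],
)
```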
The Room is NOT the Studio
"Decision rooms" are commonplace and it is natural to think of a studio as a physical room. For Decision Enhancement, the room is an important consideration, but more in terms of its accessibility and appropriateness to the decision process than its ultra-high-tech features; a superb room does not necessarily generate an effective process and outcome. A DE studio has five main components:
¾Studio style: Learning, Inquiry or Participatory.
¾Decision process coordinators: these include facilitators, domain experts and suite managers.
¾Scripting: the balance between improvisation and formalized methods.
¾Suites and development and support expertise.
¾Location and rooms: the options here range from fixed point to distributed Web conferencing and from simple technology infrastructure to multimedia heaven.
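These five components can be treated as a design checklist. The sketch below is illustrative only: the field names, the `missing_components` helper and the very idea of encoding the checklist in software are invented, offered simply to show how a design team might keep track of which component of a studio is still unspecified.

```python
# Illustrative only: the five studio components above as a configuration record.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StudioDesign:
    style: str                                              # "Learning", "Inquiry" or "Participatory"
    coordinators: List[str] = field(default_factory=list)   # facilitators, domain experts, suite managers
    scripting: str = ""                                      # balance of improvisation vs. formalized method
    suites: List[str] = field(default_factory=list)          # tools plus development/support expertise
    location: Optional[str] = None                           # fixed room, distributed Web conferencing, ...

    def missing_components(self) -> List[str]:
        """Which of the five components still need a decision."""
        gaps = []
        if not self.coordinators:
            gaps.append("coordinators")
        if not self.scripting:
            gaps.append("scripting")
        if not self.suites:
            gaps.append("suites")
        if self.location is None:
            gaps.append("location")
        return gaps
```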
Designing Studios
There are some preconditions for developing studios that are common to all three styles of studio, whether Learning, Inquiry or Participatory. Studios are largely voluntary forums for collaboration. That means that stakeholders involved in the decision context must first of all be ready to collaborate and committed to a problem-solving frame of mind where they look to reach a joint decision. This overall goal must be unambiguous and the criteria for success reasonably clear. There are, alas, many situations where these requirements do not apply: competition substitutes for collaboration, decision-makers withhold information and guard their options, and some of them have as their goal the blocking, not the making, of a decision. We must acknowledge that it is far easier to build models – and also easier to run meetings – than to implement studios, and we do not claim that our own and our colleagues' efforts have arrived at an end point; our involvement in designing and applying studios is very much work in progress, but still progress.
Consider the mix of skills required to fulfill the demands we listed at the start of this chapter – landscaping, facilitation, recipes, suites and process. Very few individuals have them all, and the more skilled they are in their own business, technology or organizational specialty, the less likely it is that they will be strong in the other areas:
¾Landscaping: This is the domain of expertise of the business strategist and domain expert. Airport planning is an example of such a domain; a computer science simulation specialist without access to domain expertise here will be adrift in terms of understanding the decision issues and decision-makers, information resources, processes and the basics of what to model, why and how. In addition, the landscaper has to have some credibility, whether as an insider or outside adviser, with senior managers and stakeholders. Otherwise, the studio is just an exercise or a "pilot", "prototype" or "lab", all of which are euphemisms for "don't take this too seriously."
¾Facilitation: Here, behavioral knowledge and process skills are key. Most companies have employed, in their training or change management programs, brilliant magicians who know how to build teams, facilitate workshops and lead change programs. Most of them, though, are among the last anti-technology bastions of business practice, which means that they have played very limited roles in Decision Enhancement as opposed to decision facilitation. Equally, though, the developers of technology suites are often as process-blind as the behavioral experts are technology-blind. They focus on decision solutions, not on the process of arriving at a solution.
¾Recipes: apply wherever possible proven recipes that include effective scripts. Recipes are proven, repeatable and transferable, specify ingredients and sequencing, permit variations and innovations, and result in something people will eat and come back to for another meal. Building recipes requires research and writing and the willingness to place "secrets" and "methodology" in the public domain. It demands teaching as well: developing a body of knowledge and building a critical mass of skilled practitioners. Since technology moves so quickly, each new generation of software draws on a new generation of developers, and there is little passing on of experience and knowledge.
¾Suites: ensure that tools are designed and implemented within an overall distributed architecture. The goal of suite development is to make the "system" as transparent, easy to access and reliable as the electrical system, where any breakdown is a news item and a crisis. The information technology field is rapidly making progress towards this goal, with the Internet, Web services, multimedia interfaces and component-based architectures combining to transform more and more of interactive technology development, application and use. But we still have a long way to go. There are few off-the-shelf tools that provide for easy integration, so in building suites we still need expert toolsmiths. Most of these are young, focused as they should be on their technology skills and roles, and in general with limited business interest and knowledge.
¾Process: make commitment to a decision the explicit target and agenda. The blockage here is organizational
culture, management style, stakeholder relationships and legacy of existing decision processes.
Chapter 8
Recipes: Applying Proven DE Experience
Here is the definition of a recipe: "the beginning of a prescription or formula for a remedy; the means for attaining or effecting some end; the statement of the ingredients and procedure for making something."93 Decision Enhancement needs recipes. Otherwise, it will remain largely ad hoc, case-by-case and unsystematic. Recipes are instruction guides built on proven experience. They are precise in their formulation but leave open options for variations in their details and provide room for improvements. There are many of them; there is no single dish. They are formalized and tested out by experts and passed on for others to use. By definition, they are repeatable. They emphasize the correct preparation of ingredients and sequencing of steps. Clusters of recipes form a style of cuisine.
In this chapter, we review in detail the evolution and application of one particular DE service that enhances decision-making for business process transformation, and then summarize the main recipes that have emerged from the body of experience and insight accumulated in our work with a wide range of colleagues. We must stress that these are just the ones that reflect our own experience and our sense of confidence that that experience is a reliable base for generalized application. There are many other recipes that researchers and practitioners have developed that do not fit into our own interests, skills and work. For example, our lens for landscaping DE opportunities is strongly focused on simulation modeling and visual thinking. We pay less attention to areas that other DE developers see more clearly in their landscaping, such as suites that rely on data warehousing, data mining, customer relationship management and other data-intensive areas of decision-making.
The studio is the base for recipes. The studio brings DE services into action. The decision situation – its landscape, stakeholder communities and existing process base – is "out there." It will stay out there until a studio brings the components together and creates an agenda. That agenda is made actionable through the studio decision process, which in turn makes the suite tools actionable. The recipes we offer
93 Oxford English Dictionary definition.
in Decision Enhancement Services center on the development of three types of DE service:
¾Multi-actor infrastructure development: This recipe applies to the domain of infrastructure development in distributed decision-making arenas. It applies to environments where visual simulation modeling is a fully appropriate vehicle for capturing the core of the decision phenomenon, context and agenda, and where a building block approach to suite design and integration helps build an integrated capability. The recipe has evolved from at least fifty projects; the Hartsmartin example is a composite of many of these.
¾Participatory studio application: This recipe has also evolved from many initiatives and experiences and is illustrated in several of our vignettes. It extends Group Support Systems practice and impact through systematization aimed at institutionalizing the studios rather than producing "one off" cases.
¾Business process decision management: This is a research-derived recipe that provides a blueprint for recipe development as well as illustrating an effective target of application. It is highly domain-specific and applicable to business processes where cycle time transformation is W3 in its impacts – a win for company, customer and business partner(s).
The entire fields of computer science, decision support, multi-criteria decision-making, simulation, modeling, statistical methods and related disciplines urgently need recipes. Instead, they are generally marked by cases and methodologies, two extremes of a spectrum of structuring knowledge and experience for action. Cases dominate the research field and methodologies equally populate the consulting and vendor world. Recipes fall between these extremes. The reason they are needed is simple: cases do not generalize and methodologies are too generic.
A new field of practice generally builds its momentum through some exciting new case example of a breakthrough. With DSS in the 1970s, the breakthrough was "what-if?" analysis via, first, interactive financial modeling software and then spreadsheets. For business process reengineering, it was basically just two cases: one from an insurance firm that cut the time to issue a policy from weeks to hours (and soon went bankrupt!) and Mazda's handling of accounts payable. For customer relationship management, the exemplar was the probably
apocryphal tale of Procter and Gamble's discovering a correlation between sales of baby diapers and beer on Monday nights in the fall. (The explanation was that when Daddy was asked to stop by the supermarket on his way home and pick up the diapers, he grabbed some beers as well, for watching Monday Night Football on TV.) With e-commerce, a small number of cases – Amazon, Priceline, Ariba, GE and others – were used to predict the death of bricks and mortar, the end of fixed pricing, trillions of dollars in business-to-business e-commerce, and the dominance of real-time auctions.
Such single cases do not generalize reliably. They are special and individual, almost by definition; that is what makes them striking and attention-grabbing. In addition, they do not provide a base for building and transferring expertise and experience. They do, though, provide ideas, test possibilities and generate momentum. Unless these cases are followed up on and recipes evolved, what typically happens is that that momentum slows, ideas dry up, and possibilities turn out to be either special cases, ho-hum recurrences or hype. Skilled consultants and trainers then develop their own implicit recipes, but there is no base for systematically shaping a field of practice.
The above explains why the DSS field – the precursor of Decision Enhancement – lost focus after a very promising beginning and why other areas of the application of technology suites and analytic methods remain almost frozen in time. In MCDM (multi-criteria decision-making) and simulation, for instance, powerful methods and models are little more widely used than they were two decades ago. Cases are the learning curve for recipe-building but they are not in themselves recipes.
The other extreme from case studies as the base for orientation and shaping of practice is structured methodology. Here, researchers, practitioners and/or vendors recognize the need to go beyond cases to principles and methods and thus develop often very detailed "cookbooks" and schematics for diagnostics, planning, design and implementation. The problem is that most of these are over-generalized and over-structured. They do not address the situational nature of the application context or the domain of applicability. Recipes are domain-specific. For example, in our review of studio styles in Chapter 7 we discussed an Inquiry studio recipe that has evolved from research and practice in the fields of the learning organization, knowledge management, and epistemology. It is being applied to urban infrastructure planning in Houston, a decision situation that on the surface corresponds closely to the Hartsmartin airport development context.
But the domains are different in one key regard. The Hartsmartin recipe is based on an assumption of collaboration as already established, limited dimensionality in terms of the issues included in the studio, and the decision problem as one that matters but is not totally wicked. The Inquiry system studio assumes conflict in assumptions as the core of the decision enhancement issue, an almost unbounded range of issues that may or may not become key for stakeholders, and wickedness everywhere. The Hartsmartin recipe leads to the use of visual simulation suites while the Inquiry dialectical theory recipe leads to a dialog management suite. The first implicitly includes “Avoid wicked problem situations – you don’t have the skills or services to offer.” The second implicitly includes “Avoid too much reliance on rationalistic methods and models.” Neither is “correct” but equally neither can be viewed as a general methodology for infrastructure development. DE practice will evolve through its recipes for studios and then through the linking of studios and suites. The nature of recipe development is illustrated by a highly effective approach to transforming cycle time in key business processes in a number of organizations, described below.
Developing a Recipe: Business Process Decision Enhancement
Fundamentally, decision-making is a coordination process. The risk management example of turning a muddled administrative nuisance that damages productivity into a coordinated decision process via Group Support studios and thinkLets rests on new coordination mechanisms. The studios that we have described in Decision Enhancement Services are vehicles for collaboration – the most positive form of coordination. Many of the failures in decision-making, such as the Denver International Airport example, rest on failure to coordinate the roles and interests of stakeholders. When processes are poorly coordinated, they create delays, waste of effort, argument, miscommunication and muddle. These all reveal themselves as increased cycle time. The recipe for slashing cycle time in processes that matter in terms of decision impact shows the extent to which relatively small investments in DE studios and suites can have a substantial impact.
Business Process Management – which we will contrast with Business Process Decision Enhancement – is the widely used generic term for emerging methods for improving process quality, cost, and speed via information technology tools. It offers the same potential contribution to business innovation as did earlier process movements, most obviously Total Quality Management and Business Process Reengineering. It also
runs the same risk as these did of early promise turning into backlash and disappointment. It also reflects the ethos of process automation, rather than decision support. The BPM literature is fairly explicit about this, using terms like workflow automation, straight-through processing, and business rules. There are many procedures where this is very effective, but it seems in its cases and methodologies to provide a very narrow view of "process." There are many processes that fundamentally rest on effective decision coordination. They rely on people and on coordination of people.
Our Business Process Decision Enhancement recipe is derived from analysis and interviews in a number of companies that have successfully implemented major business process transformation. All are among the leaders in their business sectors, and their BPDE initiatives address a key component of their competitive positioning and customer relationships. The payoffs have been substantive – and measurable in "hard" quantitative terms, not just "soft" qualitative ones. The investment has been relatively small; while full figures are not available, the direct cost appears to be in the range of one to two hundred thousand dollars. Typically, the applications take 3-6 months to implement, with a core team of around 10 people, most of whom are not full-time on the project. Several of the applications have won awards from industry associations.
Figure 8.1: The Business Process Decision Enhancement model94
94 Source: Peter Keen, A Recipe for Business Process Management: Slashing Cycle Time. White paper, Action Technologies, actiontech.com, 2004.
The criteria for selecting these companies for analysis were (1) their common focus on cycle time as the target of BPDM opportunity and (2) their common view of processes as fundamentally built around the decision-making interactions of people, with those interactions leveraged by technology. This "closed loop" model and associated tools provide an effective recipe for business process transformation in companies where cycle time is core to both customer and company and where the complexity of coordination and interaction – interaction, communication, judgment, negotiation, commitments, and exception-handling – largely generates the slow cycle times. Figure 8.1 shows the model.
This "loop" model contrasts strongly with the workflow automation view of business process management that is represented in the typical comments below. That BPM approach explicitly aims at eliminating decision-making. It stands in fairly direct contrast to the approach of DE. Obviously, both BPM and BPDE offer many opportunities to organizations, but equally we argue that applying the ethos of BPM workflow automation to decision-making processes is not appropriate except for the lower-level operational routines that are the converse of decisions that matter. Many BPM experts take the opposite view:
"BPM emphasizes the management aspect of automating processes to achieve maximum ongoing flexibility with minimum development and time." (Ebizq.net, 2002)
"As enterprise processes become more automated, and more and more interconnected, one piece of technology refuses to go away: the human being." (Workflow Meets BPM, Infoworld, April 27, 2002)
BPM is: "Streamlining the system of delivering a document to the appropriate recipient and determining where and when the document is sent next." (Business 2.0, April 2002, page 37)
This is a narrow conception of process that offers effective recipes in some contexts but is applicable only to well-structured and routine procedures with few exceptions, limited options for customers/requesters, minimal flexibility in execution by process performers, and minimal opportunity for human capital and organizational skills to make a difference in the effectiveness of the process, as contrasted to its efficiency. These are procedures where the human element adds little value and the business rules can be automated.
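The "loop" view can be made concrete with a small sketch. The code below is illustrative only: it is a simplified reading of the request/negotiate/promise/perform/accept cycle implied by Figure 8.1, with invented class and field names; it is not the Action Technologies model itself, and a real suite would track many such loops, their exceptions and their hand-offs.

```python
# Illustrative only: one commitment loop between a customer and a performer,
# with cycle time accumulating as the loop advances. Names are invented.
from dataclasses import dataclass
from enum import Enum, auto

class Phase(Enum):
    REQUEST = auto()      # customer asks for something
    NEGOTIATE = auto()    # terms, dates and conditions are worked out
    PROMISE = auto()      # performer commits
    PERFORM = auto()      # work (including exception-handling) happens
    ACCEPT = auto()       # customer declares satisfaction; the loop closes

@dataclass
class CommitmentLoop:
    customer: str
    performer: str
    request: str
    phase: Phase = Phase.REQUEST
    elapsed_days: float = 0.0     # decision cycle time accrued so far

    def advance(self, to: Phase, days: float) -> None:
        self.phase = to
        self.elapsed_days += days

# A warranty-approval request, for example, might pass through several such
# loops (dealer -> regional office -> engineering); the decision cycle time
# is the total elapsed time around all of them.
loop = CommitmentLoop("dealer", "warranty office", "approve claim")
loop.advance(Phase.PROMISE, 2.0)
loop.advance(Phase.ACCEPT, 5.5)
```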
BPM is a different domain from DE and hence requires different recipes. The quotations above neglect this factor and argue for a universal – domain-independent – approach to using technology to transform business processes.
The BPDE Recipe
Recipes are domain-restricted. The recipe for beef stroganoff can be adapted to use yellow onions instead of pearl onions or to cut back on the garlic, but it won't work if the cook substitutes fish for beef. The tension between the viewpoint of BPM as automation of workflows and that of BPDE as leveraging people through technology is resolved if the recipes that each of them offers – proven, precise, repeatable, and tested out – are carefully and specifically defined in terms of their domain applicability. Instead of providing "the" definition of business process and BPM – "streamlining the system of delivering a document", for instance – it is far more useful to offer practitioners proven recipes for streamlining the delivery of documents and accept that other recipes differ in their implicit conceptualization of BPDM and hence their domain applicability. The art form in developing and applying recipes is thus to be very clear about their domain features. The BPDE recipe we review in this chapter is for the transformation of cycle time in business processes whose decision-making effectiveness rests on collaboration, coordination, communication, and commitment management among people, leveraged by technology. It is hard to see how a recipe for workflow automation could be effectively transferred to this context, or vice versa. Here are the main steps in the recipe sequence, with short comments added:
¾Step 1: Target "W3" priority processes for cycle time transformation. These are ones that are Win-Win-Win – W3 – for customer, company and business partners, and where cutting cycle time provides self-justifying, self-explanatory benefits that do not involve complex ROI justifications.
¾Step 2: Narrow down the focus to processes where complexity and scale have generated muddle and waste. Do not omit Step 1 by simply looking for wasteful and muddled processes that do not affect business performance if they remain muddled and wasteful. Avoid
this typical mistake made by proponents of business reengineering, which led to streamlining processes that did not and never would generate economic value added for the firm.
¾Step 3: Work with the process as is and transform it incrementally. Move in the style of TQM rather than BPR: a campaign of continuous change rather than a radical attack. Do not try to blow the process up and start afresh. That may be conceptually appealing, but it creates the political, economic, organizational and cultural stresses that brought BPR from the best-selling "Manifesto for Business Revolution" to "Everyone knows business process re-engineering and management can mean trouble!"95
¾Step 4: Build a small team led by a strong process owner from the business. The core team of around 10 people delivers phased results in 3-6 month increments. IT staff are technical advisers, not developers.
¾Step 5: Dynamically map the process as is and as will be in terms of the coordination of requests, promises and the corresponding negotiation-commitment loops. Process is people collaborating to deliver a service. Workflows are procedures within this contracting. They are not the process. All restaurants have essentially the same workflows, but Chez Michel and Mike's Tavern are very different in their processes: value criteria, flexibility of customer options, interactions between waiter and customer, and measures of value – collaborations, negotiations, and contracting.
¾Step 6: Move along a path of continuous, incremental improvements, guided by the maps. This is not a recipe for fundamental, radical and dramatic change.
¾Step 7: Scale, integrate and extend: build, build, build.
The cases, their process targets and the payoffs are summarized below (Figure 8.2):
95 Business Process Management Journal prospectus, emeraldinsight.com, May 2002.
Ford Motor Company (auto manufacturing) – BPDE target: managing warranty services approval and distribution. Measurable payoffs: 69% cut in cycle time; $1.5 million annual cost savings.
General Motors (auto manufacturing) – BPDE target: IT service delivery. Measurable payoffs: 80% cycle time cut; associated labor costs cut by 69%.
International Truck and Engine – BPDE target: special quotes for (very expensive) customized trucks. Measurable payoffs: 40% cycle time cut.
Lockheed Martin (F-16 manufacturing) – BPDE target: problem resolution. Measurable payoffs: 50% cycle time cut; $1 million cost savings.
Lubrizol – BPDE target: new product development for specialty chemicals. Measurable payoffs: "dramatic" impact (figures guarded by the company).
RR Donnelley and Sons (publishing of multi-component education kits) – BPDE target: Graphics Management Division customized production and supplier collaboration. Measurable payoffs: 15% overall productivity gain; the number of "complex component" projects a team can handle increased by a factor of 2-4.
Telecom Inc. (a pseudonym) – BPDE target: customized products and services. Measurable payoffs: 25% cycle time cut; 25% salary savings; 90% reduction in billing issues.
The Laser Center Company (laser eye surgery) – BPDE target: customized products and services. Measurable payoffs: 40% cycle time cut.
Figure 8.2: The BPDE recipe cases and payoffs
These are all very complex processes that span many departments and business partners and that involve many people. For instance, the Ford warranty approval process originally took 60-90 days and required the coordination of up to 15 different departments. The new BPDE system that cut decision cycle time to 23 days – a 69% improvement – has over 600 users on four continents. It includes most of Ford's major brands, such as Ford, Lincoln/Mercury, Jaguar and Volvo. Charles Ragan, Lead Developer for Information Technology and E-business Infrastructure at Ford, describes the warranty approval process this way: "Warranty and recall work is the emergency room of Ford Motor Company… Losses or savings here can affect the company's financial position and shareholder value." (Source: seminar presentation, 2002.) There are 45,000 user accounts, with 400-500 transactions a day. The BPDE system links to 13 production applications. The same scale, business criticality, complexity and volumes of communication apply to the other cases. International Truck and
Engine’s special quote process involves around 16,000 custom proposals a year for trucks that can cost over $100,000 each, and much more for prototypes. The Laser Center’s “branded center” startup processes are a race against time in a market of explosive growth but also aggressive and expanding third party market encroachment; each addition to TLC’s fifty centers is a heavy capital investment, with the start-up costs “prohibitive in all but the largest markets.” Opening a new center also requires coordination among many parties that do not routinely talk to each other: doctors, clinic managers, construction companies, real estate operators, utilities, etc. Lubrizol’s BPDM system for product approval has 1,500 users. Lockheed Martin’s QADS – Quality Assurance Document System – has 500. These are not pilots, prototypes, or streamlining of small-scale and simple processes with linear workflows – well-defined step by step procedures – and limited complexity. They are large in scale, important in their business impact, and above all meet the old adage that time is money. They are highly interactive, involving many negotiations, authorizations, collaborative discussions and exceptions. In many instances, the people-to-people coordination loops address exception-handling, which distinguishes real customer service and real collaboration from rote procedures. For almost every customer, the moment of truth in an established relationship is how a “problem” is handled, an error resolved, a personal touch provided, and an agent able, ready and authorized to override the fixed rules of the procedure. In many ways, any basic recipe for BPDE should be a mix of two principles: (1) embed routine procedures in software (automate) and (2) be exceptional in handling exceptions (leverage people). Product development, customer relationships, teamwork, business innovation are all processes where effectiveness – as contrasted to efficiency – rests on the non-routine elements of collaboration, coordination and commitment. These add up to a business service. The workflows are procedures within a service. The eight cases cover a wide range of specific processes and types of application but they share many common features – about their goal, the target processes, their focus on the human elements of process, scale of effort, risks, choice of tools and implementation strategy. It is these commonalities of aims, ingredients and procedures that make for an effective BPDM recipe for these companies in their business context. Change these and a very different recipe may be needed. Apply another recipe in this context and it will probably fail. But conversely, apply it in comparable contexts and it moves BPDE practice forward. Add more such recipes derived from other sets of case situations and mesh them
selectively with tool- and method-centered perspectives on BPDE and we will have a body of practice that can help build critical mass.
Developing the BPDE/CT (Cycle Time) Recipe
The approach taken to develop a DE recipe is simple:
1) First, define the domain features: In this situation, the key commonality among the cases that provide the experience base for the recipe is that their DE focus was on cutting cycle time and that they achieved very substantial improvements in this area. The second commonality is that they are primarily in manufacturing. The third is their adoption of the closed loop process model. Financial service companies were excluded from the study not because cycle time is unimportant to them – consider the speed of handling mortgage applications, for instance, or of launching a new service – but because the many differences in market segmentation, customer relationships, transaction processing, and organization of financial services versus manufacturing products, among many other factors, could – not necessarily would – blur the picture and obscure the patterns. A further commonality is that all the companies adopted the same approach to process mapping and the development of BPDM software applications. Hence the domain features here are:
¾Recipe target of opportunity: decision cycle time in processes that are a recognized business priority.
¾Business context: manufacturing services and product development.
¾View of process: coordination of the "loops" – requests, promises and hence commitments – among multiple actors.
¾Methods and tools: dynamic process mapping of the loops and creation of software to coordinate them.
¾Implementation: rapid evolutionary development without traditional functional specifications; a business team drives all implementation, with the IT function as expert advisors on architecture, data management, security, etc.
2) Use all available sources of information to reconstruct what happened – motivation, business justification, inception, planning, execution and impact – via articles, interviews, management presentations, technical plans, and case reports. The focus in developing recipes is on the decision process; here the richness of case data is essential to capture what worked and why.
3) Look for the DES commonalities that explain the successes. If there are few, then there is no recipe. Where the patterns are consistent, they form the base for defining the recipe. Obviously, this is a highly interpretive exercise. The focus is on moving from case richness towards methodological precision and grounding.
4) Assess if and where those commonalities form patterns of action and impact that add up to a reliable set of recommendations for BPDM practitioners to adopt and adapt in their own context. If so, this constitutes a recipe.
Earlier chapters in our book capture the two main recipes that we ourselves have developed in collaboration with many colleagues – infrastructure development and group decision-making. They are selective and domain-specific. That is a key element in building Decision Enhancement services, and DE practitioners will develop and apply their own recipes, many of which will be different from our own. Decision-making is highly situational and DE must recognize and adapt to the situation. In all cases, the key issues are to identify the domain in which the recipe is applicable, define the criteria for the DE studio and then build the DE suite. Recipes are the end point of Decision Enhancement as a field of research and practice. In Decision Enhancement Services we have laid out a "story line" for a journey where executives, their advisers, change management specialists, experts in multi-disciplinary fields and technology developers can come together to make a substantive new impact on effective decision-making in any organization.
References
Alter, S. (1979) Decision Support Systems: Current Practice and Continuing Challenges. Pearson Addison Wesley.
Arnheim, R. (1989) Visual Thinking. University of California Press.
Ashe, S. (2003) Exploring Myst's Brave New World. Wired, June 2003.
Barros, F.J., Lehmann, A., Liggesmeyer, P., Verbraeck, A. and Zeigler, B.P. (2006) Component-Based Modeling and Simulation. Dagstuhl Seminar Proceedings 04041 (http://drops.dagstuhl.de/opus/volltexte/2006/457).
Boehm, B., Grunbacher, P. and Briggs, R. (2001) Lessons Learned from Four Generations of Groupware for Requirements Negotiation. IEEE Software, May/June 2001, pp. 46-55.
Boyson, S., Harrington, L.H. and Corsi, T. (2004) In Real Time: Managing the New Supply Chain. Praeger Publishers.
Briggs, R.O., Vrede, G.J. de, Nunamaker, J.F. jr. (2003) Collaboration Engineering with thinkLets to Pursue Sustained Success with Group Support Systems. Journal of Management Information Systems, Vol. 19, No. 4, pp. 31-64.
Cane, A. (2004) Lights. Action. Animate. Financial Times IT Review, July 7, 2004.
Carlsson, C. and Turban, E. (2005) DSS: Directions for the Next Decade. Decision Support Systems, Vol. 33, pp. 105-110.
Churchman, C.W. (2004) The Design of Inquiring Systems. Journal of Information Systems, March 2004.
Courtney, J. et al. (2002) A Dialectic Methodology for Decision Support Systems Design, HICSS.
De Neuville, R. and Odoni, A. (2002) Airport Systems: Planning, Design, and Management. McGraw-Hill Professional.
De Vreede, G. and Briggs, R. (2003) Group Support Systems Patterns: ThinkLets and Methodologies. Proceedings of the 36th Hawaii International Conference on Systems Sciences.
De Vreede, G., Koneri, P.G., Dean, D.L., Fruhling, A.L., Wolcott, P. (2006) A Collaborative Software Code Inspection: the Design and Evaluation of a Repeatable Collaboration Process in the Field. International Journal of Cooperative Information Systems 15(2): 205-228.
Dunn, P. (2002) Leonardo's Laptop, Frontwheeldrive.com.
Earle, N. and Keen, P.G.W. (2001) From .Com to .Profit: Inventing Business Models that Deliver Value and Profit. Jossey-Bass.
Fingar, P. and Smith, H. (2003) Business Process Management: The Third Wave. Meghan-Kiffer Press.
Hamel, G. and Prahalad, C.K. (1996) Competing for the Future. Harvard Business School Press.
Harvey, J. (1996) The Abilene Paradox and Other Thoughts on Management. Jossey-Bass.
Hicks, D. (1999) Simulation Needs a Jump-start. IEEE Solutions, October 1999.
Janis, I.L. (1972) Groupthink: Psychological Studies of Policy Decisions and Fiascoes. Houghton Mifflin Co.
Johansen, R. (1984) Teleconferencing and Beyond: Communications in the Office of the Future. McGraw-Hill Data Communications Book Series.
Kasper, G. (1996) A Theory of Decision Support System Design for User Calibration. Information Systems Research, Vol. 7, No. 2, pp. 215-232.
Keen, P.G.W. and McDonald, M. (2000) The eProcess Edge: Creating Customer Value & Business in the Internet Era. McGraw-Hill.
Keen, P.G.W. and Scott-Morton, M.S. (1978) Decision Support Systems: An Organizational Perspective. Pearson Addison Wesley.
Keen, P.G.W. (1997) Business Multimedia Explained. Harvard Business School Press.
Keen, P.G.W. (2004) A Recipe for Business Process Management: Slashing Cycle Time. White paper, Action Technologies, actiontech.com.
Kerzner, H. (2001) Project Management: A Systems Approach to Planning, Scheduling, and Controlling. Wiley, pp. 638-670.
Kim, E.E. (1997) Tufte on Visualizing Information. Originally appearing as an “Op Ed” feature on the Dr. Dobb’s Journal Web site, August 1997.
Kohse, U. (2003) Supporting Adaptive and Robust Airport Master Planning and Design in an Uncertain Environment. Delft University of Technology.
Konicki, S. (2002) Revving Up: No. 4 Carmaker Toyota Hopes to Leapfrog Competitors with Digital-Modeling Software That Links Production and Demand. InformationWeek, April 1, 2002.
Kouzes, J.M. and Posner, B.Z. (1995) The Leadership Challenge. Jossey-Bass, A Wiley Company.
Lewis, M. (2003) Moneyball: The Art of Winning an Unfair Game. W.W. Norton & Company.
March, J.G. (1994) A Primer on Decision Making: How Decisions Happen. Free Press.
Mason, R. and Mitroff, I. (1973) A Program of Research for Management Information Systems. Management Science, 19(5), pp. 475-487.
Meadows, D. (2003) Groping in the Dark: The First Decade of Global Modeling. Quoted at www.robbert.ca/gs/quotes.html, March 2003.
Murray, E. et al. (2001) Randomized Controlled Trial of an Interactive Multimedia Decision Aid on Benign Prostatic Hypertrophy in Primary Care. British Medical Journal, September 1, 2001.
Nevis, E.C., Lancourt, J. and Vassallo, H.C. (1996) Intentional Revolutions: A Seven-Point Strategy for Transforming Organizations. Jossey-Bass.
Nutt, P.C. (2002) Why Decisions Fail. Berrett-Koehler Publishers.
Porter, M.E. (1996) Competitive Advantage. Free Press.
Samii, M. (2003) Making the Truth Visible – Designing Graphics for Visual Reporting Tools. www.sapdesignguild.org, August 6, 2003.
Schoemaker, P.J.H. and Russo, J.E. (2002) Winning Decisions: Getting It Right the First Time. Doubleday.
Schrage, M. and Peters, T. (1999) Serious Play: How the World’s Best Companies Simulate to Innovate. Harvard Business School Press.
Schwartz, P. (1992) The Art of the Long View. Century Business, London.
Silverstone, S. (1999) Visual Thinking with Arie de Geus. Knowledge Management (destinationkm.com), September 1, 1999.
Simon, H.A. (1997) Administrative Behavior. Free Press, 4th Edition.
Sowell, T. (1996) Knowledge and Decisions. Basic Books.
Surowiecki, J. (2004) The Wisdom of Crowds. Doubleday.
Swain, J.A. (2001) Power Tools for Visualization and Decision-making. OR/MS Today, February 2001, www.lionhrtpub.com.
Thaler, R. and Sunstein, C. (2003) Who’s on First. The New Republic, September 1, 2003, pp. 27-30.
Tichy, N.M. and Cohen, E.B. (1997) The Leadership Engine: How Winning Companies Build Leaders at Every Level. HarperCollins Publishers, 1st ed.
Tufte, E.R. (1983) The Visual Display of Quantitative Information. Graphics Press.
Van der Heijden, M.C., Saanen, Y.A., Van Harten, A., Valentin, E.C., Ebben, M.J.R. and Verbraeck, A. (2002) Safeguarding Schiphol Airport’s Accessibility for Freight Transport: The Design of a Fully Automated Underground Transport System with an Extensive Use of Simulation. Interfaces, Vol. 32, No. 4, pp. 1-19.
Van Laere, J. (2003) Coordinating Distributed Work: Exploring Situated Coordination with Gaming-simulation. Delft University of Technology.
Verbraeck, A. and Valentin, E. (2002) Simulation Building Blocks for Airport Terminal Modeling. Proceedings of the Winter Simulation Conference, Vol. 2, 8-11 Dec. 2002, pp. 1199-1206.
Verbraeck, A., Corsi, T. and Boyson, S. (2003) Business Improvement through Portaling the End-to-End e-Supply Chain. Transportation Research Part E: Logistics and Transportation Review, Vol. 39, No. 2, pp. 175-192.
Wack, P. (1985a) Scenarios: Uncharted Waters Ahead. Harvard Business Review, 5 (Sept./Oct.), pp. 72-89.
Wack, P. (1985b) Scenarios: Shooting the Rapids. Harvard Business Review, 6 (Nov./Dec.), pp. 139-150.
Wagner, H.M. (1975) Principles of Operations Research: With Applications to Managerial Decisions. Prentice Hall.
Walker, W.E., Lang, N.A., Keur, J., Visser, H.G., Wijnen, R.A.A., Kohse, U., Veldhuis, J. and De Haan, A.R.C. (2003) An Organizational Decision Support System for Airport Strategic Exploration. In Tung Bui, Henryk Sroka, Stanislaw Stanek and Jerzy Goluchowski (eds.), DSS in the Uncertainty of the Internet Age, pp. 435-452. Publisher of the Karol Adamiecki University of Economics in Katowice, Katowice, Poland.
Williamson, O. (1995) Organization Theory: From Chester Barnard to the Present and Beyond. Oxford University Press.