Analyzing Design Review Conversations
eBook - ePub

  1. 516 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About This Book

Design is ubiquitous. Across disciplines, it is a way of thinking that involves dealing with complex, open-ended, and contextualized problems that embody the ambiguities and contradictions of everyday life. It has become part of pre-college education standards, is integral to how college prepares students for the future, and is playing a lead role in shaping a global innovation imperative. Efforts to advance design thinking, learning, and teaching have been the focus of the Design Thinking Research Symposium (DTRS) series. A unique feature of this series is a shared dataset: leading design researchers from around the world are invited to apply their specific expertise to it and to put their disciplinary interests in conversation with one another, bringing together multiple facets of design thinking and catalyzing new ways of teaching it. Analyzing Design Review Conversations is organized around this shared dataset of conversations between those who give and those who receive feedback, guidance, or critique during a design review event. Design review conversations are a common and prevalent practice for helping designers develop design thinking expertise, although the structure and content of these reviews vary significantly. They make visible the design thinking of design coaches (instructors, experts, peers, and community and industry stakeholders) and design students. During a design review, coaches notice problematic and promising aspects of a designer's work. In this way, design students are supported in revisiting and critically evaluating their design rationales and in making sense of a design review experience in ways that allow them to construct their design thinking repertoire and evolving design identity.

Frequently asked questions

Simply head over to the account section in settings and click on “Cancel Subscription” - it’s as simple as that. After you cancel, your membership will stay active for the remainder of the time you’ve paid for. Learn more here.
At the moment all of our mobile-responsive ePub books are available to download via the app. Most of our PDFs are also available to download and we're working on making the final remaining ones downloadable now. Learn more here.
Both plans give you full access to the library and all of Perlego’s features. The only differences are the price and subscription period: With the annual plan you’ll save around 30% compared to 12 months on the monthly plan.
We are an online textbook subscription service, where you can get access to an entire online library for less than the price of a single book per month. With over 1 million books across 1000+ topics, we’ve got you covered! Learn more here.
Look out for the read-aloud symbol on your next book to see if you can listen to it. The read-aloud tool reads text aloud for you, highlighting the text as it is being read. You can pause it, speed it up and slow it down. Learn more here.
Analyzing Design Review Conversations, by Robin S. Adams and Junaid A. Siddiqui, is available in PDF and ePUB formats (Design & Design History & Criticism).

Information

Year: 2016
ISBN: 9781612494395

Robust Design Review Conversations

Andy Dong
Massimo Garbuio
Dan Lovallo

Background

The decision to take a product from its conceptual design into detailed design has properties of strategic decisions as defined in the strategic management field. The decision irreversibly commits a significant investment of resources (high degree of commitment) toward delivering or expanding a new product or service (changes the scope of the firm) (Shivakumar, 2014). Researchers and scholars in engineering design and new product development have, as such, motivated their research in decision-based design as having relevance to the strategic nature of these decisions. To improve the quality of design decisions, scholars of design decision making have tended to focus on how to take a decision for a range of tasks consistently faced in new product design and development (Krishnan & Ulrich, 2001). Perhaps the most important decision taken during design is concept selection, the analysis and evaluation of alternative concepts, leading to the selection or consolidation of one or more concepts for further development. A range of normative decision-making tools and methods for concept selection exist, including concept screening (Ulrich & Eppinger, 2004), pair-wise comparison charts (Dym, Wood, & Scott, 2002), concept scoring matrices (Frey et al., 2009; Pugh, 1981), multi-attribute utility analysis (Scott & Antonsson, 1998; Thurston, 1991), and Pareto dominance (Malak & Paredis, 2010). As a consequence, there has been a very robust and long-standing debate surrounding the decision method to use once a discrete set of alternatives is known (Frey et al., 2009; Hazelrigg, 2010; Reich, 2010) and the type of design decisions for which the axioms of decision theory ought to be applied (Hazelrigg, 1998; Thurston, 2001).
Lost in this debate, though, is the quality of the decision-making process itself. Taken together, a broad body of research in the strategic management literature points to the conclusion that decision processes matter first to the performance of the project and second to the performance of the firm (Fredrickson & Mitchell, 1984; Papadakis & Barwise, 2002). More recently, two studies have pointed toward the importance of conversations over numbers and financial analysis in decision making. First, in a study of new-to-the-firm products, resistance was overcome by using micropolitical strategies: by building a coalition of supporters, but especially by framing the product in terms of the firm’s existing products, strategies, and competitive thrusts (Sethi, Iqbal, & Sethi, 2012).
Second, a large sample study of strategic decisions has highlighted how strategic conversations are substantially more important than the financial analysis of a decision in shaping the outcomes of such decisions (Garbuio, Lovallo, & Sibony, forthcoming). In this study, it was “how” the executives talked about the decision and its underlying assumptions that had an impact on whether expectations in terms of market share or profitability were met, not “what” financial analysis was performed.
Building on research on the quality of design dialogue in accomplishing actions and practices (Luck, 2009) that enable the emergence of tangible goods (Dong, 2007; Oak, 2011), this study aims to contribute to the decision-making scholarship in design by investigating the quality of design review conversations. We focus on the situation of the review of design concepts presented throughout a junior-level (third-year) undergraduate industrial design course and the final presentations of an entrepreneurship course at a public university in the United States. The conversations in the industrial design course contain discussions about multiple design concepts, which can lead to the abandonment or further development of design concepts until a final concept is chosen. In contrast, the entrepreneurship presentations communicate a single project and are representative of the type of presentation to an executive committee tasked with making a resource allocation decision (i.e., a go/no-go decision).

Theoretical Frameworks

The evaluation of a design concept is a key part of the design process. By evaluation, we mean assessing the merits and shortcomings of proposed design concepts (e.g., not yet fully elaborated ideas for new products), which takes place throughout the design process until a single, fully elaborated candidate design is selected as the final option. To assist designers in filtering concepts, researchers have proposed creativity metrics (Nelson, Wilson, Rosen, & Yen, 2009; Oman, Tumer, Wood, & Seepersad, 2013; Shah, Smith, & Vargas-Hernandez, 2003; Verhaegen, Vandevenne, Peeters, & Duflou, 2013). The problem we see is that these evaluation metrics call for deductive reasoning, such as quantifying novelty by comparing an idea to a universe of ideas (Maher, 2010; Shah et al., 2003). If the idea is the designer’s own, then the designer may prefer it to others even when there is no rational basis for the preference (Nikander, Liikkanen, & Laakso, 2014). Empirical research on concept evaluation in industry also describes decision makers as tending to apply variables amenable to deductive analysis, including product timing, staffing, and platform, when evaluating innovative projects (Krishnan & Ulrich, 2001; van Riel, Semeijn, Hammedi, & Henseler, 2011). Even in situations where the concept is in its early phases, evaluation techniques employ highly deductive analysis requiring a substantial set of criteria (Ulrich & Eppinger, 2004).
If the purpose of design evaluations were to evaluate concepts only as presented, with no further elaboration possible, then these types of metrics would make sense. The accepted practice is that design evaluation in the selection phase (i.e., when decision makers are presented with a discrete set of options) should examine only the merits of the options, or “merit-based evaluation.” However, we believe that design evaluations should always both evaluate the quality of the design concept and be “forward looking” toward “what might be,” or “opportunity-based evaluation.”
Thus, rather than the evaluation of a design concept being “static,” based only on existing evidence, we propose a dynamic model. We hypothesize that a robust design review conversation should consist of at least two components. The first is strategic analysis. We define strategic analysis in the design context as the extent to which decision makers use evidence to evaluate design quality based upon a priori design criteria such as the requirements. When we refer to evidence, we mean propositions that justify a belief; propositions may include inter alia:
•Observable properties of the concept, such as physical characteristics.
•Arguments based upon belief or experience, such as professional standards.
•Secondary data, such as consumer preference data.
•Claims, such as conclusions drawn from prior evaluations of the design concept.
We hypothesize that the second component of a robust design review conversation is the quality of generative sensing. We define generative sensing as the process of creating new hypotheses to explain, resolve, or challenge the evidence in favor of or against a design concept, evidence that was itself generated from an evaluation of the design concept. Generative sensing based upon the output of the evaluation of the design concept can lead to new knowledge that changes the designer’s view of the design concept, resulting in a reframing of the problem itself (Dorst & Cross, 2001). In design, making the leap from the evaluation of a design concept to a final design concept is not solely about testing the merits of the design concept as a fait accompli. It is about generating a series of tests of the design concept until an appropriate concept is identified. In the context of design evaluation, generative sensing entails inferences to explain the evaluation. These inferences may provide resolutions to problems identified by the evaluation when the evaluation is adverse. In contrast, a positive evaluation may spur the proposition of conditions that would undermine the basis of the evaluation in order to test the robustness of the evaluation.
Our concept of generative sensing shares some ideas with the concept of the primary generator (Darke, 1979). A primary generator is a conjecture, or better stated, a scheme based upon a value judgment as the basis for generating potential solutions. The value judgment, which does not satisfy all constraints, provides a “way in to the problem” (Darke, 1979, p. 38). Generative sensing entails producing hypotheses that may resolve (or further expand) issues encountered in the evaluation of a design concept. Thus, rather than a “way in to the problem,” generative sensing can be seen as creating alternative “ways through the problem.”
The form of logical reasoning underlying generative sensing is abductive reasoning. The concept of abduction in design is philosophically very powerful, as it introduces a mechanism of discovery through a form of logical reasoning. Scholars have theorized that the relevant form of abductive reasoning in design is innovative abduction. Innovative abduction produces an explanation (the design concept) for the desired value, the function, and, in turn, an explanation (the form) for the design concept (Kroll & Koskela, 2014; Roozenburg, 1993). As Dorst writes, designers must engage in a form of reasoning “to figure out ‘what’ to create, while there is no known or chosen ‘working principle’ that we can trust to lead to the aspired value” (Dorst, 2011, p. 524). The term “value” is not restricted to economic or financial value but, rather, covers any values to which the designer aspires (Friedman & Kahn, Jr., 2003; Le Dantec & Do, 2009; Lloyd, 2009). In other words, abductive reasoning in design generates hypotheses that, if true, would explain the form of the proposed product and its mode of operation to achieve a desired value (Roozenburg, 1993). Design theory scholars propose that the major premise that abductive reasoning must infer is the rule that connects a form to its function within an operating environment (Zeng & Cheng, 1991). This logical reasoning from function to form appears to echo Sullivan’s widely cited credo that “form ever follows function” (Sullivan, 1896), although scholars of abductive reasoning in design do not refer to Sullivan explicitly. If function or value is intentional, then innovative abduction in design is about inferring a form that achieves an intended purpose. The purpose may not necessarily be utilitarian or performative.
Roozenburg (1993) introduces the following notation to describe innovative abduction:
q        a given fact (function or value)
p ⇒ q    a rule to be inferred first: IF p THEN q
p        the conclusion
Kroll and Koskela (2014) extend the model of abduction proposed by Roozenburg (1993) and Dorst (2011) into a two-step recursive inference of the innovative abduction: the first step involves abduction of a concept given a function and the second step involves abduction of a form given the concept inferred from the previous step.
Step 1 (from function to concept):
q        a given fact: function
p ⇒ q    first conclusion: IF concept THEN function
p        second conclusion: concept

Step 2 (from concept to form):
q        a given fact: concept
p ⇒ q    first conclusion: IF form THEN concept
p        second conclusion: form
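The two-step inference can be sketched in code. This is a minimal illustrative sketch, not from the chapter: the rule base and the names in it (“vacuum flask,” “double steel wall”) are hypothetical examples chosen only to make the schema concrete.

```python
# Minimal sketch (assumption: a toy rule base, not the authors' method)
# of two-step innovative abduction (Roozenburg, 1993; Kroll & Koskela, 2014).

# Candidate rules of the form "IF antecedent THEN consequent".
RULES = {
    "vacuum flask": "keeps drink hot",    # IF concept THEN function
    "double steel wall": "vacuum flask",  # IF form THEN concept
}

def abduce(q, rules):
    """Given a fact q, abductively infer an antecedent p such that
    'IF p THEN q' is an available rule; return None if no rule fits."""
    for p, consequent in rules.items():
        if consequent == q:
            return p
    return None

# Step 1: from the desired function, abduce a concept.
concept = abduce("keeps drink hot", RULES)
# Step 2: from that concept, abduce a form.
form = abduce(concept, RULES)
```

Repeating the same inference on each new conclusion models the recursive character described above: each abductive step is only a partial resolution of the design problem.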
We propose that the process of design does not (should not) arbitrarily stop. In other words, the participants should continue to propose hypotheses that infer the link between function and form in a recursive manner. Each inference is only a partial resolution of the design problem, the depth of which depends upon the complexity of the problem and the number of subproblems to be resolved (Zeng & Cheng, 1991). Thus, inferring the working principle (concept), which comprises a mode of operation and a way of use (Roozenburg, 1993), can entail multiple recursive inferences....

Table of contents

  1. Cover Page
  2. Halftitle Page
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Acknowledgments
  7. Analyzing Design Review Conversations
  8. Design Inquiry
  9. Design Discourse
  10. Design Interactions
  11. Design Being
  12. Design Coaching
  13. Author Biographies and Contact Information
  14. Index