Fear and Loathing of Evidence in Design Research

Depending on the discourses you follow, you might notice that “design-led everything” has charged ahead with design thinking, speculative design and design futures, empathic HCD, and so on. Design research and advanced methods have lagged in these discourses. Emerging designers could easily believe that a product/service business case can be supported by small-sample field observations and a keen sense of empathy for participants (i.e., would-be customers). At many design schools, the belated rise of human factors (endorsed in second-generation design methods) has drifted off into mishmashes of methods. I hate to admit how little emphasis we give to evaluation in my courses now. In the 1990s, usability and contextual evaluations established a positive reinforcing process in competitive growth cycles. Back when money wasn’t cheap and the web wasn’t monetized by surveillance adverts, you couldn’t afford to launch a weak product. Business decisions were based on data, and we had “real data.” So the recent arguments about evidence are puzzling. Perhaps it’s only the difference between “evidence-based design” and what I call “design with evidence.”

Don Norman has been advocating an evolution in design thinking and education with a stronger role for evidence. Recently, Don listed a scale of levels of rigor in design practice, ranging from designerly intuition to mathematical and engineering models. I added “inputs” to this list to suggest the kinds of evidence for each mode of reasoning. Evidence can be construed as a design input, and as information objects in their own right, discovered in research and evolved through practice. I break the list at 4-5, where the gap between evidence and “argument only” shows up. Into this gap I would propose “stakeholder models,” structured observations by participants in co-creation or dialogic design. These are highly structured, qualitative but rule-based propositions, elicited and validated within a purposive social context. I think they deserve a different type of evidence structure: quantitatively-informed, validated stakeholder constructs. In many cases where we’re dealing with systemic or developmental complexity (observing while changing things), stakeholder models are a useful construct.

Design Reasoning Inputs and Objects

1. Craft-based, sharply honed intuition. Inputs: design materials and repertoires. Objects: cases, forms, precedents.
2. Rules of thumb (heuristics). Inputs: principles, frameworks, heuristics as canonical types. Objects: cases, frames.
3. Best practices (case-based). Inputs: patterns induced from cases, formalized insights. Objects: cases, benchmarks.
4. Design patterns (modified to account for the current problem). Inputs: abductive reasoning; patterns adapted, catalogued, and applied in new cases. Objects: cases, comparisons, argumentation.
X. Stakeholder models. Inputs: structured observations by stakeholders in context. Objects: system maps, diagrams, visual and verbal representations, narratives.
5. Qualitative rules of practice. Inputs: observations, categories, participant data, non-probability samples. Objects: emic user data, verbal and video content and protocols, images, user prototypes, constructed narratives.
6. Quantitative rules. Inputs: structured data, defined variables, survey protocols, distributions from probability samples. Objects: statistics and structured summaries, proportional maps.
7. Computer models. Inputs: model inputs defined from hard cases, reference data sources, and statistics databases. Objects: simulations, functional models.
8. Mathematical models. Inputs: variables, parameters, and reference models (statistics). Objects: algorithms, equations.

Degrees of rigor are not necessarily “better evidence.” The right type of evidence for a stage of design is more crucial than rigor. While 6-8 are conventionally more “rigorous,” they tend to hide the meaningful evidence: the particulars and discovered patterns employed in design practice. Statistics aggregate observations from thinner sources, such as surveys or big data, and as such abstract the evidence away from its source, from the people or settings of use. This is why usability evaluation in situ remains such a powerful form of evidence: a single situation in context can be more convincing to sponsors and engineers than a large-scale user survey. (Then, of course, the convincing interaction may need replication or causal support from quantitative probing studies.)

What do we really know about evidence-based design? Is it a developing trend in design practice, or more a mode of design research that drives design decisions? Is the term premature, or perhaps overstating the case? We might find more agreement if we described models of “designing from evidence.”

Evidence-based design (EBD) has already been reframed, in different ways, by architecture, design education, research, and practice. EBD is not a thing, yet. It’s not what we think it is. This is not a monolithic practice that threatens design traditions. In practice, understanding the contributions of science and standards of evidence contributes to better design decisions in any complex sociotechnical system. Evidence is not a passive construct; it requires a process of data collection and interpretation. Collecting data about or during a design inquiry doesn’t diminish creative or interpretive design approaches.

But perhaps another concern is that a turn toward evidence alters the balance of power designers have worked so hard in the last decade to achieve. Design thinking and co-creation practices have been endorsed as powerful allies in business and social innovation, and perhaps in some ways a strong evidence approach competes with co-creation. I would suggest this is a necessary rebalancing toward the real purposes of design thinking: to ensure that effective and desirable products and services are developed based on an honest appraisal of the humans in the social systems of use. These are customers, end users, organizations, marketplaces. Evidence returns some power to the reality of current needs and functions, perhaps at the expense of conducting longer, more experimental projects. However, in domains such as healthcare, public service, and other high-risk applications, the need not just to manage costs but to employ systemic design to drive down costs while serving constituents is one of the main drivers inviting design practices into these otherwise forbidding arenas.

Most systems I’ve designed have included both types (or multiple methods) of design research. Certainly when working with development teams and product managers, the “harder” evidence – user data – is always more convincing than generative or conceptual design cases.

There’s been a long tradition of evidence-based design in healthcare, based in studies of environmental design and architecture in facilities and care practices. Its major proponents have been doing safety and systems-oriented research and intervention since the early 1980s, and if you search “healthcare design,” these are the precedents that show up. (See the venerable Center for Health Design.) CHD studies have made a huge difference in quality of care and patient safety over the last 30 years. Design enhancements such as in-room artwork and access to natural scenery and living plants have resulted in decreased lengths of stay, improved service experience, and softer outcomes such as lessened anxiety. We can measure these things and make a convincing case for expensive and significant facility changes (any change to the hospital environment is expensive, as it must be a durable and repeatable option for all patient rooms or locations). The now-current knowledge that “single patient rooms lead to better health outcomes” is both patient-centred and evidence-based. But hospitals would never have accepted the expense of essentially doubling the number of rooms based on patients preferring it. They do measure hard outcome data, and outcomes are a major design criterion.

Evidence is not necessarily a positivist position, even if the tradition of EBD tends to be so. Evidence is merely “based on data,” as opposed to expert judgment or collective agreement, which are interpretive modes. In fact, collecting interpretive data from users, rigorously, is evidence. Patient narratives are a type of evidence. If we don’t collect data, we run significant interpretive risks in making design decisions that affect safety, human welfare, and finances. So just as scientists argue about the meaning of data, so ought we.

Evidence and its alternatives are not an either/or proposition. In fact, there is no “or” to be found. There is little risk of epistemological contamination by adopting the value proposition for evidence in design.

In systemic design there needs to be a balance of methods and perspectives, as complex systems (at least) are many-sided and many-functioned operations which no one person can understand in whole. Every contribution to knowledge helps.

In healthcare, the trend that is balancing evidence-based care is patient-centred care. But very few organizations have produced meaningful approaches that all understand as patient-centred. There’s pretty good agreement around “levels of evidence” and research standards; there’s almost none for patient-centred care. The definition of PCC seems to be getting fuzzier, not clearer, as more stakeholders adopt a patient-centred view and are then stopped, perhaps, by the uncertainty of how best to implement the value in real care settings.

PCC is not patient experience or patient satisfaction, and PCC is interpreted very differently between clinical professions, and differently across institutions.

Are some hospitals advocating a trend “away” from evidence and toward “patient centricity” when they don’t agree on what that is? And when they get closer to it, PCC may tend to blow up the business model and workflows.

Unless design thinkers make a culture out of evidence, it will remain a complementary mode: driving research and helping designers make “unassailable” design proposals in complicated and risky situations.

If non-clinicians actually looked at how the evidence behind medical practice is treated, they’d realize that no expert “lets the evidence decide.”

Reliance on clearly established precedent and the “literature” is a starting point for clinical decisions: diagnostics, medications, and surgical therapies are complex decisions and require the best known answers before expert judgment is applied. The risks are too high not to. Yes, in hospitals residents execute much of this, and they don’t build long-lasting personal relationships. They are residents. But nurses, who have championed patient-centred care and tend to practice it philosophically even if it’s not standardized, demonstrate in many ways the affective and interpersonal qualities we associate with PCC.

Certainly clinicians who actually work in healthcare are not going to wish away evidence-supported decisions anytime soon. When we seek to deliver design value at organizational and social/policy levels, we’re dealing with high degrees of complexity and the difficulty of sustaining a presence long enough to make a difference. Gaining agreement on courses of action is critical in these domains. Evidence helps us build the case for stakeholder agreement, especially across strongly contested views and positions, where power is involved or people face possible losses.

But service design and whole-system design (integrated IT and process) require both evidence-based and “x-based” approaches. And I’d like to hear more about what those other “x’s” are, because I never saw a conflict between research-led design and exploratory design. They are usually different stages, but I will say that in corporate work I’ve found you rarely get paid to explore. In design school our students usually want to just explore and save evaluation for “later in the career.”

To make better cases for systemic work with mission-critical services and integrated systems, we need to move beyond our own prejudices about what these categories might mean. We have to read studies, and learn from scientific research and design research, from our peers and dialogues. And I would make a case for integrated methods and multiple perspectives.
