Matthew T Grant

Tall Guy. Glasses.

How Do YOU Measure the Impact of Design?

Five long years ago, I wrote a piece entitled, “Return on Creative.” The crux of that essay was that design was critical to business success and, naturally, that a clear understanding of business principles and a focus on creating value was critical to successful design.

This was part of a marketing campaign that we were running in order to position Aquent as the company that “got” both business AND design, making us the perfect choice for any organization looking for increased efficiency from creative execution (as we often called it). Of course, it also jibed with the growing (and still prevalent) trend amongst AIGA-istas and DMI-ers to insist that design deserved a “place at the table” – that is, the table where important business decisions are made.

This “place at the table” thinking has been questioned by folks like Michael Bierut and, more recently, Dan Saffer. Bierut sees it as symptomatic of an insecurity complex and insists that designers should focus on being good at design, not business. Saffer says that designers need allies at the table, but should relish their place away from it as outsiders who can “speak truth to power.” As high-falutin’ as that may sound, Saffer rightly emphasizes that, place at the table or not, designers need to be able to explain their work and decisions in business terms.

When a client or manager asks about the return on investing in “good” design, she wants to translate it into the language of profit and loss. Paying designers is an expense that she must weigh against other expenses and justify in terms of relative profitability. How do YOU handle this question? How do you measure the impact of DESIGN? Do you?

Or is that, ultimately, the wrong question?

Image Courtesy of Wessex Archaeology.

The Ecstasies of Metal

Learn from the mystics is my only advice. – Roxy Music [misheard]

A friend suggested that I write a review of the Opeth show I attended on Saturday, May 2, 2009. I find myself quite incapable of doing so because, frankly, I cannot judge their music objectively or provide an accurate recounting of their performance.

This inability stems from the fact that my experience of Opeth was not primarily aesthetic in nature. Rather, as has been the case with the best metal shows I have attended, my experience in the presence of these masters of the art tended more towards the mystical/ecstatic realms of human consciousness.

Indeed, my most immediate memory of the show finds me in a state of frenetic, possessed movement accompanied by an ego-annihilating oceanic feeling. I give Opeth credit for inducing this state, a thing they accomplished via a sometimes subtly, sometimes savagely evolving rhythmic intensity coupled with serpentine melodies, strange words, and the trance-inducing repetition of droney, modal patterns.

Through its deliberate and complex structures – not to mention the aggressive amplification of sound and hypnotic manipulation of light – Opeth’s music invited the listener to become lost in its labyrinth.

However, it was not an all-devouring minotaur that awaited us at the center of these intricate and winding passages. It was, instead, a refreshing, liberating, and, dare I say, “communal,” transcendence.

For all who seek the fortuitous and often unexpected profane illumination sometimes afforded by the marriage of technology and spirit in this post-everything age, I recommend that you seek out Opeth and especially the public display of their conjurings.

Image Courtesy of deep_schismic.

The Irony of Authenticity and the Authenticity of Irony

Seems like nowadays, authentic is the thing to be.

Mitch Joel calls authenticity “the cost of admission” in the Web 2.0 world, though he warns: “Being authentic isn’t always good. Let me correct that, being authentic is always good, but the output of being authentic [i.e., revealing your flaws, shortcomings, and “warts” – Matt] is sometimes pretty ugly.”

HubSpot TV summed up the “marketing takeaway” of a notorious scandal involving a company paying for positive online reviews this way: “Be authentic. If not, you will get caught.”

When CC Chapman was among the Twitterati recently profiled by the Boston Globe, one of his Facebook friends asked, “Ever wondered why you have such a following?” He responded, “I wonder it all the time actually. I asked once and the general theme in the answers was my honest approach between life, family and work when it came to sharing things.” To which another friend replied, “Exactly right CC. You don’t try to be someone you’re not. It’s that authenticity that attracts people.”

Among the first to identify this flight to authenticity were James H. Gilmore & B. Joseph Pine II, who wrote Authenticity: What Consumers Really Want (2007). What notably separates them from contemporary partisans of authenticity is that their take is tinged with irony, an irony most evident in their promise to define “how companies can render their offerings as ‘really real.’”

This irony is refreshing because invocations of authenticity regularly fail to acknowledge or appreciate what is inherently contradictory about the concept. Said failure begins with the mistaken equation of authenticity and honesty (see above). Honesty may be a characteristic of an individual, but it is not a characteristic of authenticity. For example, an authentically honest person is being “authentic” when she is being honest, but an authentically devious person is being just as authentic when he is lying.

Similarly, we don’t call a painting an “authentic Rembrandt” because it is honest; we call it authentic because it was really painted by Rembrandt, unlike the forgery which only looks like it was painted by him. In other words, we call it authentic because it is what it seems to be. Herein lies the essential contradiction of authenticity: Authenticity isn’t about being real; authenticity is about really being what you seem to be.

The centrality of “seeming” to authenticity becomes even more clear when we call a person “authentic.” Such a designation usually means, “the way this person acts transparently or guilelessly reflects who they really are.” Because our sense of their authenticity depends on an assessment of a person’s behavior, we should pay special attention to the fact that authenticity is performed; as paradoxical as it may sound, authenticity is an “act,” in the theatrical sense. (Which is why I always say, “Be yourself. It’s the perfect disguise.”)

The bigger problem, though, is that our notion of authenticity assumes we really know who someone is, and likewise the imperative to “be authentic” assumes we know who we really are.

Our identity, “who we really are,” is always contingent, provisional, and changing. It is an amalgam of who we want to be, who we mean to be, who we’re supposed to be, who we have to be, and who we are in spite of ourselves. Moreover, no matter how much we’d like to think so, we are not the authority on who we really are, since it includes much that cannot be known by us. Indeed, and again paradoxically, we can’t know anything about ourselves without assuming the perspective of another, that is, by identifying with someone else and precisely NOT being ourselves.

Just as one must consult an expert to determine the authenticity of a treasured heirloom – it can’t speak for itself – we can’t call ourselves “authentic”; that is for others to decide. At best, and this is the irony, we can always only strive to “seem” authentic. True authenticity calls for acknowledging that “who you are” is an open question and, moreover, a collaborative work in progress.

In the end, we must distance ourselves from our claims or pretensions to authenticity. We must call it into question and even suggest, especially to ourselves, that it may just be an ego-driven pose. (Hey, it just may be!) This distancing, implicitly critical and potentially mocking (or at least deprecating), is the classic stance of irony. And though the dodginess of irony (“did he mean that or didn’t he?”) seems to put it at a distinct remove from authenticity (“this is exactly what I think”), it actually mirrors the open-ended, unresolved, and ever-changing “dodginess” of reality itself.

Which is to say that irony, as a posture, an attitude, and as an approach, is more authentic (in the sense of “really being the way reality seems to be”) than honesty, sincerity, openness, or any of the other qualities that pass for such. The tragedy (or irony) is, however, that it will always seem less than authentic due to the all-too-human suspicion of ambiguity, indeterminacy, uncertainty, and, lest we forget, the wily intelligence native to irony and the ironist.

Image Courtesy of Mary Hockenbery.

More Thoughts on Design Thinking

Pull a thread on the Web and it unravels the universe. Having accidentally stumbled across the concept of “design thinking,” I found that there was a whole, thriving discourse on the subject. Who knew? I wrote a brief series of posts on my discoveries. This was the third and was originally published on March 14, 2007.

I’m a latecomer and a slow learner.

My thoughts on design thinking began as a reaction to something written by Dan Saffer of Adaptive Path. Little did I know as I was penning my post entitled, “Thinking about ‘Design Thinking,'” that that self-same Dan Saffer had written a post with the exact same title almost exactly two years ago! That article includes a helpful stab at defining the characteristics of design thinking, “if there is such a thing,” as he wrote way back then.

One characteristic is “Ideation and Prototyping” – “The way we find … solutions is through brainstorming and then, importantly, building models to test the solutions out.” Actually making things to see if they work or solve the problem at hand is key to designing anything – hence his lament as he sees design schools move to an overly conceptual notion of design thinking, one that neglects craft and making and, ultimately, produces designers that can’t.

Oddly enough, I found Saffer’s earlier post in a rather roundabout fashion. The first event in this twisted chain came in the form of an email from David Armano, whom I had name-checked in my previous post. He pointed me to a post on his blog concerning the evolution of creativity in a decidedly inter-disciplinary and multi-dimensional direction. As an example of someone who embodies this emergent creativity, Armano referred to the site of one Zachary Jean Paradis, who graduated from the Institute of Design at the Illinois Institute of Technology.

What did I find on Mr. Paradis’ blog? You guessed it, a long, thoughtful essay on none other than “Design Thinking.” In fact, it was via this essay that I “discovered” Mr. Saffer’s earlier thoughts and my own intellectual tardiness.

Before I leave the topic of design thinking and return once again to more familiar ground, like Second Life, I will mention what I found most illuminating about Paradis’ perspective. First, he conceives of design thinking as an approach to “developing new offerings” which should not, to Mr. Saffer’s point, be equated with “professional design as it is taught.”

Secondly, because this approach is “purposeful,” he sees it as inherently integrative. He writes, “When developing some new offering with a team, members share the common goal of producing something contextually relevant.” The complexity of product/offering development, and the fact that the process must result in something that works in the world and meets definable needs of end-users/consumers, imposes the dual need for multiple disciplinary perspectives and their successful integration.

Finally, and as he says, “most importantly,” design thinking provides guidelines for collaborative work rather than prescribing a specific process for executing it. This kind of collaboration requires individuals who possess “a certain breadth and depth of knowledge of complementary disciplines,” precisely the new kind of “Creative” David Armano describes on his blog. Paradis ends his essay by insisting that, “… organizations must begin to recognize that moderately deep breadth is as important if not more so than deep specialization in addressing complex problems.”

To bring things more or less full circle, I think it bears stating that only by doing work on a series of increasingly complex and diverse projects, and not through schooling of any sort, can one acquire this “moderately deep breadth.”

Image Courtesy of dbostrom.

Design Thinking and the Serendipitous Web

This was the second of a brief series of posts that I wrote on the subject of design thinking. It was originally published on March 9, 2007.

I had never really thought about “design thinking” until I read the blog post at Adaptive Path that led me to write my last post. The funny thing is that as I started to research the concept, I noticed that earlier that same day I had bookmarked, obviously without much thought, a blog called Design Thinking Digest, which is maintained by Chris Bernard, Microsoft User Experience Evangelist, and which I was introduced to via this post on David Armano’s blog.

As if it weren’t strange enough that the mighty and mysterious Web would bombard my subconscious with secret messages about “design thinking” so as to get me to write about it, today Bernard is blogging about the design approach of BMW’s Chris Bangle and, guess what? Mr. Bernard is very taken with the fact that when designing cars, Bangle focuses on “the doing.” He writes, “His teams get outside to look at the car, they craft and sculpt designs with their hands. They are constantly on the lookout for new ways that they can make things, they spend as much time thinking about not the actual creation but the TOOLS they use to create with too.”

That is, a critical component of true “design thinking” as practiced by a successful designer like Bangle and admired by an evangelizing software designer like Bernard is “doing” – getting your hands dirty, working with tools, making things. But that was, like, exactly the point I was “making” in my initial post on “design thinking”!!!

Is the Web reading my mind?

More frighteningly, is the Web writing my mind?

Thinking about ‘Design Thinking’

An article by Dan Saffer at Adaptive Path got me thinking about design thinking, which led to a series of posts on the subject. This post was first published on March 7, 2007.

I subscribe to the feed from Adaptive Path’s blog because, as they say here in Boston, the people who work there are “wicked smaht.” As a result, and thanks to the magic of RSS feeds, I spotted this impassioned plea from one of the Adaptive Pathers, Dan Saffer, for design schools to start teaching design again.

Saffer’s main complaint is that design schools have moved towards a curriculum centered around “design thinking” and away from a well-rounded, practical education focused on “thinking and making and doing.” In his view, the real work of design consists in the process of moving from concept to realization; stopping at the idea stage means you’ve only done the easy part. He writes, “Some notes on a whiteboard and a pretty concept movie or storyboard pales in comparison to the messy world of prototyping, development, and manufacturing,” and then puts a finer point on it by adding, “It’s harder to execute an idea than to have one…”

Having encountered this lament in one form or another many times – “No one understands good typography anymore,” “People try to design when they can’t even draw,” “They think the computer’s going to do it all for them,” etc. – that aspect of his argument wasn’t new. Rather, what drew my attention was the phrase “design thinking” and his characterization of it as “just thinking.”

Since I was pretty sure that it meant more than that, I did a little research and found a Business Week article from last October called, “The Talent Hunt,” which describes Mozilla turning to the folks at Stanford’s Hasso Plattner Institute of Design (aka, the “D-School”) in search of a strategy for expanding the adoption of Firefox. In light of Saffer’s comments, I was struck by the following sentences: “Business school students would have developed a single new product to sell. The D-schoolers aimed at creating a prototype with possible features that might appeal to consumers.” Likewise, in a lecture at MIT entitled “Innovation Through Design Thinking,” IDEO’s Tim Brown talks about the process they follow often involving “a hundred prototypes created quickly, both to test the design and to create stakeholders in the process.”

As I understand it, the “thought leaders” behind “design thinking” (you can find a good overview of them and their thoughts here on Luke Wroblewski’s site) advocate the application of design methods to problems of business strategy precisely because it places a heavy emphasis on prototyping and real-world pragmatics. If Saffer is correct that “design thinking” as taught in design schools is primarily about thinking, and not about making things and seeing if they work, then I would say the real problem is that they are not actually teaching “design thinking.”

But then again, I never attended design school. If you have, do you think that Saffer’s criticism rings true?

Image Courtesy of dsevilla.

Just a Moment

Went to see a jazz trio called “Fly” last night: Mark Turner (saxophone), Larry Grenadier (bass), Jeff Ballard (drums). Their performance reminded me how much I love improvised music played by intuitive and gifted people who know how to spontaneously combine harmonic complexity and dynamic subtlety with a searching and startling lyricism.

Just as we’re taught that a line contains an infinite series of points, music, for its part, shows us the infinite divisibility of time. The limits of this division are set, on the one hand, by the frequency of tonal or rhythmic variation attainable by the musicians and, on the other, by the patience, attentiveness, and perceptual acuity of the audience.

Events apparently never exhaust the between of instants, which always allows for ever more vanishingly brief happenings. By contrast, a moment is not a measure of time, but a state of consciousness. Music, like the music I heard that night, ebbs and crashes around this moment of awareness causing us to ask not how soon is now, but how long?

Image Courtesy of overdrive_cz.

Is 4-D the New 3-D? Thinking about Photosynth

One thing that irks me about the 3-D world is that it’s hard to find things in it. I’ve often been looking for my keys or a book or a CD and wished that I could just open up a search box, type in the object of my fruitless and frustrating search, and instantly locate the darn thing. The fact that 3-D spaces can be difficult to search visually is one thing that stands in the way of the 3-D desktop metaphor, IMHO.

Then I remembered Photosynth, a piece of software that allows you to make 3-D models of places from 2-D images which, thanks to the magic of tagging, come replete with a conveniently searchable 4th dimension (raising the question: Is information, and not time, the 4th dimension?).

I first wrote about Photosynth on Aquent’s Talent Blog in 2007. Here’s the original post:

Visual Information, Design, and the Future

A friend of mine passed this link along to me. It is a video of a software demo at the TED Conference back in March. The speaker is Blaise Aguera y Arcas, who was demoing two software packages – Seadragon, which is used to browse large amounts of visual data, and Photosynth, which organizes pictures into navigable, 3-D spaces.

This stuff really has to be seen to be believed. It represents the future of how we will interact with visual data and also highlights that we are already creating virtual models of the world we live in by uploading content to websites like Flickr. There is also a cool example of an explorable, high resolution advertisement for Honda. Imagine if a picture in a magazine contained the richness of data you could find on an entire website. Mind-boggling.

Microsoft acquired Seadragon back in February. Aguera y Arcas makes a funny comment about that when people start clapping at the amazing things he’s showing them. Have you ever attended a software demo where people burst into spontaneous applause?

Image Courtesy of Live Labs.

The Consolations of Conspiracy Theory

Ever since I realized that there was an “official story,” on the one hand, and a very complicated, to some degree unknowable, and to some degree intentionally obscured, reality on the other, I’ve been interested in conspiracy theories. From Holy Blood, Holy Grail to Loose Change, from occult Nazism to the reign of the reptoids, I’ve consistently been amused, amazed, and disturbed by the fantastic proliferation of alternative world histories and astonishing speculation about who’s really running things.

I’ve always tended to approach these theories with a Mulderian “I want to believe” attitude, but have also always been disappointed when I dug down into the details. While it may be true that I’ve never met a conspiracy theory I didn’t like, it’s also true that I’ve never met a conspiracy theory that wasn’t riddled with holes, hallucinations, and brain-rending leaps of (il)logic. Reading this stuff has frequently been edifying and even, in a strange way, inspiring, but it has never been convincing.

Although the truth is undoubtedly out there, conspiracy theories are not about the truth. Their primary purpose is to forge a semblance of order from the relentless rush and incomprehensible sweep of events on both the human and cosmic scale. Scientific discovery has unveiled a universe of overwhelming temporal and spatial vastness, the mass media continually inundate us with an unassimilable torrent of devastating reports from the bottomless well of human suffering, and the traditional (i.e., religious or mythical) filters no longer have the power to channel our experience into comforting or even remotely manageable frames of reference.

Still, the thought that our lives are “meaningless commas in the sentence of time,” or that we are blindly stumbling, ‘neath a protective veil of self-deception, through a labyrinthine vortex of genetically driven ego-trips and Nietzschean/Orwellian power-games devoid of exit or purpose, is for most of us unbearable. So we clutch at the straws offered by the conspiracy theorists (not to mention the good old news) because they tell us that, even if it is the Greys or the Illuminati or the CIA or the World Bank or the Council on Foreign Relations or the Trilateral Commission, or whatever, at least SOMEONE is in charge and everything is at least going according to SOME plan, as nefarious, diabolical, or alien as that plan may be.

The question is: Why are we consoled by the thought of SOMETHING, even SOMETHING MALEVOLENT, behind EVERYTHING?

Why, on the contrary, is the notion that all we are and experience arises from and inevitably returns to a primordial, entropic chaos – in other words, to nothingness – so difficult, even impossible, to accept let alone embrace?

Or is it?

Image Courtesy of Midnight-digital.

How Does Government Differ from Business?

As far as I can tell, the difference between Republicans and Democrats boils down to the following: Republicans think that government should be run by businessmen and Democrats think government should be run by lawyers.

I mentioned this once to a friend with Republican tendencies and she said, “That’s right. Government should be run like a business.”

My immediate response was, “But a government is not a business!” Which, of course, got me thinking about how governments and businesses differ.

For simplicity’s sake, I define a government as that organization responsible for establishing and maintaining order within set geographic borders, borders which it is also generally the responsibility of said organization to secure, if not necessarily establish.

By contrast, I define a business as a set of related processes which facilitate the delivery of a good or service within a larger macro-process of exchange which usually depends on a consensually accepted token of value (currency) and a set of rules enforced by a communal agency (which may be a mob or may be a government).

Now consider these definitions in light of Allen Weiss’ comment that most Web 2.0 “business geniuses” seem to ignore “what a business is supposed to do… namely, make a profit.” On the one hand, I find in this formulation one important differentiator between government and business: Making a profit does not enter into my or any definition of government or its purpose.

On the other hand, I must point out that I did not define business in terms of making a profit either. This was intentional because I do not believe that the purpose of any business is, in the first instance, to make a profit. Aside from delivering the good or service around which it is organized, the main purpose of any business is TO STAY IN BUSINESS. Making a profit may serve this end, but it is not an unqualified necessity.

Now, returning to the original question of government and business, would it be fair to say that the purpose of any government is to stay in power?

Image Courtesy of takomabibelot.