Creativity is Not a Scarce Commodity
(American Behavioral Scientist, forthcoming)
Howard S. Becker
Abstract. “Creativity”—an original action or object—occurs frequently. Usually praised in the abstract, it’s seldom recognized or rewarded in ordinary life and in the workings of organizations. Seen in organizational contexts, it can be understood as an interesting activity whose potential value organizational constraints make it impossible to recognize.
Artists prize creativity, art lovers seek to buy it or experience it, institutions work to foster it, business executives would like to organize and control it so that their enterprises can flourish.
Everyone is interested in creativity. Everyone supposes—it’s built into the idea—that creativity is scarce. It doesn’t come around every day. But, since it’s a good thing, it needs to be nourished and given special care, so that we can have more of it. Experts give seminars in how to do that. Consultants advise clients on the special methods needed to re-organize their enterprises so as to encourage creativity. Schools have special programs for the gifted and talented.
I’ll assert the opposite. Creativity occurs everywhere, all the time. You can’t go anywhere without tripping over it. To experience all the wonderful results it is supposed to produce we just need to get out of its way. And, as a corollary of that, I’ll assert something else that will require a little more explanation: that creativity is contextual.
What is “creativity,” exactly? We ordinarily call an action or an object creative when we think it’s unusual, different from what other people would do or produce given the same problem to solve. We also mean, though we usually state this thought less explicitly, that the idea is not foolish or silly or unworkable, at least not in principle. When ideas seem silly or foolish or unworkable to us, we don’t call them “creative.” We use some less approving word, like “dumb” or “weird” or “useless.”
That identifies the word “creative” as an honorific title, a classy way of saying that the object or action or idea is good in some important way other things more or less like them aren’t. Calling an idea or an object or a person “creative” means we think they are good in a way others, at least superficially similar, aren’t. Because we approve of things we call “creative” we treat them better than things we don’t think merit that title.
What is the creative way of being good that we so approve of? The dictionary defines “creativity” as originality, expressiveness, imaginativeness—all obviously good things anyone would want to be or want the objects they make or own, or the ideas they propose, to display. Such words imply that these things embody uncommon qualities. We don’t usually think of something produced routinely, every day, as original or special. When we use a word like “creative” to describe something rare, we imply that it is also valuable, both in the mundane financial sense and in the more abstract cultural sense.
But there’s more to it than that. The way people use these approving terms in ordinary life reveals some incongruities. We typically treat playing an instrument in a symphony orchestra as an extremely original and creative way of making a living, although many, if not most, symphony players testify that it is, in fact, extremely repetitive and boring work. How many times can you play the nine Beethoven symphonies “creatively”?
Improvisation, which many people hold up as the epitome of creativity, in fact consists (as Paul Berliner’s (1994) definitive analysis of jazz players’ solos demonstrates) in large part of knitting together small musical fragments the improvising player knows well and has practiced endlessly. The combinations may be new, and thus original and creative (though often enough they aren’t), but the fragments being so combined aren’t new at all. Rodin’s sculptural practice often included re-using the same parts—hands, feet, etc.—in a variety of pieces.
Many people will feel that such criticisms are carping, small details that don’t disturb the standard assumption that creativity is just what they think it is. They might say, if I can speak for them, “Well, okay, maybe the individual parts aren’t creative, but the combination, the putting together, is: combining elements that way, the way artists make collages of fragments they cut out of newspapers or magazines and surely did not make themselves.” The kinds of considerations I want to bring up now will, I hope, show that these objections are not important to the argument.
What I’ve said so far suggests that we are dealing here with yet another instance of the power of labeling, in which the way we talk about things gives them attributes which, viewed from another perspective, they might not have. When the way we name things isn’t automatically given in nature—and, of course, it never is—we know that we are discriminating among and classifying things we could have classified differently. Which suggests to a suspicious sociologist that the label of “creative” is being applied in what we can call, in another sense of the word, a discriminatory way to things that don’t differ much in their origins and intrinsic characteristics. How that happens and what consequences it has raise fundamental questions of social organization.
The idea of labeling has informed sociological thinking, under one name or another, since W.I. Thomas said that if we define situations as real they are real in their consequences. Sociologists have used the idea of labeling to analyze such varied phenomena as deviance, art, and science. They have argued about the idea and the phenomenon—about the problems of defining it, about its importance, about the relation between labels and reality, and similar problems—for many years.
Everett Hughes made an important addition to the idea in his essay on “Bastard Institutions.” He said that people (for which we can read, as is usual among students of Robert E. Park, organizations, institutions and other forms of collective enterprise as well as individuals) can “deviate” from some norm or standard in two directions, though we ordinarily only interest ourselves in one. In many of the situations in which labeling has been invoked as a mechanism, analysts have only been interested in negative deviation—behavior and results worse than what we expect and want, hope for, or insist on, and thus things to be avoided, controlled or, better yet, gotten rid of altogether. Hughes referred to this as deviation in the Devil’s direction. And he went on to remind his readers that deviation could go in the other direction as well, what he called the saintly direction. That is, people and organizations could act worse than we wanted them to, but they could also act better than we wanted them to.
He was quite serious when he said that people could be “better than we wanted them to be,” pointing out that saints often create problems for institutions which find themselves stuck with one. The Catholic Church has now and then thought saints problematic and even convicted some of them of heresy, changing its organizational mind later on. Joan of Arc, burned as an indigestible heretic by the Church in 1431, became a saint in 1920. Just so, as Hughes said, managers of all kinds of organizations find their subordinates adhering to rules and customary behavior in situations where their conformity creates difficulties that could have been avoided had they only recognized the importance of being a little flexible.
I bring Hughes’ distinction up because I think it likely that what we, from a different standpoint, might call creative often makes trouble by being “too” creative, too different, not easily assimilated by the organizational apparatus already in place to deal with the category its products belong to, and thus not entitled to such an honorific title as “creative.” Only a short distance separates “creative” from “pain in the ass.”
Contexts and the contexts of labeling
The acts and objects I have been talking about, and the definitions people attach to them—as better or worse than we would like them to be—don’t occur in a social vacuum. Not at all. It’s obvious that creativity occurs, rather, in a social context, and that a major feature of that context is the social ranking of the people making the innovative gestures and things which might under some circumstances be labeled “creative.”
How do we know that an act or object is creative? We compare it to other acts or objects “of the same kind” done or made by others. But classification is always and necessarily a socially defined and embedded act—not to be too fancy about it, we make classifications in the light of what they will let us do in the social organizations we work and live in. So to talk that way requires us to ask who makes judgments about what things and which people are creative, on what basis they make those judgments, and in what kinds of organizational contexts they make them. What reasoning do they employ? What arguments do they make? Who accepts these arguments, and on what basis? How do all these actors collaborate to produce the consequences of such classifications?
Who makes these judgments? First of all, judges who have been appointed, or who have appointed themselves, to do just that. In schools, teachers usually make the judgments. In worlds of art, it’s often critics or customers but also, more pervasively and constantly, colleagues. Things seem the same in worlds of science too, but science adds another judge, the external world in which events either do or don’t support, at least as conventionally accepted research instruments can reveal to those who interpret their results, the judgment colleagues will have to make.
Just as important, and in many ways most important for anyone who wants to promote creativity, people judge their own activities and productions as original, imaginative, and creative or, in the other direction, as stupid, silly and useless.
I said earlier that if we want to encourage creativity, we should just get out of its way. That was an ambiguous statement and I’d like to clear up some of the ambiguity, insisting that creativity is not rare at all. That becomes clear once we identify the obstacles, organizational and personal, that get in its way.
Is creativity really scarce?
Here is where the judgmental processes involved in labeling I invoked earlier do their work. Ordinary observation shows us that what is scarce is not the fact of creativity—of some kind of activity unlike what others have done before—but rather the activity of labeling something “creative.” If we look around us in the most ordinary situations of daily life we see people being creative—doing original things no one ever did quite that way before—all the time. Once we separate the originality of an idea or an action, as seen from an unbiased viewpoint, from the judgment others make of its originality and creativity, we can look for expressions of creativity everywhere. And we’ll find them.
Where should we look? In all the places where people do work that others, and especially those empowered to make consequential judgments, don’t evaluate highly, or where the people who do the work are themselves not highly valued, because they belong to racial, ethnic, class, gender or other groups the judges don’t value highly. Conventional judges, working in conventional organizations, may well classify whatever such workers do as ordinary, certainly not creative or original, because that entire category of work or, alternatively, any kind of work done by members of those social categories, conventionally falls into the category of “uninteresting” and therefore essentially incapable of generating creativity. If the problems those people deal with in their work aren’t “important,” no solution they create can deserve the label of “creative.”
Here are some obvious examples.
The home kitchen. Marjorie DeVault’s (1991) study of how housewives plan and make meals for their families shows these women finding unusual, imaginative and, yes, creative solutions to a complex problem of optimization: how to generate menus that satisfy the many conflicting requirements of their (home) work situations. They must satisfy their family members—ordinarily husband and children, though perhaps others—with all of their idiosyncratic likes and dislikes: this one won’t eat red meat, that one hates eggplant, a third doesn’t like any vegetables at all, and on through a potentially endless list. And they can’t just serve the same few things the family members do like over and over again, because that bores everyone. At the same time, they have to deal with two realities of a different order: what’s available in the markets at what price, and how much they have available to spend on meals. All these requirements and constraints are elastic, but not infinitely so, and it often takes great ingenuity to concoct a week of meals that satisfies all these requirements.
These housewife/problem solvers act, for the most part, alone. They have learned how to cook, in varying degrees of complexity and sophistication, from a variety of sources over the years, and they get more immediate help from cookbooks and television shows, newspaper food sections, and friends and relatives. But, in the end, it’s them in the kitchen with dinnertime looming and a menu to create. These cooks routinely solve their problems—what other choice do they have?—and we shouldn’t be surprised that they often do that in ways most people would agree are creative. Not all of them, of course, and not all the time. But that’s also true of the artists and scientists who serve as our implicit models of creativity. Though it’s not often acknowledged, they too produce routine stuff that’s good enough for the circumstances, but not necessarily original.
The people who eat these women’s meals—the family for whom the meals were prepared or occasional guests—deliver judgments, of course, that help shape the cook’s later efforts. But no one gives Nobel prizes in family feeding, no matter how original or creative the cooks’ solutions to these problems. No one gives “genius awards” to these inventors. Not even James Beard Awards for creative cookery. Their creativity goes unremarked and does not provide the subject matter for studies in the field (although culinary critics of course will treat similar experiments by well-known chefs with awe and reverence). Conventional thinking does not imagine that women who aren’t specially trained and educated can be creative, and some people still think that women are simply, perhaps genetically, incapable of the kind of unusual thinking that merits the word “creative.”
Industrial settings. Industry and business often, in contrast to examples from the home, serve as the setting for searches for and possible discoveries of creativity. But the examples I’m going to describe don’t show gifted engineers and managers concocting ingenious solutions to problems identified by company executives. They demonstrate, rather, how workers lower in the organization—members of skilled trades, even workers less skilled than that—apply their ingenuity to the problem of dealing with their bosses, who want to get more work done while paying less for it (from a management point of view that might be called solving what managers see as a “productivity problem”).
Donald Roy (1952, 1953, 1954) studied a WWII factory, whose bosses sent engineers to time how long skilled machine operators took to do certain operations, and then to create ways of “rationalizing” those actions so as to get more work done in the same time without paying the workers more. Workers ingeniously created collective strategies to counter those efforts. But no management expert would ever describe their inventions as “creative,” though they were often original and unexpected combinations of known techniques and new possibilities, resulting in inventive operations that let workers give the appearance of doing what was wanted while preserving their ability to earn what they had decided for themselves was a “fair wage.”
The apparent incongruity of seeing the machine operators’ inventiveness as creative suggests that organizations reserve the possibility, let alone the right, of being original and creative for occupants of positions above a certain status level. They won’t say that in any of the documents defining the rights and responsibilities of different organizational ranks, but that unstated assumption is usually there. Machine operators did not solve problems worth solving in the way their supervisors and more highly-placed executives did. None of the latter—who controlled assignment of the prestigious label of “creative” or its synonyms—would see anything the workers did as “creative,” no matter how original it was. At best, I suppose, they might see that the workers’ solution was ingenious, but a solution to a problem not worth solving, if not a problem in itself.
So only the right kind of person, defined by the environing organization as the kind of person who can be creative, can find creative solutions, and the solution has to be to a problem seen by the people who run the organization as worth solving. But that oversimplifies what is going on. If “creative” things can only be done by someone of the appropriate rank, any really inventive action has to be ignored, downgraded, or stolen, and only then identified as “creative.”
There’s an interesting parallel here to sumptuary laws, which regulated what kinds of clothing, made of what materials, people of different social orders could wear. In this case we have informal codes regulating who can possibly be creative and about which things they can do that.
Furthermore, only certain problems can be officially recognized as existing and needing a solution. It’s not only people at the bottom of the organizational hierarchy who see and develop ingenious solutions to problems not officially defined as problematic by top management. Melville Dalton (1959, 194-217) made an insufficiently recognized theoretical breakthrough when he identified what managers ordinarily defined as “employee theft” as an informal reward system, which he explained like this. Suppose a company vice-president wants someone to do something that he can’t reasonably or legally ask them to do: to build, for example, on company time and with company tools and materials, a large birdhouse at his home (one of Dalton’s gaudier examples). He can’t order a company carpenter to do that and he can’t pay him with company funds. But he can let him, by turning a blind eye, “steal” something of considerable value, say a pile of expensive lumber, in payment. Are they stealing? Dalton explains that, yes, strictly speaking, they’re stealing, but actually the payoff is a reward for a “favor.” When they take things they aren’t entitled to it’s generally in return for some favor like that. In one of his more illuminating cases, he describes a foreman who shows such promise that his bosses would like to promote him to a management position. But in that organization, in that day and age, there’s a problem. All the managers are Protestant and this promotable foreman is Catholic, and the Protestant managers aren’t sure they can trust a Catholic. But, if he changed his religion, they would feel he was more trustworthy. Well, you can’t officially require someone to change his religion to accommodate his superiors’ prejudices. But you can make it clear that, if he were to do that, he’d be entitled to certain things he’s not officially entitled to. He’d be rewarded for his flexibility in this matter, in ways that leave no bureaucratic record.
The people involved in such negotiations make real contributions, one way and another, to the smooth running of the organization, and they invent methods and schemes we can reasonably call creative. We could go further and consider, more generally, that every organization has to solve problems it can’t admit it has, and that no one in the organization will be able to recognize these creative solutions because doing that would require admitting that there had been a problem to solve.
Many of the management follies of recent years consist of similarly ingenious inventions, which would surely never have been allowed if someone of lower rank in the company had proposed them but which were not only allowed but even defined as highly creative contributions when the innovative financial schemes came from the CEO and other high officers of the company. At the time, Enron executives, among others, were praised for inventions which solved immediate financial and managerial problems but eventually led to the company’s ruin and prison sentences for the inventors (McLean and Elkind, 2004).
So creativity seems scarce because, even when it’s there to be seen, no one can allow themselves to recognize it officially. The managers who gave informal rewards to workers in the businesses Dalton studied knew very well what they were doing, but couldn’t say so in any official way. Since what they were rewarding didn’t exist officially, they could hardly recognize it as a form of inventiveness.
But there’s another set of reasons why creativity, though discoverable everywhere, seems scarce. And that’s because it is scarce. We can distinguish two kinds of creativity here. One kind consists of an original idea or scheme, something someone thinks up that others haven’t already thought up. The other kind is not only dreamed up but then pursued, turned from an idea into a plan: a finished prototype, an idea fleshed out into a working plan which is then built, an organizational invention which people in the organization actually implement, an artist’s vision that becomes a finished work.
Creativity is scarce because of censorship. Not in the usual sense of that word—the cops closing down a strip show, or some government official collecting and burning politically offensive books—but rather in the sense of “discouragement,” of telling those who have creative ideas that the ideas aren’t really interesting, that they aren’t sensible, that they’re a little (or more than a little) crazy, and suggesting that it would be better just to forget about them. “Nice try, but no cigar,” captures this response to unusual ideas.
All the reasons for ignoring unusual solutions to problems operate because they are based in organizational realities. “It’s not practical,” often given as a reason for not taking up some original or unusual way of doing things, means that the idea will run afoul of organizational agreements and solutions that satisfy the welter of interests the organization’s activity must satisfy. Where the “must” comes from is always an important question, one that it would be fruitful to pursue.
Creativity is also, and perhaps even more pervasively, scarce because of self-censorship. In many situations, people with creative or original ideas say to themselves, “This is not worth pursuing any further, no one else will be interested, I’m just wasting my time.”
“No one else will be interested.” That may or may not be true. A lack of interest may not mean that no one thinks the idea has intrinsic merit, but rather that no one thinks the merit sufficient to justify the trouble or the investment necessary to push the idea, to work on it and get it beyond the idea stage—to move it from a blip in someone’s brain to a finished product.
“I’m interested, but I’m afraid it might not work out, so why waste time?” Here too, the imagery of investment takes over and, calculating the possible profit and the possible loss, the innovator tells himself “The hell with it.”
I was in Rio de Janeiro during the Brazilian military dictatorship in the 1970s and a friend took me to meet one of his friends, a stage designer. The designer said he didn’t do anything remotely political any more because, as I remember his remarks, “What’s the point? You put a lot of work in on it and then the Censura closes the show the night before it opens.” This common response to censorship does the really heavy and brutally effective work of cutting off the flow of original ideas before they see daylight.
These negotiations with yourself take less time than a breath. As I watch myself, and as I listen to others describe their experiences when I push them about such things, it costs just about zero to say “No” to your own mind when it pops out an unusual idea. For many people, it’s second nature.
Of course, not all the ideas your mind proposes to you glow with originality. Many consist of commonplace fantasies about one thing or another that we know (or think we know) have no chance at all of ever becoming real; they’re just daydreams, and that’s it.
But some might have a glimmer of something different, something that I could do, maybe, something that at least might warrant a little further exploration. Another ten seconds, another minute, might suggest some ways of actualizing this particular dream. These stray ideas are not ideas for perpetual motion machines, or similar chimeras (“fanciful mental illusions”); they are ideas that aren’t so easily dismissed. I don’t want to get moralistic and say that they deserve a chance to shine, a moment in the sun, just that, if you thought about them a little longer, turned them over and looked at them from several angles, there might be something there that you and others would find useful, interesting, valuable, even profitable.
For organizations, the lesson is even simpler, but harder to implement: encourage people to let their minds wander a little. We usually call this “thinking outside the box,” and everyone knows it’s a good thing to do. But few people do it. The price is too high. I will put this very simply. Organizations reject new ideas because their novelty runs afoul of the way things are done. Things are done the way they are done for good reason: just as a housewife’s dinner menus take into account her family’s idiosyncratic demands and her financial resources, an organization’s solutions to its problems must take into account the accumulated compromises and prior solutions to still other problems that have solidified into “how we do things here.” Any change will likely upset some of those, and that’s usually enough reason not to do it. Dickens immortalized this organizational failing in his portrait, in Little Dorrit, of The Circumlocution Office, a government office whose expertise lay in “the art of perceiving—HOW NOT TO DO IT.”
I had my own experience with this when my colleagues and I wrote a book embodying our findings from three years of research on the experiences of medical students. When we met with the eight members of the school’s faculty who were interested in our results they had one big complaint: where were our recommendations? They had gotten used to researchers whose final report told them how to make their school better. I said that, not being in the business of educating medical students myself, I didn’t have anything special that I wanted the school to accomplish, so had no recommendations. But, I said, if you tell me what you’d like to change I think I can tell you how to change it. After giving the point some thought, someone said he was upset because, as we described and as he knew, students typically studied for examinations by memorizing a lot of stuff, regurgitating (their word) it on the test, and then forgetting it. I said that probably was a rational response to the kind of written tests they gave, and asked what he wanted students to know. “To interview a patient and take a medical history, perform a thorough physical examination, order appropriate laboratory tests, make a diagnosis, and work out a plan of treatment.” I immediately said I had the solution to their problem: have each student do all that with two randomly assigned patients and then go over the patients and check the students’ work. They all looked uncomfortable and one finally said that that wasn’t practical. Why not? “Well,” they said, “that would take a lot of time.” I agreed that it would, but said it would accomplish what they wanted to accomplish. They persisted, explaining that they, after all, had their own patients to take care of, their research to do, their administrative duties in the hospital and school.
They made it clear, though they wouldn’t have accepted my language, that what they wanted was a panacea, something that would get rid of everything they didn’t like without changing anything they did like. And they liked a lot of things, more than they disliked the other things, so nothing was going to change as a result of our research.
That’s unfortunately the fate of most social science discoveries about creativity. Still, we have to recognize that innovation does occur in organizations, that people do come up with wonderfully original objects and ideas in spite of all the obstacles to that I’ve mentioned.
How do they do that? Easy to say, hard to implement. For individuals it’s not hard to see how to do it. You give up some or all of the rewards available to people who do what is currently being rewarded. Many artists have done that and many more do it all the time. Some of them succeed in having their creativity recognized by conventional institutions: the big theater in town does a very original play by an unknown playwright, the opera does a new and unusual work making great demands on everyone involved. Most of them don’t succeed, but the arts are organized in a way that allows a few things that are “different” to get through.
Large organizations have less success, though here and there inspiring fairy tales appear. My favorite is the man who established a pirate enterprise on the campus of the company he had founded, dedicated to making a totally impractical personal computer that the executives of his own company had rejected. Steve Jobs and the Macintosh, a story too well known to need retelling, but one whose lesson is hard to accept.
Berliner, Paul F. Thinking About Jazz: The Infinite Art of Improvisation. Chicago: University of Chicago Press, 1994.
Dalton, Melville. Men Who Manage. New York: Wiley, 1959.
DeVault, Marjorie L. Feeding the Family: The Social Organization of Caring as Gendered Work. Chicago: University of Chicago Press, 1991.
McLean, Bethany, and Peter Elkind. The Smartest Guys in the Room: The Amazing Rise and Scandalous Fall of Enron. New York: Portfolio Trade, 2004.
Roy, Donald. “Quota Restriction and Goldbricking in a Machine Shop.” American Journal of Sociology 57 (1952): 425-42.
Roy, Donald. “Work Satisfaction and Social Reward in Quota Achievement.” American Sociological Review 18 (1953): 507-14.
Roy, Donald. “Efficiency and the ‘Fix’: Informal Intergroup Relations in a Piecework Machine Shop.” American Journal of Sociology 60 (1954): 255-66.
Howard S. Becker lives and works in San Francisco and Paris. His most recent book is Evidence (University of Chicago Press, 2017).