To start, let me admit up front that my title is a bit of a bait and switch—I’m not really going to talk about doing history without any humans whatsoever, though I think I can at least point the way towards something like that. Remember Eric Wolf’s book back in the 1980s, Europe and the People Without History? Great book, but I’m waiting for someone to write, Europe and Its History Without People. I’m not entirely sure yet what that would look like, but who wouldn’t want to read a book that doesn’t have any Europeans in it? It would almost be as good as a book that doesn’t have any Americans in it!
I jest, of course, and let me just hasten to put on the record here that, no matter what I am about to say, I myself hold humans in the highest regard. I can honestly say that some of my best friends are human. But when I propose to write history without humans, I really mean history without “humans” as they’ve usually been understood by “humanists” and a lot of other folks since at least the 19th century. Which is to say, humans as creatures above all else of mind and culture, of self-invention, the übermenschlich Nietzschean type of humans who have transcended the mere material world, the anthropocentric humans, the “what a piece of work is a man, how noble in reason, how like an angel” humans. Put simply, the itch I propose to scratch is this: Can we write more histories in which our human stars, whose Broadway run has now stretched into several thousand years, can step back to let their supporting cast take center stage? And even if we can, would it be wise or worthwhile to do so?
My answer to both of these questions is an emphatic . . . maybe. I have a lot of reasons for answering that way—so many that I’ve done what humanists tend to do and written them down in a book. It’s called The Matter of History, and it’s hopefully coming soonish to a fabric-draped folding table near you. Please buy it.
Today I’m going to talk about two of those reasons in particular that offer some support for writing non-anthropocentric histories. They’re not necessarily the best reasons—you have to read the book to get those—but they are the ones I thought I could squeeze into a 20-minute talk. They are:
First: Does a New Human Demand a New Humanism?
Second: Have We Underestimated the Creativity of the Non-Human World?
I know—those are just very obviously leading questions, not “reasons,” but I’ll bet you can guess what my answers are going to be, so it’s close to the same thing.
Let’s start with:
Number One: Does a New Human Demand a New Humanism?
That being human is not everything it was once cracked up to be is pretty obvious. Seems like every time I turn around, which is quite often, folks in both the sciences and the humanities are saying that we have to rethink what we mean by the word “human.” When the very-big-science Human Microbiome Project revealed that 90% of the cells in our bodies are not human but bacterial, most of them living in our guts, the director of the National Institutes of Health famously observed that “We are more microbial than human.” It turns out that the 90% figure was almost certainly off the mark. But let us not quibble: there are a disgustingly huge number of bacterial and viral cells living on and in us. Moreover, they don't just help us digest our food; they also shape how we feel and think. Most of your serotonin, roughly 95% of this wonderful “happy hormone,” is manufactured by your gut bacteria. No wonder, then, that one researcher concluded, “much of what makes us human . . . depends on [the] metabolic activity of these microbes.”
So, if your microbiome doesn’t like my paper today, that’s hardly my fault. We know who’s really to blame—and the funny thing is that it may not be either “me” or “you.”
Or take the revolution in epigenetics. Now I was taught in college that my genome is impervious to most environmental influences during my lifetime—an immaculate conception, if you will, and there was no greater biological sin than Lamarckism. But now the new epigenetic theory tells us that environmental factors can actually turn parts of our genetic code off and on, rather like a microscopic genetic light switch that influences both the biology and the behavior of living organisms in “real time.” More controversially, some evidence suggests these epigenetic influences might be heritable. One study in mice found that extreme environmental stress experienced by a parent caused epigenetic changes that could be passed on to their children and even grandchildren, who ended up with unusually high levels of behaviors that looked very much like what we would call depression. I know I’m not supposed to call mice babies “children”—but, based on these mice studies, some researchers have made an admittedly big but not unreasonable leap to argue that similar epigenetic mechanisms may help explain the well-known observation that the children and grandchildren of Holocaust survivors often suffer from unusually high levels of depression. Such trans-species speculations aside, one geneticist recently observed that we are now beginning to understand “how the environment gets under the skin to affect gene expression, and consequently, neural activity and behavior.”
Or consider the recent shifts in cognitive theory and linguistic theory that have, once and hopefully for all, banished the old idea that an abstract human mind is clearly separated from its body and the environment. Increasingly, it seems apparent that our thought and language are much more closely tied to our embodied sensory experience of the material world than we had imagined—certainly WAY more than Saussure and Baudrillard had, or even could have, imagined—may they rest in peace. Humans certainly do “culturally construct” the world around them, but they do so first as embodied sensory creatures whose language and thinking are embedded in the very world that they construct. The consequences, in the words of the cognitive linguist George Lakoff, are profound: “As a society,” Lakoff argues, “we have to rethink what it fundamentally means to be human.”
I could go on, but my point is there’s a whole lot of rethinking going on these days about what it means to be human, from a whole lot of different fields. And the gist of it seems to be that humans, past and present, are much more embedded in our material environment than even environmental historians had previously realized. Not just in terms of health and the physiological body—topics environmental historians have brilliantly developed—but also even in the ways we feel, think, and manipulate abstract ideas. If we take these insights seriously, we don’t think about the material world so much as we think with and through that world—in some cases, a world that is literally inside of us. Whoever or whatever is doing the thinking here has already been so deeply shaped by their environment that it gets pretty hard to tell the difference.
So—it seems fair to ask, what are we humanists to make of all this rethinking of our central subject of study?
But wait, there’s more. At roughly the same time we’ve been realizing how deeply entangled we are in our environments, we've also been discovering that the environment in which we’re entangled is much more dynamic and creative than we knew.
Which brings me to my second question/reason:
2) Have We Underestimated the Creativity of the Non-Human World?
Now, obviously we’ve known that the earth is creative for a long time: after all, it did, belatedly, create humans, even if it may now be having second thoughts. But what’s new here, what we’re finally starting to figure out, is how this creative process works. One of the great unanswered questions of Darwinian evolutionary theory was the source of biological innovation: Darwin explained the survival of the fittest, but he couldn’t account for the arrival of the fittest. How did life manage to create novel proteins, enzymes, and other stunningly improbable molecular forms that helped species to survive and evolve? Consider this: there are more potential ways to put together the twenty key amino acids that make up living things than there are hydrogen atoms in the entire universe—that number is not just astronomical, it is hyperastronomical. Even over the course of millions of years, the odds that useful proteins would emerge randomly would be vanishingly small. For a long time, evolutionary biologists largely ignored this problem. But now a new breed of evolutionary developmental biologists—who get to go by the cool seventies-sounding name Evo-Devo—are starting to reveal the ways in which life can invent biological breakthroughs much more quickly and efficiently than we knew. Making an analogy to the American television show Star Trek, the Evo-Devo biologist Andreas Wagner explains that life has a sort of “warp drive,” mechanisms for accelerating the speed at which organisms can create and try out new proteins and other biological molecules and metabolisms. In his recent book, The Arrival of the Fittest, Wagner refers to this as nature’s “innovability”—its innovative powers. His words, not mine.
Now, the point of this is not that humans are suddenly going to start evolving more rapidly—as much as it looks like we may need to, humans are evolutionary dawdlers, backbenchers who rarely get to play with the big kids. But, thank our lucky stars, evolutionary innovation can and does occur with astonishing speed in many of the simpler organisms with which humans have formed close partnerships. On a historical time scale, the sheer inventive power of the organic world affects humans most directly through legions of other more creative organisms, like the bacteria in our guts, the weevils that eat our crops, and the little white worms that make our silk, to mention only a few. Bacteria reproduce so quickly and have such efficient means of mixing genetic information that they can evolve useful new traits in mere days or even hours. Given this, we might think of humans as surrounded by—and colonized by—other simpler, but arguably far more biologically innovative creatures, the speedy architects of a sort of “alien technology” which has both benefited and harmed us. Either way, as Ed Russell has convincingly argued, this is often a process of historical co-evolution, in which both sides play a role in creating a novel outcome.
Nor is creativity solely a property of living things. Even relatively well-understood and seemingly predictable molecules can do surprising things when they interact with a big dynamic world—let’s take, just for the heck of it, molecules of a gas like carbon dioxide. Humanists and scientists alike tend to understand phenomena like global climate change as the unanticipated consequence of human actions—eco-system accidents, if you will. But what we call accidents can just as well be understood as evidence of a creative world that humans understand only vaguely and guide only imperfectly.
Technology, which we should of course understand as nothing more than repurposed nature, demonstrates similar innovative powers. As the historian of technology W. Brian Arthur puts it, “Technology builds itself organically from itself” as “it is a fluid thing, dynamic, alive, highly configurable, and highly changeable over time.” Humans are certainly the keystone species in the evolution of these technological ecosystems, yet our contribution often consists of combining existing technologies and things in novel ways. Light bulbs popping in our heads notwithstanding, neither Edison nor anyone else conjured up new inventions solely out of their own febrile imaginations. Because our inventions are never truly just “ours,” because they remain more wild than tame, they always have a creative potential that we exploit more than make.
Again, I could go on, but I’m running out of time. Given all this, I think there’s a pretty good case to be made that we live in a world not of passive raw materials, compliant technologies, domesticated organisms, or even “eco-system services,” to use that most appalling of modernist scientific phrases—but rather in a dynamic world full of innovative, creative, and even dangerous non-human things, both biotic and abiotic, natural and anthropogenic.
To move towards some sort of conclusion, maybe you find some of these points convincing, maybe you don’t—it’s a bit much to cover in 20 minutes. But just for the sake of argument, let’s pretend that both of these ideas are to at least some degree correct: in other words, that humans are more deeply embedded in their environments than we knew, and that these environments are far more creative than we knew. So what kind of history would we end up writing if we put those two insights together?
It would, I suspect, be first of all a far less anthropocentric history, a history where we begin by looking for the historical signs of a creative and powerful material world of organisms, chemicals, technologies, buildings, and other things that also have a history, even if that history emerges most clearly only in their relationships with humans. Second, it would be a history in which we would frame human creativity, intelligence, power, and culture as emerging from our interactions with a creative world, including interactions that extend all the way inside the bodies, brains, and minds of past peoples. We are creative because our world is creative.
To conclude, there surely is some cosmic irony in that at precisely the same moment we are learning so much that argues against an anthropocentric view of human history, the suggestion has been made that we should name the modern geological epoch after those very same humans—the Anthropocene. I understand the utility of the term, and I don’t question the good motives of those who advocate it. But if even only a fraction of what I have suggested here today is correct, it seems evident that the term conveys precisely the wrong message at precisely the wrong time, reinforcing the almost reflexive human tendency to anthropocentrism, just when we most desperately need to move away from it. Because I would submit we have not, cannot, and will not, ever live in an Age of Humans. Rather, we live in an age of coal and steel, of oil, we live in an age of cows, cotton and copper, an age of corn and rice, we live in an age of sulfur, of arsenic and asbestos, an age of diethylstilbestrol and bis-phenols, we live in an age of hard concretes and soft plastics and sharp shiny aluminum, and an age of bright electric lights that erase the infinite stars from our eyes. This, the supposed age of humans, is all of these things and many more, but never just human, because these are the very things that have made us human, the “quintessence of dust” from which we emerged, only to turn too quickly inward to seek divinity in ourselves. No, if we want to see the true human, we must tear our gaze away from ourselves and look instead at the things that have made us. Because like a faint evening star that you see only out of the corner of your eye, we see the human star most clearly only when we look away.
 Peter Andrey Smith, “Can the Bacteria in Your Gut Explain Your Mood?” New York Times (June 23, 2015).
 Quoted in Smith, “Is your body mostly Microbes?”
 Quoted in Elizabeth Pennisi, “Bipolar drug turns foraging ants into scouts,” Science Magazine (31 December 2015). The original article is Daniel F. Simola, Riley J. Graham, et al., “Epigenetic (re)programming of caste-specific behavior in the ant Camponotus floridanus,” Science 351 (January 2016).
 Bergen, Louder Than Words.
 Wagner, Arrival of the Fittest, loc. 2836.
 Carl Zimmer, A Planet of Viruses (Chicago, 2012).
 Arthur, The Nature of Technology, 24, 88.