Christopher Columbus and the Poster Child

Just in time for Columbus Day, this arrived.

[image]

I had no idea what it was, but it was a big, smooshy package, so I opened it up. Inside I found this:

[image]

and this:

[image]

(note the misplaced apostrophe in “The Chinn’s”)

and this:

[image]

on the other side of which was this:

[image]

(why I’d want a wall calendar with my last name at the top isn’t clear)

and this:

[image]

and this:

[image]

and this:

[image]

and this:

[image]

and this:

[image]

All of this merch came courtesy of St. Joseph’s Indian School in Chamberlain, South Dakota, which serves the Lakota nation. Along with this immense pile of stuff was a letter that began “Dear Ms. Chinn. You could be a dreamcatcher.” It goes on to explain that “the Lakota (Sioux) believe good dreams and nightmares float in the air and a special willow frame strung with sinew can screen out nightmares…They call the ornament a ‘dreamcatcher’ and put one in every tipi and on the cradle board of every baby.”

First of all, you’d be hard-pressed in 2014 to find many Lakota babies attached to cradle boards, although some are handed down through generations and used for special occasions. Second, the dreamcatcher was not a Lakota custom until the Pan-Indian movement of the 1960s and ’70s; it originated among the Ojibwe and found its way to other nations through their political and intercultural connections. The kicker, though, is how this description ends: “Unfortunately, there aren’t many good dreams left for the Lakota people.” The letter goes on to characterize Lakota life on the reservation: “poverty, illiteracy and hopelessness — a nightmare fate that befalls so many Native Americans.” Finally, it asks readers to “help bring dreams of God’s hope to the Lakota children.”

Just in case that didn’t tug at your heartstrings enough, tucked in next to this typed letter is this:

[image]

a “handwritten” letter from “Josh Little Bear” (not, the note informs us, his real name, which has been “changed to protect his privacy”), addressed directly to me. Nestled between the grainy black and white photo of a Lakota child (is it “Josh” himself?) and a child’s drawing of a tipi surrounded by dark houses is “Josh”’s message to me. “Dear Ms. Chinn,” he says, “When I wake up in the morning, I thank God I’m at St. Joseph’s Indian School.” The cause of this thanksgiving is that St. Joseph’s has saved him from the danger of the reservation: “Sometimes my dad drinks and hits me. Not long ago my mom left me at my Grandma’s house and she said she didn’t want me anymore. She chose drugs over me.”

I don’t know whether the PR team that handles the St. Joseph’s account made up this letter — certainly, the fact that the handwriting is pretty clearly a computer font suggests that they did. What’s most striking to me is the implicit story this letter tells. Lakota children are without hope, victimized by drunk and violent fathers and junkie mothers, dumped onto grandmothers who can’t handle the responsibility. The reservation is a place of despair, and you, the prospective donor, can provide hope. “Josh Little Bear” is the poster child for Indian degradation.

Paul Longmore, the much-missed historian of (among other things) disability in the United States, had a trenchant analysis of the phenomenon of poster children like “Josh Little Bear” in the popular imagination. Associated with “innocent suffering,” poster children existed in a cultural and familial vacuum, “dependent objects of beneficence.” Dissociated from the possible taint of working-class parents and separated from any kind of community of other disabled people, the children who were trotted out every year on Jerry Lewis’s Muscular Dystrophy telethon or on Easter Seals posters were empty signifiers of need, filled with meaning by people willing to pony up the cash. More recently, as it’s become less and less acceptable to pimp out disabled kids as fundraising tools, organizations like Children International and ChildFund (formerly the Christian Children’s Fund) have used images of impoverished black and brown children to raise money through “sponsorship” of individual children.

What all these organizations have in common with St. Joseph’s Indian School is the narrative of rescue and salvation. All you have to do is call the 1-800 number, pull out a credit card, and save a child from poverty and/or disability and/or disease and/or illiteracy. Like their disabled counterparts, these children seem to live outside of any context: community, family, culture. They are atomized representations of degradation. Of course, this isn’t a new story: it’s the founding logic of much European colonization of the Americas, Africa, Asia, and other parts of the world. It finds its origin, at least in part, at the very beginnings of European explorations into the Americas, as Christopher Columbus found himself in the Bahamas in 1492. Wandering among the indigenous people there, Columbus remarks that as far as he can tell, the locals were “without any religion that could be discovered; they had never remarked the indians whom they kept on board the ships to be engaged in any sort of devotion of their own, but they would upon being directed, make the sign of the cross, and repeat the Salve and Ave Maria with the hands extended towards heaven.”

This passage raises a number of questions, not least of which is why these people were being “kept on board ships.” However, most striking is the assumption on the part of Columbus and his men that the people they meet are “without any religion” and that they can easily be taught the fundamentals of Christianity. Whatever religious beliefs the local Lucayan people may have had were unrecognizable to Columbus’s crew. Their family ties were deemed insignificant, so they could be kidnapped without compunction. At the very beginnings of what we now call American history, we see one of the main threads of European imaginings of indigenous people: blank slates on which white benefactors can write whatever story they like.

The story of the reservation as a place of despair has some validity: indigenous people are among the most impoverished in the United States. Their rates of substance abuse, domestic violence, and suicide are disproportionate to their numbers in the population. However, as Indian writers like Sherman Alexie, Louise Erdrich, Joy Harjo, Leslie Marmon Silko, Paula Gunn Allen, Chrystos, N. Scott Momaday, and many others have shown, reservations are people’s homes, where they grow up with deep roots, and where they navigate the many obstacles to health and safety that all poor, colonized people face. Alexie’s young adult novel The Absolutely True Diary of a Part-Time Indian follows Junior, a Spokane teenager whose life is much like “Josh Little Bear”’s: his father drinks too much, his sister runs away from home, family friends are killed in bar fights and drunken motorcycle accidents. But Junior’s feelings towards his family are tender, and even as he leaves the reservation, he’s fiercely loyal to it and the world in which he’s grown up. White people don’t rescue Junior: some of them help him, others victimize him, and still others view him as totally alien.

This is a story of indigenous life that St. Joseph’s School doesn’t just ignore but actively erases. Reservations can only be “nightmares”; Indian parents can be nothing but neglectful at best and abusive at worst. Indian children must be victims of a culture that aims to destroy them. After all, who better to blame for the destruction of indigenous cultures than native people themselves? With its dreamcatchers and address labels and wall calendars and notepads, St. Joseph’s is carrying on the work Columbus began, dressed up in 21st-century finery.


A (black) child is being beaten

This year I finally joined my family’s fantasy football league. I’m fairly indifferent to football — I’ll watch it if that’s what everyone else is doing, but during the two years I was teaching in Virginia I don’t think I watched one game of my own accord. But last year my father-in-law died, and so this year I took over his slot in the league. My first draft pick was Adrian Peterson, the Minnesota Vikings running back. I chose Peterson for several reasons. First, he’s probably the best running back in the NFL, and as a new team owner I got first pick in the draft. Second, my old friend Sarah Kelen is a Minneapolis native, and has inspired in me a love for all things Minnesotan, including the Vikings (this was during the early 1990s, the Warren Moon era, when the Vikings were a pretty great team).

Then came Peterson’s arrest for child abuse: beating his four-year-old son with a hand-fashioned switch. Peterson referred to his actions as a “whupping,” and his representatives invoked the image of the caring, if stern, father: his attorney, Rusty Hardin, explained that “Adrian is a loving father who used his judgment as a parent to discipline his son. He used the same kind of discipline with his child that he experienced as a child growing up in east Texas.” This explanation launched an intense and ardent debate on the internet about the meanings of “discipline.” Former NBA athlete Charles Barkley weighed in, arguing that corporal punishment is part and parcel of a black Southern childhood. In his words, “whipping — we do that all the time. Every black parent in the South is going to be in jail under those circumstances.” Miami Herald columnist Ana Veciana-Suarez likened Peterson’s actions towards his child to Ray Rice’s earlier violence against his then-girlfriend. And, most notably, former Vikings wide receiver Cris Carter spoke out passionately against Peterson’s actions, relegating them to a past era in which parents made mistakes that current parents should not repeat. “This is the 21st century; my mom was wrong… And I promise my kids I won’t teach that mess to them. You can’t beat a kid to make them do what you want them to do.”

Notably absent in all of this is Adrian Peterson’s son, the recipient of this violence (which we can imagine was not an isolated incident). His name is not mentioned in the news reports; he is described only in the context of the injuries he suffered. Given that he’s a preschooler, it’s hard to know how clearly he could articulate what his father’s violence meant to him, although the fact that doctors found defence wounds on his hands suggests that he resisted the beating. What’s most striking to me, though, is the idea, articulated by Barkley, Hardin, and other defenders of Peterson, that this kind of violence is a routine part of disciplining a child, an experience that all black children go through and, implicitly, emerge from unscathed.

We know, though, that black children are hardly untouched by violence that is visited on them or that they witness. Looking all the way back to 1845, we read Frederick Douglass’s account of his initiation through “the bloodstained gate” into what he called “the hell of slavery,” which came from watching the severe beating of an aunt when he was “quite a child” (certainly younger than seven or eight when he went to live in Baltimore, and most likely younger than the five or six years old at which children were put to work at simple tasks). US slavery was in effect organized around the institutionalized abuse of children. As Anna Mae Duane points out in her pioneering book on children and violence, Suffering Childhood in Early America, children were a significant population in the transatlantic slave trade from its beginnings. She cites the case of the 1734 journey of the slave ship Margarita, 87 percent of whose human cargo was under the age of 16, with an average age of thirteen. More chillingly, she notes that another slave ship, the Henrietta Marie, “had over eighty pairs of shackles designed expressly for children’s hands.”

With the abolition of the international slave trade in the US in the early years of the 19th century, slave children were a constant feature of the domestic trade: after all, the only way to increase slave owners’ stock was through reproduction. Abolitionists were particularly fascinated by the phenomenon of very light-skinned enslaved children, the obvious result of sexual violence against enslaved women.

As Duane shows, violence against black children was defined in contradictory ways. The first was the accusation against enslaved parents that they were so violent towards their children that slave owners had to rescue the children from “their excesses of cruelty or rage.” The second was the claim that enslaved people were themselves children, incapable of caring for themselves, let alone their own children. But by the mid-19th century there was an even more insidious narrative about violence towards children in slavery, one that is described by Robin Bernstein in her field-changing book, Racial Innocence. Bernstein demonstrates in precise detail how over the course of the nineteenth century, black children were increasingly defined as “insensate” — incapable of feeling physical or emotional pain. Even as (or perhaps because) white children became the symbol of innocence, black children were represented as pickaninnies, smiling and laughing in the face of sometimes mind-blowing levels of violence.

As Bernstein argues, US culture constructed black children as appropriate recipients of violence from their white counterparts: the manufacturers of black dolls presented hitting, throwing, and smashing them as part of appropriate play for white girls. Illustrations of girls at play showed them whipping and even lynching black dolls. This understanding of what black dolls were “for” extended well into the twentieth century. Bernstein convincingly revises our understanding of the Clark doll studies in this context: when black children were asked to “give me the doll you like to play with,” they were in effect given the choice between “a white doll that prompted cuddle play and a black doll that scripted play of violence and servitude.” And told to choose the doll they felt most similar to, these children were being asked to identify with an appropriate victim of abuse.

This history doesn’t simply disappear. To me, Charles Barkley’s rationalization that black children must be beaten to be disciplined echoes slavery- and Jim Crow-era beliefs that “pickaninnies” don’t feel pain like you and me, that they can and should be subjected to heightened levels of violence to make a dent in their behaviour. Of all the legacies of slavery, this is one of the most entrenched, and even romanticized (“my daddy whupped me and look how well I turned out”). As we are still reeling in the wake of the murders of Eric Garner and Michael Brown, we can see that the myth of the unfeeling, super-resilient black child is still doing its violent work.


Bending Towards Justice

A follow-up to my last post. I mentioned the situation of my son’s flag football team to a friend who’s a journalist. He offered to get in touch with a friend of his at DNAinfo.com, a site that specializes in local news. A couple of days later, a DNAinfo reporter called us for a phone interview and then, a couple of days after that, came to take a photo of my son and me. Here’s the article that resulted from that.

The next evening, we got a call from a reporter at the New York Daily News, who had read the DNAinfo article. He spoke to me, my partner, and our son. The next day, this article came out. A couple of days after that, a friend emailed me telling me that Keith Olbermann had mentioned the issue on his program on ESPN (here it is).

The upshot of all this was that the league changed the team name to the Bears (ironically, just that past week, Mike Ditka had come out defending Dan Snyder and the use of the team’s name). Not only were we thrilled that we wouldn’t have to cringe while cheering on our son, but we were also delighted that we could use this as an example of protest actually having material effects.

Then this morning, this arrived.

[image]

It was a letter from the Oneida Nation and the President of the National Congress of American Indians, accompanied by a gorgeous eagle feather decorated with ribbon. The letter thanked our son for “standing on the right side of history” in opposing the name “Redskins” for his flag football team. As they said at the end of the letter, “it is a tradition for Native Americans to give feathers to those who have shown kindness to another person and, today, we are giving this feather to you to symbolize our appreciation for your efforts.”

It’s hard to overstate how moved I was by this. The letter was eloquent and beautifully written, acknowledging the importance of political protest even on the small scale of youth flag football. In many political movements, it’s difficult to see any immediate result. I was involved in the movement against apartheid in South Africa in the mid-1980s, but to be honest, I couldn’t imagine the end of white supremacy, the release of Nelson Mandela from prison, and his election as President shortly thereafter. Similarly, it took years of activism to change the cultural violence of AIDSphobia, not to mention the development of meaningful treatment. The recent dust-up over Steven Salaita’s de-hiring at the University of Illinois at Urbana-Champaign has spurred a cascade of petitions and letters whose signatories have vowed not to engage with UIUC in any way. But we’re more than three weeks into this situation and all the UIUC administration has done is reaffirm and rationalize the original decision.

Activism requires a fractious and fragile balance between hope and resignation, imagination and pragmatism. We must believe that, in the words of the organizers of the World Social Forum, “another world is possible.” But that world seems so far away, so difficult to attain, that we also have to face facts: recognize that the people who stand to benefit most from our work are, more often than not, opposed to our tactics, and that direct action does not usually result in immediate social change.

So, this letter from indigenous people welcoming my son into the community I’ve long been a part of, a community organized around the struggle for justice for all, feels momentous to me. It reinforces my instinct that more than coalitions (a buzzword of the late 1980s and early 1990s), we need alliances, collaborations, interactions. And that ultimately the arc of history can bend — slowly, slowly — towards justice.


Playing Flag Football, Playing Indian

My 12-year-old son is an avid sports fan and hard-core athlete. This past spring he was on two basketball teams (one in Brooklyn and one in the Bronx) as well as a nationally ranked track team. But his great love is football. He’s in two fantasy football leagues and has played flag football for the past two years. And this is where my story begins.

Last Sunday we got an email from his flag football league, St. Francis Xavier (SFX). The league is sponsored by the NFL, so all the teams are named after NFL teams (although there aren’t enough kids to field an entire “league,” so not all the professional teams are represented). Last year and the year before, he was on the Broncos in the 10-12s age group. Now that he’s in the 12-14s, his name went back into the hopper to be assigned to a new team. The email was to let us know that he had been assigned to the Redskins.

As you might know, there’s been an ongoing campaign to have the Washington, DC franchise change their name. The National Congress of American Indians put together this ad to be aired during the 2014 Super Bowl, explaining why they believe the team should change its name from what is indisputably a racial slur. Washington owner Dan Snyder has argued that the name “represents honor, represents respect, represents pride,” and is part of a long tradition of respect for indigenous people. Indeed, when my partner wrote to the coach of our son’s team to protest the use of the franchise’s name, the coach echoed these sentiments, maintaining that “the team name has great positive connotations for many. They’re a venerable beloved franchise with an extraordinary history. We hope to teach the values of determination, courage, and honor associated with them.”

The term “tradition” comes up a lot in regard to this issue. And to a certain extent, Snyder and the flag football coach aren’t wrong. The use of images and names of indigenous Americans has a long tradition in US history. As Philip J. Deloria shows in his 1998 book Playing Indian, ever since the Boston Tea Party, white Americans have been appropriating indigeneity to define their own Americanness. By taking on Indian names and ersatz rituals, as various fraternal orders modeled on indigenous peoples did (most famously the Tammany Society, but also less well-known fraternities like the Order of Red Men as well as, much later on, the Boy Scouts), white American men could align themselves with the American landscape even as they attempted to eradicate indigenous people from it. Enacting Indianness, in Deloria’s words, “legitimated [white Americans] as aboriginal, and carved out a distinctive masculine identity for them that transferred the right of residence to them.”

Team names are only a part of this phenomenon. In fact, my kids already got a taste of playing Indian at their sleepaway camp this year. The bunks are named after Indian nations (Wichita, Pawnee, Chicopee, etc.), and their colour war competition is known as “tribals.” They don’t dress up in fake Indian clothes or enact “war dances” as campers in the 1950s and ’60s did, but the appropriation of indigenous names — almost wholly divorced from geographic specificity — is part and parcel of the camp experience.

The issue of the name of the Washington, DC team (and its flag football avatar) seems to me both qualitatively and quantitatively worse, though. After all, I can’t imagine that many people would find the use of the term “redskin” unproblematic in any context except football. We’re still figuring out how to respond to the coach’s specious response. Our son, by the way, is furious not just that he’s been assigned to this team, but that SFX would choose to include it in their roster, and he’s willing to protest against it. We’re thinking about our next move…


The Public (Anti-)Intellectual

By now, I’d imagine that you’ve read (or read about) Nicholas Kristof’s column on how academics just don’t know how to talk to regular folks. Kristof criticizes academics as out-of-touch, overly specialized bloviators who are more interested in quantitative analysis and technocratic jargon than in reaching out to the public. I don’t really want to do a point-by-point analysis, not least because my friend and CUNY colleague Corey Robin did such a terrific job a few days ago on his blog. Corey’s analysis gets to the heart of the issue: there are, in fact, first-rate writers putting out excellent analysis on-line, but they’re not neoliberal blowhards like Kristof who have the bully pulpit of the NYT.

Moreover, as several people have pointed out on Twitter, a major way that academics speak to “the public” is that we teach. I don’t know how many of my students read Nick Kristof on a regular basis, but thousands of students have studied with me over the 20 years that I’ve been teaching. And a significant part of my work is encouraging my students to think of themselves as intellectual actors rather than just recipients of knowledge.

But beyond the specifics of whether academics do or don’t have anything to say in the public square, I’m more interested in the theme itself, which seems to reappear every now and then. In a nutshell, it’s this: oh you eggheaded academics! Why can’t you talk to the common person about interesting things? This is hardly a new development. Richard Hofstadter wrote the groundbreaking Anti-Intellectualism in American Life in 1963, for God’s sake, and he traced complaints about people who think they’re smarter than everyone else back to the very beginnings of the American Republic.

I’d argue, though, that the culture’s highly ambivalent relationship to knowledge production is not just about intellectualism but about a horror of ambivalence itself. In his 1837 Phi Beta Kappa address at Harvard University, on the theme of “The American Scholar,” Ralph Waldo Emerson (no plain-spoken pragmatist himself) railed against scholasticism and leveled the same charge of monasticism that Kristof lobbed. “Instead of Man Thinking,” Emerson argued, “we have the bookworm. Hence, the book-learned class, who value books, as such; not as related to nature and the human constitution… Hence, the restorers of readings, the emendators, the bibliomaniacs of all degrees.” The problem was not just with bookwormdom, though — it was the feminizing of the American intellectual class, which abjured “the rough, spontaneous conversation of men.” Even as refined and complex a thinker as Emerson recognized that the nuance, specificity, and self-consciousness of responsible intellectual work conflicted with the increasing identification in the nineteenth century of masculinity with spontaneity, physicality, and certainty.

Anti-intellectualism, after all, is a refusal to dwell in the unknown and the unknowable; the hallmark of the most anti-intellectual US presidents, from Andrew Jackson to Teddy Roosevelt to Ronald Reagan to George W. Bush, has been declarativeness. Jackson’s famous (if possibly apocryphal) response to Supreme Court Chief Justice John Marshall’s rebuff of Indian removal – “John Marshall has made his decision; now let him enforce it” — is the archetypal argument of physicality against intellect. Marshall might be handing down decisions from the empyrean heights of the Supreme Court, Jackson reasoned, but he wouldn’t last a minute on the ground against the guns of the US Army, which, of course, carried out the brutal removal of the Cherokees and other indigenous people in violation of the Court’s mandate.

More recently, military violence has alternated with a kind of cultural violence to clear away any kind of doubt or questioning of public policy. Reagan’s invocation of “welfare queens” in his 1976 presidential campaign wasn’t just deceptive (although it certainly was that as well). Rather, it was Jacksonian in its certainty and its willingness to victimize a population in order to achieve a political goal. And W’s most characteristic bon mots about the “axis of evil,” “mission accomplished,” and “either you’re with us or you’re with the terrorists” borrow directly from Reagan’s playbook.

In this context, it’s hardly surprising that the New York Times, home of such czars of certainty as Kristof and Thomas Friedman, didn’t run an obituary of one of the finest public intellectuals of the post-1945 era, Stuart Hall. While news outlets such as Al-Jazeera, The Guardian, NPR, and even that bastion of Thatcherism, The Telegraph, issued lengthy analyses of Hall’s impact on not just British intellectual life but leftism, anti-colonialism, and liberation movements worldwide, the Times was conspicuously silent. Hall’s legacy, to my mind, is not just his immense contribution to analyses of culture, media, and representation, or his commitment to public intellectual discourse, characterized by his work with the Open University. What I found most remarkable about Hall, and most impressive, was his willingness to change his mind, to listen to criticism, to open up conversations in new directions. That is, his openness to not being right all the time.

One of my favourite essays by Hall is “Cultural Studies and Its Theoretical Legacies” (not the grabbiest title, I’ll admit). Throughout, Hall argues that cultural studies was above all a process of learning, change, and contestation. As far as Hall was concerned, cultural studies’ best work was done when the people who thought they knew what they were doing were proven wrong — by feminism, by postcolonialism, by analyses of race. In his words,
“the so called unfolding of cultural studies was interrupted by a break, by real ruptures, by exterior forces; the interruption, as it were, of new ideas, which decentered what looked like the accumulating practice of the work.”

For Stuart Hall, the urgency of action could not, should not trump the centrality of critique, ambivalence, uncertainty. This is the exact opposite of the Kristof method, in which rescuing “victims” is more important than understanding the processes of victimization, in which intellectual care plays second fiddle to unreflective pronunciamentos. Perhaps if the Times had printed an obit for Hall, they might have included his analysis of what it means to be a public intellectual engaged with real political struggle: “The notion of a political practice where criticism is postponed until the day after the barricades precisely defines the politics which I always refused.”

 


Lavender Lake redux

In the wake of the disastrous chemical spill in West Virginia and the crazy weather we’ve been having across the US over the past month, I’ve been thinking a lot about climate change and pollution. I was chatting with the folks down at Revolution Books the other day, and one of them said “the only way to reverse the destruction of the planet is revolution!” Well, fair enough, but I’m increasingly convinced that right now we’re living with decisions made more than a century ago, when pollution ran unchecked. My mother grew up in London in the 1940s and 1950s, and I remember her telling me about the pea-soup fogs she grew up with, especially the Great Smog of 1952, which enveloped the city and killed about 12,000 people from respiratory illness. A lot of that air pollution was caused by coal-burning fires, which various Clean Air Acts did away with; when I was growing up in the 1970s, the London air was remarkably clear.

These kinds of cause-and-effect environmental measures are pretty easy to see: reduce the pollutants and you get less crap in the air, people can breathe more easily, and no one ends up coated in particulates. Essentially, that’s the difference between London and Los Angeles (or, as some researchers from UCLA found, even between different neighbourhoods in LA) or, to push the contrast harder, between London and Beijing. And, as we’re finding, what pollutes in Beijing doesn’t stay in Beijing, but finds its way around the globe, with the added irony that this pollution is in significant part a result of factories in China making crap that would otherwise have been manufactured, along with the attendant particulates, in Europe and the US.

But what we’re looking at now in terms of climate change has long and deep roots. As this terrific website by the American Institute of Physics shows, we’re living with the results of events that started off in the nineteenth century and then just kept going. More importantly, we’re having to deal with the aftermath of an idea common to the beginnings of industrialization: that the damage we do is short-term and reversible. Of course, the discovery of climate change came pretty shortly after the discovery that there had been various ice ages, which people had nothing to do with. And everyone recognized that major environmental events like volcanic eruptions had commensurate effects on the weather. But, as the AIP argues pretty convincingly, there was a significant lag between doing the damage and recognizing it. Soot from factories in Western Europe found its way onto the Alps, causing a glacial retreat of over half a mile between 1860 and 1930, even though the planet was going through the Little Ice Age, which caused temperatures to drop throughout Europe by almost 2 degrees Fahrenheit.

An image like this does a pretty good job of showing that the planet’s been going through climate change steadily (if unevenly) for quite a while.

[image]

But these numbers don’t include other kinds of equally common pollution from the nineteenth century, whose legacy we’re still living with now, water pollution in particular. Before the Clean Water Act of 1972, Philadelphia’s sewer system dumped raw and partially treated sewage into the Delaware River. As this website shows, creating a sewage system in the city may have spared Philly residents the horrors of typhoid and cholera, but it also took over creeks and tributaries to establish sewage lines, and before filtration the sewers did little more than remove waste from homes and redistribute it into various rivers and streams. Nearer to home for me, the Gowanus Canal was a nasty soup of effluvia — sewage, industrial waste, and paint factory run-off — within a few decades of its construction in 1869 (apparently, it was so disgusting and weird-looking that it was nicknamed “Lavender Lake.” I had no idea). It’s hard to know what people thought would be the end result of all this dumping. It’s possible that they just didn’t care, or that they assumed that cities had always been repositories of different kinds of filth, and this was just the most recent incarnation of that. Certainly, the focus of the American Transcendentalists (taking a leaf from the book of their British and German Romanticist predecessors) on the purity and sacredness of nature allowed ruling-class Americans to essentially write off urban and industrial areas as inevitably tainted, and focus instead on the unspoiled parts of the landscape. Plus, as a recent article by Jonathan Rosen about the extinction of the passenger pigeon suggests, white Americans just assumed there’d be more of whatever they screwed up available: more land, more water, more resources.

It’s the combination of these beliefs that seems to endure today, with a nice dose of neoliberal individualism and climate denialism thrown in for good measure. In the late 1920s, when Monsanto started dumping PCBs into the Hudson, it’s not clear that anyone thought that the river couldn’t just clean itself. Strip mining and mountaintop removal were accompanied by assurances that once they were finished, coal companies could put the landscape back together again: no harm, no foul (and lest we think that strip mining is a thing of the past, this article shows that it’s still kicking). Likewise fracking in Pennsylvania and, if gas companies have their way, New York State.

My kids are thrilled for the snow day they’ll be getting tomorrow. Taking the longer view, I’m less inclined to see smooth sledding.


“Intolerance and Fearfulness”

I suppose it’s time to talk about partisanship and acrimony here, no? We’re a week into the government shutdown, and tempers are flaring. And needless to say, people are starting to talk that history talk. Here at The Longest View, we approve of taking a historical perspective, but most discussions don’t go much further back than the 1995 Clinton/Gingrich shutdown, or at most back to the 1980s.

At the same time, I’ve noticed a desire, especially on the part of the right wing of the Republican Party, to enlist the Founding Fathers as spiritual supporters of the Congressional stalemate (Newt Gingrich’s take is especially bracing). In fact, in the years of the early Republic, the question wasn’t whether the government was going to shut down, but whether the country itself was going to hold together.

The political conflict that simmered during Washington’s presidency, kept under wraps by the reverence in which he was held, erupted in the late 1790s: private and public discourse was, in the words of John R. Howe, characterized by a “spirit of intolerance and fearfulness that seems quite amazing,” and partisan distrust was rampant.

Once Washington was out of office and John Adams took over the presidency, the rancor grew exponentially. By the late 1790s, disunion was discussed openly; and as soon as threats to secede became part of the national conversation, it seemed increasingly likely. Sectional antipathy between North and South grew, and every crisis brought with it talk of impeachment, from the signing of the Jay Treaty in 1794 to the Alien and Sedition Acts in 1798. Among all social classes, a furious debate raged over the legacy of the revolution. Those who had been radicalized by the struggle for independence challenged the social and political elites, claiming that they violated the spirit of republican democracy.

The anti-federalist, populist Democratic-Republican societies that formed between 1793 and 1794 (and became the foundation of Jefferson’s Democratic-Republican party) characterized the years after the revolution as a tragic betrayal, in which the democratic promise of the “Spirit of ’76” had been corrupted by too much money and power concentrated in too few hands. The repeated theme of these clubs was that the “lower sort” – artisans, labourers, and small farmers – were not reaping the benefits of democracy that the new nation promised. And while there was certainly overlap between the grievances voiced by the Democratic-Republicans and those of the participants in the Whiskey and John Fries’s Rebellions, both of which emerged around the western frontier of Pennsylvania, the societies were not primarily organized around economic issues. Rather, they challenged the political corruption that they believed had come to infect American life. Ironically, as the US democratic process became more routine, it seemed increasingly under threat.

The antipathy of the 1790s dissipated after Jefferson’s election in 1800 (although there were significant fears that Jefferson’s inauguration would lead to secession for real, fears that resurfaced in the debates over the War of 1812). At no point, though, did either Federalists or Republicans argue that government itself was the problem. These men actively hated each other. They spread scurrilous rumours about each other, drank toasts to each other’s deaths, encouraged insulting newspaper reports, and even dueled to the death.  They had very, very different ideas about how the new country should be run: who should be able to vote, how political power should be distributed.  But they were deeply invested in the success of the government, a system that they had seen come into being.

Of course, one can take this analogy only so far. The Congress itself was constituted very differently at the beginning of the 19th century, not least because the Senate was elected by each state’s legislature (a practice that seems terrifying now, given the way many states have fallen into effective one-party rule). The electorate was much smaller and much more homogeneous. And yet, there were still deep and hostile political divisions, falling along partisan lines that became more calcified as the nation settled into itself.

The splits we’re experiencing now have been a part of American political life from almost its very beginning. But the belief that politics should somehow devour the government that makes them possible is a new development, and a striking one. Without government, there would have been no American politics: indeed, we might argue that US politics have been so acrimonious over the history of the nation because Americans have been so invested in government, local and national. To shut down the government, as the Republicans are finding, is to shut down political discourse, to shut down the ground on which partisanship itself is laid out. Without government, political conversation stops.
