All posts by Paul

About Paul

I'm a historian, rock climber, and musician living in beautiful Fayetteville, West Virginia. I like to over-analyze everything and put it into historical context, hence this blog.

Random Thoughts on the Cruz/Sanders Sweep of Utah

The results of the Utah primaries/caucus– in which Cruz won in a landslide for the GOP, and Sanders in a landslide for the dems– are fascinating, anomalous, and once again reaffirm that this state was and still is one of the most culturally and religiously homogeneous states in the union.

Analysts have been making a big deal of how both Trump and Sanders are riding a wave of anger at the political establishment– they’re both on the fringes of their parties, “outsiders,” and speaking the language of populism as it has not been voiced in over a generation (or arguably since the era of Huey Long and Father Coughlin). So, it would make sense that the two would both win in places with strong grassroots political insurgencies (Vermont, New Hampshire, for example).

Other places that are more mainstream and establishment, like Ohio, would go for more “establishment” candidates, like Clinton and Kasich.

So, why would Utah go for a GOP candidate who is very much an establishment insider, and a democrat who is quite radical?

Well, in the case of Cruz, it's pretty obvious. Cruz is an ideologue, well educated but dogmatic, smart but certainly not an intellectual.  He's the closest thing to an establishment candidate.  He fits best with Mormon values this cycle.  Trump does not.

I think it says something about the faith and culture that Trump is not gaining Mormon supporters at the same rate that he's getting evangelicals in the Bible Belt.  Utah social conservatism has long been an enigma when compared to other strongholds of social conservatism like the Deep South, Central Midwest, or even other parts of the rural West.  Most of these places have shown similar hostility to same-sex marriage, pro-choice women's issues, and even liberal alcohol laws.  But Utah, unlike, say, the Bible Belt South, does not have the same dismal statistics of high school dropout rates, unemployment, divorce, teen pregnancies, drug addiction, or (here's the big one) deep-rooted racial issues.  Historically, this is because although Mormonism is deeply conservative, it descended from northeastern Puritanism rather than slave-state evangelicalism, but that's another story.

Returned-missionary Mormons tend to have seen more of the world than most Trump supporters, and they are turned off by foul language.  Furthermore, Mormonism has a deep history of not flaunting one’s wealth.  Perhaps even a few Mormons have heard Trump’s ugly comments profiling Muslims, and thought back to their own history of religious persecution.

The one demographic of Mormonism that I think may have fallen into the Trump camp would be those Far Right, pseudo-libertarian survivalist types– those rooted in the views of Ezra Taft Benson and Cleon Skousen, who view social liberalism/big government as a plot of Satan.  This fringe strain of Mormonism most recently got attention with the Bundy clan occupying the wildlife refuge in Oregon.  But, while their anger is certainly Trumpish, The Donald never came to these guys' support; when asked about the Oregon occupation, Trump merely said he'd "tell those guys to get out."  No support here.

Ok, but WHY on earth would Utah democrats go for Bernie Sanders?  Utah Republicans and Mormons (yes, I know they are not interchangeable, but seriously, the Venn diagram would be pretty close-fitting) are going for the more establishment candidate, so why aren't democrats?

This is a more complex question than the one about Trump and Cruz.  Utah does not have the deep history of Rust Belt, working-class, unionist democrats that places like Ohio do, and this is the demographic that Clinton seems to be going for.  But I actually think that the root of Sanders' popularity in Utah is also traceable back to the state's strong Mormon presence.

It is difficult, VERY difficult, to be a moderate Mormon in Utah.  From firsthand experience, I can say that it usually involves doing complex logical gymnastics to reconcile your faith with your politics; gymnastics that usually throw people off the bar.  And once you've decided to leave the faith, your entire community, social structure, often family, quite literally FORCES you to be a radical.

A friend of mine from back east, who once spent a summer in my hometown of Logan, Utah, made the comment, "it's a weird place, you're either 100% Mormon, or a complete alcoholic stoner pillhead."  While my own experience with Utah is more nuanced than that, I understand what he was saying.  Mormonism treats so many things as taboo that young kids breaking away from the church and rebelling often do not stop to think that drinking coffee and huffing gas are completely different pursuits from one another.  Both are seen as "bad" by the culture and religion, and once you've rebelled, it is up to you to distinguish between them.  And not everyone can.  My graduating class from Logan High School had more than its share of drug overdoses, suicides, and meth addicts (though certainly not as many as my current home of West Virginia).  I think it was BECAUSE of, not in spite of, the Mormon culture.

This may be the reason for the surge of Bernie– if you are a liberal in Utah, you have no reason to be a moderate liberal.

Either way, I'm proud of my state for continuing to buck the demographic stereotypes.  The Trump Train will not be pulling into Promontory Summit anytime soon.


Little Darling, it’s been a long, hot, muggy summer…

Today is August 22, 2015.  It’s pretty nice.  The greenery outside the front door of my microcabin at the campground I run says “summer,” but the crisp temps and breeze say, “fall is on the way.”  Hopefully, within the month, we’ll be in full-on Sendtember, with Rocktober immediately thereafter.  The best time of the year here at the New River Gorge is right around the corner.

However, I have not been climbing much these last few weeks.  I've never been a particularly gifted natural athlete, and sport-specific training has always been pretty key to me performing at even sufficient levels. So for the whole month of August, I've put climbing on hold, and devoted myself to doing hangboard exercises a couple times a week.  These "hangbored" workouts– in which I hang off of tiny edges with up to 50 lbs of weight dangling off of my harness– are exhausting, tedious, and sometimes painful.  Friends ask why I can't "just come climb" in between my workouts, not realizing exactly how it feels to have pumped, strained muscles 24/7.  I'm still able to climb a bit, and occasionally will even take a lap on Apollo Reed, one of my favorite sport routes ever, just to warm up for hangboarding.  But other than that, I'm not climbing.  The increased finger strength come September will be worth it; I've been here before.

Serious climber is serious.

Part of living here at the New River Gorge, where I am just minutes away from the best rock in the nation, has involved me accepting that in the summer I will just not climb much.  As I get older, and less obsessive (while still improving), I’ve found that I need time away from the rock.

***

It hasn't been that much time off that I've taken, however.  For most of June and all of July, I managed to keep surfing along on the fitness that I'd gained from my pretty intense spring training, when I climbed Moonlight Buttress.  I made a few trips to South Nuttal, an off-the-beaten-path crag here that has one of the most impressive line-ups of 5.12 traditional cracks that you'll find anywhere.  It was fun; I managed to send an Eric Horst route called "New Traditionalist" (5.12) that might be the best fingercrack at the NRG.  Taking a cue from local badass Pat Goodman, I also made some gains on a BEAUTIFUL Brian McCray crack called "Temporary Insanity," which definitely feels harder than any of the other cracks I've been on (it's probably more 5.13 than 5.13-), and requires a full arsenal of jamming skills as well as ample bouldering power.

Pat Goodman putting up a beautiful 12+/13- crack at South Nuttal, as part of his "training" for alpine expeditions. It was amazing that this route had not been done before!

Eventually, the hot summer temps and encroaching poison ivy got me less psyched on South Nuttal, though I’ll be back in the fall.  However, right as this was happening, I managed to fall in with a great new partner, Stacey K., who was living at the NRG for June, and had more psych to climb hard in the hot temps than anyone else in town.  It was pretty impressive to watch her tick off classics like the ultra-crimpy “Black Happy” (5.12) at Endless Wall, during very grim summer conditions of high temps and humidity.  I managed to put away a couple more 5.13s while climbing with her, including the VERY memorable send of the endurance route “Eye of Mordor” at First Buttress of the Meadow River during a full-on hurricane-force horizontal rain and wind storm.  This brief spike in fitness even manifested itself in an almost-send of the famed “Triple Crown” at Lake Summersville (three 5.13s in a day), but the smarmy handjam crux move of “Pod” (5.13b) thwarted me.  Like I always say, an almost-send is still NOTHING!

Me working on "Mercy Seat," 13a/b, at Lake Summersville. I did not send it during this photo sesh, but would come back and do it during the hot June temps. Also, please note my Blue Ridge Outdoors tshirt. Thanks BRO!

Anyway, after mid-July, I finally threw in the towel for hard climbing in the summer.  Since then, I've been getting on the lower New River, an incredible stretch of whitewater, a LOT, and fine-tuning my whitewater guiding skills.  Whitewater is another longstanding passion of mine; I actually worked through college and part of graduate school during summers as a raft guide on the Green and Colorado rivers in Southern Utah, and it's been amazing to have world-class whitewater accessible to me as a quick, after-work option.

Multisporting is fun! Not pictured: the boat.

I've also been throwing in a healthy amount of weight training and core workouts with my hangboreding, as well as 2-3 cycling trips a week in which I ride down and back up out of the New River Gorge.  Hopefully this regimen will keep me in good overall shape for the fall, despite my favorite Mexican restaurant, DiOGi's, opening back up and tempting me with their nachos and margaritas!

***

Without a doubt, however, the biggest obstacle for my fall climbing season is going to be my old enemy: TIME.  In addition to managing the local climbers’ campground here, my work as a freelance writer has really been taking off.  On top of that, I’ve taken a job at one of Fayette County’s chronically understaffed high schools as a full-time English teacher.  It’s been amazing so far; not only will the money help for future road trips, but I truly love teaching, and hope to be able to make a difference by applying my skills as a former college professor to some of the most disadvantaged and peripheralized demographics in the nation.  Exposing rural hillbilly kids to Malcolm Gladwell?  Fuck yeah.

But the problem is that I moved here to climb.  I began substitute teaching last year for the good money, and because of the flexibility of being able to work when I wanted and no more.  Now, pulled back in by my love of actually designing and teaching my own courses, I've lost that flexibility.  Part of me hopes that my school will be able to find a qualified and licensed English teacher to replace me, and I'll be able to go back to being a dirtbag.  We'll see.

Either way, I really hope to have a good fall, even if it means running myself into the ground and burning the candle at both ends (forgive the double metaphor).  I'd like to take down Greatest Show, Temporary Insanity, Thundering Herd, and The Racist, all lifetime goal routes for me here at the NRG.  Beyond that (and provided I can get out of this teaching gig), the big goal is to be able to take my truck into the desert Southwest for all of December, January, and February, spending weeks and weeks at Hueco Tanks, Red Rocks, Zion, Joshua Tree, and more.

I’ve been managing to ride the fine line between dirtbaggery and being a responsible adult for a while now; let’s hope I can keep pulling it off.

The Climber’s Progress, or, Insecure 20-something Masculinity and the Ideological Foundations of Trad Climbing

(Intro)

I'm not really a huge fan of most climbing literature. Royal Robbins's attempts to turn his visionary ascents into high writing, John Long's modified campfire-stories, Mark Twight's dark fatalism– most climbing writers get stale quickly for me, even if they are incredible athletes. There are a few exceptions: John Sherman's humorous and occasionally self-deprecating essays on climbing culture are the ancestors of today's better climbing blogs. And even more so, my favorite climbing writers– David Roberts, Jon Krakauer, even Matt Samet– are those who let climbing serve as a constant background for their stories, rather than put it at center stage. They're not climbing writers; they're writers who climb.

But still, climbing literature or not, my nearly half-a-lifetime on the rocks has found its way into everything from these blogs to my book (in a public radio interview last year I even tied Edmund Burke's idea of "the sublime" into the climber's idea of "type II fun"). More than a few ideas have formed as I've applied some of my favorite historical themes (recreation, environmentalism, frontier mythologies, rigid ideologies) to my own evolution as a climber over the past seventeen years, and it seems like the more I gradually slink away from academia and the historical profession, the more integrated climbing becomes in my writing.

Anyway, here’s a little piece about when I was a clueless gumby.

***

I first touched the chossy, slippery limestone of northern Utah's Logan Canyon at the age of nineteen in 1998, tying into a 5.8 toprope with a webbing harness, borrowed chalkbag, and my hightop Vasque hiking boots. It was a mixture of the new and the familiar. I was in decent shape, used to hiking up to 30-mile days, and even had some rudimentary climbing technique under my belt, honed from years of slot canyon scrambling in the Canyon Country. The first route I climbed was a sort of chimney feature, and the stems and mantles that I employed to get up it were familiar; it was fun to experiment with the moves on toprope, without the consequences of a broken ankle and possibly a 20-mile evac.

I did not know at the time about the sport-versus-trad ethical debates that were just winding down in the late 1990s. I didn't know about climbing gyms beyond the county fair, or the emerging phenomenon of first-generation gym rats like Sharma or Lindner, who were moving quickly from plastic to 5.14 rock at the time. I did not know that Salt Lakers 90 minutes to the south were at the forefront of American sport climbing and bouldering. I did not know what the back story of a family friend's comments was, when he would rant to my parents about how "those guys hanging off of bolts in China Wall Cave (probably Boone Speed on Super Tweak, 14b) didn't know shit" about REAL climbing, and how he had climbed a route right near Super Tweak back in the 1970s.

You know nothing, young climber!

Despite not knowing any of this, I was in some ways set to become a grumpy, elitist, ideological member of the Tradiban from day one. I approached climbing from a perspective of outdoor adventure, not athletic or gymnastic improvement. I was a wilderness advocate, environmentalist, even very briefly an eco-saboteur, and obsessed over backpacking and isolated river trips. Occasionally, I'd encountered bolts on clifflines in the backcountry and been disgusted at the crass hubris of leaving chunks of metal in a natural landscape. This was no different than paved trails, or housing developments being carved into canyons for that matter.

Perhaps above all, although I was just barely starting to be aware of it at the time, I measured the ultimate value of my outdoor experiences in terms of isolation and solitude. A socially awkward virgin who still lived with my parents, steeped in the misanthropic essays of Thoreau, Abbey, and self-described "solitudarian" Colin Fletcher, I went into the backcountry mostly alone or with one or two friends. My ultimate goal was to find stunning natural places where nobody else was– my friends and I would rather be on drab BLM land alone than on stunningly beautiful NPS land with others, and were quite willing to turn away from trailheads where even one other car was parked. I hated any mark of human presence on my backpacking trips, even kicking over cairns along many Southern Utah hiking routes (I'm pretty embarrassed even today to admit to this, and I stopped doing it after getting lost while hiking back to the car along a route where I had obliterated some cairns a few days earlier).

This had actually turned me off from climbing through high school. I disliked its reliance on partners, and even more the crowded, social aspect of those noisy roadside crags that I would drive past on my way to a backcountry trailhead. Most of all, I hated the cliques and scenes that so many types of outdoor recreation fostered. This aversion probably had its origins in my family's smug, cynical tendency to dismiss anything that was popular or trendy, be it pop music, team sports, or mainstream TV shows. It was also most definitely shaped by my working for an "old-school" backcountry outfitter in the tiny, isolated town of Torrey, Utah. We took pride in being removed from the yuppie crowds of Moab. In going out of my way to avoid the cliquishness of outdoor rec, I embraced horses and cattle herding over mountain biking, oiled leather over Gore-Tex, Carhartt over Patagonia, duct-taped liter bottles over CamelBaks, Cabela's over REI.

Despite all this, I found that climbing was easily the most addictive outdoor activity I'd ever done. I could spend time outside, there was a quantitative and simple platform for measuring improvement, and (gasp), it might even help me meet girls! I bought the toprope basics– shoes, chalkbag, rope, harness– and even got a volunteer job at Utah State University's small bouldering gym.

Thrutching around steep Logan Canyon limestone, while still maintaining my 90s grunge image.

But still, a lingering part of me felt guilty for indulging in this perceived new-school, cliquish activity. My non-climbing friends and family did not let me forget it; “Oh, you’re going to go hang off the side of the Fucoidal Quartzite crag again? Why do you enjoy climbing right next to the road? Why don’t we go hiking instead, away from people?” Fortunately, after about a year of easy toproping up Logan Canyon, I discovered a way to resolve my older solitudarianism with my newfound love for climbing. It was this thing called “trad.”

***

Disclaimer: for the purposes of this essay, and this point in my climbing evolution, I'm defining "traditional climbing" as "the placing of removable gear on a route," rather than clipping bolts. This is neither the historical definition nor my personal definition today, but what is important is that this was how I saw it around 1999, when I was a cocky, clueless 20-year-old.

I didn’t know any trad climbers well (although they definitely existed in my hometown of Logan). In climbing magazines I would come across photos of Yaniro, Suzuki, or Hong, putting these expensive cams into cracks, and then almost unnaturally, mystically, moving themselves up granite or sandstone walls simply by twisting their hands, fingers, and toes into these same cracks. Having only climbed on slippery, breaking limestone, I could not comprehend how cams and crack technique would even work. They seemed as inaccessible to me as climbing 5.11.

Then, all this changed. For President’s Day of 2000, I had originally planned on doing a canoe trip down Labyrinth Canyon on the Green River. However, at the last minute the friend with whom I’d planned on boating changed his plans. There was a girl he was chasing who had invited him to come to some place called “Indian Creek” to climb as part of a larger group. He bailed on the canoe trip, but obligatorily offered to let me tag along– “yeah, I guess you can come. It’s probably going to be really lame though.”

I had no other choice. I was angry that I'd been forced, third-wheel style, into a cliquish group (that included MORMONS! Ugh), but figured that, hey, at least I was going to get down to the Canyon Country that I loved.

It was only as we were driving south on I-15, my friend, myself, and the designated “trip leader” packed into the front of a compact 2wd pickup truck, that I found out what we were in for. I asked Trip Leader a few questions about the climbing we were going to do, and found out that, GASP, we were about to CRACK CLIMB, with TRAD GEAR! Having spent years hiking through Wingate Sandstone, I was instantly worried about climbing on soft rock like this. The only saving grace was when Trip Leader informed us that he would lead all the climbs, and we could toprope them (yes, we were that group). But still, crack climbing? I was scared.

The next morning, after waking up amidst the familiarity of a sunny desert winter morning, we hiked up to the unfamiliarity of Supercrack Buttress. There were only perhaps three other groups there (yes, I’m playing the Old Man, Back-in-the-Day card here), but I was already apprehensive about seeing crowds of people in the Canyon Country, my Canyon Country. When we got to the base of our first route (Keyhole Flakes), I noticed the white chalk caking the outside of the crack. I didn’t know what to think about this; my sense of aesthetics and leave-no-trace ethics was repulsed.

Recalling some SUWA (Southern Utah Wilderness Alliance) pamphlets I’d read about Leave-no-Trace climbing, I asked Trip Leader, “Aren’t we supposed to be using red dirt-colored chalk?” He just laughed as he was racking up, “Nobody uses that stuff!”

Ok, then.

Despite my initial misgivings, I found that toproping these climbs– Keyhole Flakes, The Wave, Incredible Handcrack– was incredibly fun. I was strong, and had decent body awareness from the gym, and just layback sprinted up everything in lieu of actually using jamming technique. Eventually I even learned some rudimentary handjamming skills, and took to it pretty quickly. Later that day, we moved across the canyon to Battle of the Bulge buttress for some 5.11s, a grade that I’d never come close to climbing in Logan Canyon. I toproped the shit out of some corners, again mostly laybacking, all while thinking, “wow! This is awesome! I want to do more of this!” By the end of the day, after witnessing our Trip Leader take a bad headfirst whipper on Cave Route, and even badly toprope-thrutching up some 5.12s (Digital Readout and Swedin Ringle), I was hooked. My friend who had invited me was less psyched; the jams hurt, the girl he was chasing showed up to the crag with another guy, and he wound up just leaving us to go hiking by himself while I was quickly running from one toprope to the next. I think the last thing I heard him say was something about how “granite is so much better.”

My friend Scott, tenuously stemming up Cave Crack (5.10+/11-), probably because his Miuras were sized too tight.

For the last day of the trip, we stopped off in Moab on the way home. I bought a set of stoppers and the old, Xeroxed Indian Creek guidebook at Pagan Mountaineering. Although my friend wanted only to go home, Trip Leader informed me that there were some great first trad leads on Wall Street, so we stopped off and I led 30 Seconds Over Potash (5.8), and then Flakes of Wrath (5.9). It was awesome.

Upon returning to Logan, I was changed, and had figured out how to resolve my solitudarian backcountry enthusiasm with the newfound love of climbing– TRAD! I would sit through my Environmental Ethics philosophy course, where we would talk about the struggle and balance between preservation of natural landscapes versus enjoyment and exploitation of those same resources. I would sit through my US Western History course, where we would discuss Frederick Jackson Turner’s thesis that American identity had been shaped by the rugged, individualistic meeting point of “civilization and savagery” on the Wild West Frontier. It was clear to me that trad climbing was both environmentally responsible, AND individualistically bold. I was quite clearly awesome.

Between classes, I would run into the gym for quick sessions, my chest puffing with a new elitist pride as I projected the perfect handcrack that crested the entire roof of the gym, while lowly sport climbers looked on in amazement that I had unlocked the secret of the handjam. Then I would go home and read climbing magazines about rad trad ascents around the world. I would especially dwell upon the intros to various guidebooks to Indian Creek and the Canyon Country. The words of grumpy old trad climbers like Eric Bjornstad and Stewart Green all emphasized the sheer badassery of desert crack climbing. "5.10 here is HARD." "You have to hang off of pure jams, there are no face holds at all!" "The rock is really soft, you might die!" Along the way, I started putting together a trad rack, and found one or two partners who were psyched on driving the six hours south to Indian Creek every weekend.

Laying back Coyne Crack, around 2001. Back when it was still 5.12a.

All of this reinforced this idea: I was incredibly rad for trad climbing, the same way that I had been rad for solo backpacking or riding 30 miles through a thunderstorm while carrying a newborn calf that was shitting all over me. Because it was uncomfortable, kind of scary, and because most people that I knew did not do it. The toproping cliques, the USU outdoors club, they could all fuck off. I saw myself as a Turnerian rugged individualist for embracing the danger of trad climbing, and a Leopoldian deep ecologist for not clipping bolts.

I was also ideologically rigid and clueless, as much as my 21-year-old contemporaries who were discovering Ayn Randian Objectivism at about the same time.

Crack climbing also involves Rand Smears.

The thing is, most cocky 20-something males who eventually figure out that Objectivism is shallow and unrealistic do not do so by nearly killing themselves. After a season of climbing RadTrad at Indian Creek, I was a dangerously confident gumby. As far as I knew, trad climbing ONLY involved placing removable gear on lead. I didn’t know anything about ground-up, onsight ethics, let alone rope management, anchor building, or multi-pitching. My friends and I, cluelessly strutting about Supercrack Buttress, jumped on every 5.10 crack we saw, falling– or more frequently hangdogging– our way up them before eventually redpointing them. We were sport climbing, but did not know or think it.

In the early 2000s, desert crack climbing still had a mystique about it; mutant gym rats from Boulder or Salt Lake City had not yet discovered how easy Ruby’s Café could be to them with just a few weekends of work on basic crack technique. Guidebooks and internet forums were full of the whole “Woo crack climbing is hard and dangerous!” mythology much more than they are today. We reveled in it.

I fell right into this mythology by thinking that because I could hangdog myself up a 5.10, G-rated handcrack, it meant I was a 5.10 trad leader. It took a runout, footwork-intensive 10a seam in Little Cottonwood Canyon (Equipment Overhang Right)– and a stopper that pulled right out when I yelled "take!"– to show me otherwise, sending me on a 35-foot whipper that nearly cratered me.

***

Looking back at this, it is both scary and humorous. There is not a much more dangerous demographic in this world than an early-20-something male. The objectivist business major, the marine recruit, the suicide bomber, the gang member, or the clueless trad-wannabe solitudarian, they are all seeking to secure their masculinity and establish themselves in a complex world through a paradox: asserting their individuality but also gaining acceptance, by way of pre-established ideological templates.

My climbing evolved further after that near-groundfall in Little Cottonwood taught me new lessons such as "trad climbing does not involve yelling TAKE on gear." I moved to Dallas, Texas for graduate school, where I found that to be a climber in a big city hours from the nearest rock, one often has to strengthen one's identity as a climber much more than someone living in Utah does. I devoted myself to training in a climbing gym for the first time, spending every weekend climbing steep limestone sport or bouldering in Austin, old-school runout trad at Enchanted Rock, or even bolted multipitch in Mexico. Gradually, though I did not notice it at the time, I found that the distinctions between sport, trad, and bouldering did not have to be rigid, and that each discipline fed off of and benefited the others. I also eventually came to the realization that placing removable gear in cracks does not by itself make you unusually bold or environmentally conscious.

Furthermore, as I matured and became more involved in the Texas climbing community, I gradually moved away from my solitudarian roots. One cannot find the same solitude in North-Central Texas that there is in Southern Utah; the only outdoor activity I could do was climbing, and it most definitely was NOT backcountry, wilderness, or isolated. I began enjoying the social aspects of climbing, of getting to know people in the gym and on the rock, building friendships around the sport, and even finding an awesome girlfriend who liked climbing too (seriously!).

When I started climbing, a regular apprenticeship– under the tutelage of a more experienced mentor, with mock leads and seconding up easy granite multipitches with passive pro and bomber rock– would probably have been better, and would have instilled a more thorough sense of humility and history in me. The toprope-to-Indian Creek route was not the best way to go (even though I still reap the benefits of having learned crack technique early and intensely).

I am nonetheless glad that I came to climbing from the direction of outdoor adventure rather than the direction of the gym. I still enjoy ground-up, onsight traditional climbing, sport climbing, bouldering, and headpointing– all very distinct disciplines. Living in the tight-knit community of Fayetteville, West Virginia, I’ve fallen more in love with the social aspects of climbing than my 19-year old self ever could have comprehended. But I still enjoy the occasional relapse of solitudarianism, pulling into an empty parking lot, hiking out to an unpeopled crag, and dangling over a cliff, just a rope and myself.

What a Little Moonlight Can Do….

In 1776, the Franciscan friars Dominguez and Escalante obtained funding from the Spanish Crown to seek out a new northern route from Santa Fe, New Mexico, to the new colony of Monterey, California.  However, once they were in the wilderness of the Colorado Plateau, they focused more on converting Indians and seeking out new mission possibilities than on finding a quick route across the Great Basin.  The missionary work had actually been their main objective all along.  Kind of like when a climber gets invited to come give a historical lecture in Zion National Park, when his main objective is climbing Moonlight Buttress.


Yup, I went there in my talk, and the climbers in the audience laughed, all knowing full well that 24 hours earlier this bespectacled nerd spouting off historical facts and theories had been groveling up 1000 feet of sandstone finger cracks.  And while I want to emphasize that I would have made the trip to talk about my book Wrecks of Human Ambition even if climbing had not been on the table, I’m not going to lie– the prospect of getting a paid trip to the red rock country to do my two favorite things, climb and talk about history, was a dream come true.  Thanks Zion Canyon Field Institute!


Moonlight Buttress (10-ish pitches, 5.12ish) has hovered in my consciousness since shortly after I first started climbing in the late 1990s.  I first heard about it when one of our Utah State University climbing community members aid soloed it over spring break.  We all thought it was a big deal that he was “soloing a 5.13 big wall!” (I didn’t know the difference between aid and free climbing at the time).  A few years later, a friend of mine, also aiding it, nearly died.  She rapped off the end of her rope while bailing off of the fifth pitch, and was only saved when a tangle of slings self-arrested her mid-fall (yeah, it’s complicated).

As my years as a climber progressed, several of my friends and partners from Indian Creek began getting on the route as a sort of final exam in the crack techniques that the Creek fostered.  I wanted to get on it, but found great reasons to put it off.  My multi-pitch resume was pretty thin.  I wasn’t a solid 12+/13- crack climber.  Then I moved to the humid East, first to Texas, then Ohio, then West Virginia, and desert crack climbing faded back into distant memory, even as I matured and improved as an overall climber.

Then, this past winter, I got back to the Southwest, mostly for long, moderate routes in Red Rocks.  It was nice to be back in the desert.  Although I love my current home at the New River Gorge, and stand by my hyperbolic statements about its Nuttal Sandstone being the best medium for rock climbing ever, the desert southwest will always be my first love, and true home.

It was during this time that I also finally made the acquaintance of Dan "Climbing Trash" Snyder, whom I've known through various rock climbing websites for damn near a decade.  We've got a few commonalities in our backgrounds– we're both cultural "Jack Mormons," we both have chosen to live in small-town hubs of outdoor recreation, and we've both spent way too much time dragging tourists through canyons, over trails, and down rivers as backcountry guides.  In addition to letting me crash at his house in Virgin, UT (where gun ownership is legally mandated), Dan also hooked me up with some folks he knew who worked for the Zion Canyon Field Institute.  It turns out that they were psyched on having me come out in April and give a talk on the history of humans doing stupid stuff in the desert.  And of course, the first thing that came to mind was, "Whoa, I've GOT to climb Moonlight Buttress!"

Fast forward to the week of April 22 (Earth Day!).  I flew into Salt Lake City, rented a small compact car, and made the obligatory 12 hour visit to family in northern Utah before driving south on I-15.  I’m accustomed to being a dirtbag, driving across the country and spending months living out of my truck, so this new method of travel with flights, car rentals, motels, and travel receipts felt strange.  I hadn’t even packed a sleeping bag!

It was also strange to come back to an area where I’d spent so much time as a child.  My grandfather, the late, brilliant landscape artist Harrison Groutage, was the first person to instill a love of the desert into me.  He’d built and lived in a beautiful vacation home just south of Zion through the 80s and 90s, painting countless views of the West Temple, Kolob Terrace, and Smithsonian Butte from his north-facing studio window.  Although I’d never climbed in Zion when I’d spend time at his house, it nonetheless felt like I was coming home.

A View of Zion in watercolor by Groutage

Anyway, enough of this sentimental reflection.  I rolled into Virgin around dark on Monday night.  Climb Tuesday, book lecture on Wednesday, maybe climb again Thursday.  I knocked back a few Knob Creek-Dr. Pepper cocktails with Dan (the guy loves his sugar), and discussed the upcoming climb for the next day.

Looking out of Zion Canyon toward the communities of Springdale, Rockville, and Virgin.

I had not had luck finding a partner with whom I felt confident getting on such a big, hard climb.  Ideally, a perfect partner would have been someone who could swing leads, and was solid on the grinding, sometimes painful nature of long desert cracks.  But although I sent out a wide-ranging message to my "dream list" of partners who I knew might be in the area around then, nothing came through.

Finally, less than a week before my trip, Dan simply offered to jug the route.  This offer blew my mind.  Contrary to what a lot of people assume, jugging is hard work, in some ways just as exhausting as free climbing.  Dan had been either guiding or working as a brickmason for several weeks with no days off, and I wondered if he knew what he was getting into with this offer to jug and carry the pack on a "rest day."  However, he's tough, has been climbing for decades, and most importantly stays positive even in exhausting situations.  I've bailed off of big walls before because partners turned negative and started complaining, but I knew that Dan would not do this.

Still, this offer of jugging brought its own challenges.  I’d be leading every pitch, and the impetus to get up the route rested solely on me.  This would be a change from all other long, hard routes that I’d done, such as Red Rocks’ Rainbow Wall or Potrero Chico’s Sendero Luminoso, in which I was climbing with partners who were much better than I was.  The pressure was on!

Although I’d been training hard in the months leading up to this climb, and was in very good shape as far as endurance goes, there were plenty of things I could have done better in preparation for Moonlight.  I could have scheduled a longer trip to brush up on my neglected desert crack technique.  I could have climbed more pitches of trad back at the New River Gorge (I think I led one pitch of 5.11 gear that entire spring).

Shoulda, woulda, coulda.  I didn't know what my exact goal for Moonlight was.  I knew that I wanted to give it a very good attempt at onsighting (actually, more like flashing, since I've watched so many videos and talked to so many people about it), but was pretty sure that I would get bouted.  I thought that maybe, if I didn't completely get my ass handed to me and did it with just a couple mistakes, I might try to get back on the route on Thursday.

Anyway, we got up at 5:30am the next morning; I had no appetite, but put away two cups of black coffee and two peanut butter/banana burritos.  We packed food, water, and cigarettes for Dan.  One 70-meter rope, one GriGri, ascenders, and a shit ton of cams, none larger than a red Camalot.  I was particularly wary about the half dozen purple Camalots we had, since that is by far my weakest size of crack (a couple millimeters bigger than a fingerlock).  We drove through Zion Canyon as the sun rose, feeling extra special with the VIP pass that we'd gotten from a ranger, which allowed us to drive into the shuttle-bus-only section of the canyon.  The approach was chill; easy river crossing, easy scramble to the base of the route.

On the first four pitches, which are basically the approach to the six-pitch 5.12 splitter and corner finger cracks, I got off route a couple times, but felt great.  At the base of the 5.12 section, a ledge 350 feet up called the "rocker block," we converged with two other parties: a pair of free climbers, and a very fast-moving aid climber who also had his own jug/support sherpa.  Both groups were very chill; we sat on the ledge, bantered about mutual acquaintances and beta, and watched as another group made its way up from the base of the route.

Before you summon up the rage of the internets to say "OMG he's peeing on a classic route!" I'd like to emphasize that I was urinating off of a large ledge into the wind, which dissipated all waste before it even hit the ground. The solution to pollution is dilution.

Gazing up the imposing corner, I could make out fixed anchors, plenty of tickmarks, and even some chalk scrawlings on the wall that said “B” and “Y”– I realized later that some goober was reminding himself where to put blue and yellow cams.  Oh well, this was not a wilderness route, it was not even an adventure route; it was just hundreds of feet of glorious finger crack.

Hanging out on the rocker block ledge. All pics by D. Snyder.

The corner pitches– the first (pitch 5 of the entire route) is a hard v5-ish boulder problem to a 5.11 fingercrack, and the second (pitch 6) is a wild layback/stemming affair– went smoothly.  The 5.12+ “crux” layback sixth pitch has probably gotten a bit wider over the years, because I got tips jams the whole way.

Looking down the amazing corner of pitch 5 to the rocker block ledge.

Pitch 7 was the one which I had heard the most about being awkwardly hard, and it definitely took a lot out of me: a physical squeeze chimney up to a point where you reach WAY back into a corner for a flared ringlock, and then have to make a 180-degree rotation from facing left to facing right.  I must have accidentally read this the right way, because I managed to get the rotation, and even flexed my fat hips to get a no-hands position in the hardest part!  Unfortunately, in the enduro off-fingers layback above, the pump finally caught up with me, and I took a little fall.  Booo!  We made it up to a really nice ledge at the base of pitch 8 (a beautiful 12a finger splitter, best climbing on the route), where we ate, drank, and lounged around, waiting for the aid party to get further ahead of us.

I fell once more that day, on pitch 9, which I thought was the hardest of the route, with 30 feet of off-fingers splitter.  After this point, the route turned into really cool, but kind of scary, face and pinscar climbing.  Pitch 10, a 12a called the "Nutter" pitch, was a struggle; I was digging pretty deep into the reserves, and there was one moment where I stopped, 15 feet above a tiny TCU in soft rock, and thought, "holy shit, if this was a single-pitch route at the NRG, it would be the day's highpoint if I onsighted it!  I'd go home and start drinking!"  But in the context of this huge route, it was just another challenge that I had to bang out almost mindlessly.

The only photo of me on the sharp end, somewhere around pitch 8 or 9.

One more pitch of 5.10+ handcrack over a little roof, then some juggy slabbaineering and we were at the top.  Even with the other parties on the wall and the leisurely pace, we managed to do the route in about nine hours.  After a quick jaunt down the West Rim trail and a few conversations with tourists, we were drinking margaritas in Springdale.  Damn good day.

Me and the Danimal at the summit.

I was pretty happy with how we did on Moonlight Buttress.  No epics, no all-out ass kickings, just good, tired fun.  Who knows, maybe if I had been swinging leads, instead of leading every pitch, I would have had a better shot of onsighting it, but I was psyched to have done the thing in good time, with just a couple falls.  Unfortunately, however, my body was so wrecked, and my fingers so sore from the endless fingerlocks that I knew there was no way I could go back on Thursday to redpoint the pitches I had fallen on.  We went cragging, and I barely made it up a single-pitch 5.11.  Three weeks later, and my fingers STILL hurt.

My throbbing fingers the next day, as I prepared my lecture and wallowed in misery.

In terms of the training I did, I was happy with my approach, and the constant mileage of steep sport and gym routes was key to building my endurance.  But again, who knows: if I had been able to go cragging for a couple days in Zion or Indian Creek, I maybe, just maybe, would have had a shot at actually onsighting the route.  But I can't be disappointed at all; this was a fairly "off-the-couch" desert climbing experience for me (in terms of the rock type, not fitness).

Three days later, I was back at the New River Gorge, climbing single-pitch, bombproof sandstone in 80% humidity.  The contrast could not have been greater.  My fitness, which I had periodized to peak for Zion, predictably plateaued by late April as well.  Now, as the Appalachian spring gradually gives way to summer, my body and fingers have still not fully recovered from Zion, and I can tell that I desperately need a break from climbing for a month or so.  Fortunately, whitewater season is just around the corner.

Without a doubt, this was one of the best climbing trips I’ve had, despite its brevity.  Although I did not send (hopefully I’ll get to return to finish the route off), I identified a “dream route” that I’ve wanted to do for over a decade, trained specifically for it, and gave it a great go.  The fact that I was able to incorporate this into my literary and intellectual life only added more to the experience.  Climbing, history, and landscape have always been intertwined for me, whether in humid Appalachia or the arid Southwest.

Spring Climbing: Status and Goals

Ok, folks, here it is, my very first CLIMBING BLOG post, prompted by the fine folks at Misty Mountain Threadworks and Blue Ridge Outdoors magazine, who have been awesome enough to give me some support this year.  If you're a regular reader of past blog posts, please forgive me for the lack of overly-analytical, pseudo-intellectual jargon here.  I'm just going to talk about climbing.

"I don't want to write about climbing; I don't want to talk about it; I don't want to photograph it; I don't want to think about it; all I want to do is do it." –Chuck Pratt

First off, just a brief summary of where I’m at right now, and where I’ve come from as a climber.

As of this season, Spring of 2015, I am 36 years old.  I first roped up when I was 19, which means that I’ve been climbing for nearly half of my life.  I learned first by toproping outside, and later by leading trad at Idaho’s City of Rocks and in the Southern Utah deserts.  I never really spent much time in a gym until I moved to Dallas, TX for grad school in 2002, and became a regular at Exposure Rock Gym.  This was the first place that exposed me to really strong plastic pullers.  Dallas in particular has dozens of adolescent mutants who will scamper up your projects like cockroaches.  I didn’t really do specific training, but rather just bouldered a lot, getting fairly solid in the v5/5.12 range.

Much later, in 2008, I wound up moving to Columbus, Ohio, where I was within easy weekend distance of both the Red and New river gorges.  I also had access to an excellent co-op gym, Kinetic (which has since evolved into a great commercial facility).  About this time, I began climbing with Mike Anderson, who is one of the foremost experts in climbing-specific training, and recently authored The Rock Climber's Training Manual with his equally strong brother Mark.  Climbing with Mike really opened my eyes to specific climbing workouts in the gym that went beyond just bouldering around on plastic.  I won't go into details here (you can find them on his site, or by searching "rock prodigy training"), but suffice it to say that it was specific training– hangboarding, campusing, core exercises– that finally got me to break into the 5.13 range in 2011.

In 2013, I left academia and opted to move to the New River Gorge, where I could put climbing and laid-back Appalachian living on the front burner of my life.  My first season here at the NRG was good.  I had just come off a foot injury, had really gone overboard on training upper body strength, and managed to jump back into 5.13s pretty quickly.  However, eventually the lack of gym access caught up to me, and I began getting gradually weaker.  It is actually really hard to stay in top shape by climbing outside all the time, especially at the NRG, where weather and hot summers can mean weeks of sub-par climbing conditions.  I devoted the season of 2014 almost exclusively to trad climbing– slightly easier than my max, but much scarier.  You can read about this season here.  But I continued to get weaker, and by this past fall, I was wondering to myself if I had gone over the hill, in my mid-30s, with no career, and a mediocre climbing life that was approaching the point of diminishing returns.

Which brings us to the 2015 season.

This past winter, I took a month roadtrip to the desert Southwest, which I had not done for a while.  Scenery, adventure, and new routes took precedence over physical difficulty, and I mostly focused on big, 1000+ foot routes in Nevada’s Red Rock Canyon.  It was amazing to say the least, and got me in better cardio shape than I’ve been in a long time.  However, my climbing performance suffered.  I don’t think that I sent a single thing harder than 5.12a that whole trip.

Noobing it up at Red Rocks, NV.

Perhaps the most productive thing I did this whole trip, believe it or not, was manage to get a lecture booked at Zion National Park, where I would discuss my recent book “Wrecks of Human Ambition,” a history of the red rock canyon country.  And the added bonus?  I’m hoping to climb Moonlight Buttress, over a dozen pitches of mostly 5.12 fingercrack, while I’m out there.  This route has been a lifetime goal for me since I started climbing, and I am beyond psyched, no matter what the outcome is.

Anyway, in February, I took a 180-degree turn from the wilds of the desert, and spent the month living with my girlfriend Karen at her home outside New York City.  We skied, did a bit of ice climbing, and ate way too much restaurant food.  However, my main climbing objective for this month was simple:  SPEND TIME IN THE GYM!  I knew that I was lacking the sheer, explosive power that comes easily to climbers in their 20s, and that I would have to build this, and then gradually transition into the sort of stamina that I would need for Moonlight's grueling pitches of enduro-jamming.  Don't laugh, I really needed this gym sabbatical in a NYC winter.

By mid-February, I had rediscovered my long-lost bouldering mojo, and was able to do v7-8 of varying styles fairly consistently, in various gyms.  I then gradually started mixing it up with more endurance sessions, in which Karen and I would go to the gym early in the morning to avoid crowds, and knock out dozens of roped routes back-to-back.  Gyms are awesome!

By the first of March, I had returned to the New River Gorge, and began what I knew would be a rough transition to applying my gym fitness to the subtle weirdness of Nuttal Sandstone.  I also began a quick, controlled calorie restriction diet that would allow me to drop 6-7 lbs in this final phase (it was nice to train bouldering power while I was a bit heavier, and eating awesome food in NYC).

It's been a bit challenging, but I think that things are starting to fall into place.  I've gotten solid on all the vertical, techy 5.12s of Endless Wall that I love so much, and also thrown in some steep endurance routes just to get used to operating under a full pump, which will be key on Moonlight Buttress.  The best possible training for Moonlight would be massive laps on single-pitch cracks at Indian Creek, but I can't complain about what I'm working with here.  Just yesterday, I really felt that my endurance was coming together, when I managed to redpoint a new 5.13a second go, and then cooled down by running a lap on a 5.12+ that I had sent two days before.  It felt good.  There are three other 5.13 routes in the area that I have whittled down to one-hangs, and will hopefully send this spring.

***

Anyway, I’d like to wrap up with a brief list of various goals for this year.  The big one is Moonlight Buttress, and honestly even the single pitch projects I have here at the NRG are just “training” for it.  But here are a few other things I’ve got on my radar.  Here’s to hoping I can squeeze one more good season out of my mid-30s!

Black Rider (aka the Pocket Route), 5.13a, Endless Wall, New River Gorge:  I've put a few runs into this route, and have one-hung it a couple times.  Most people with whom I've climbed it float through a techy crux involving a shallow, slopey pocket, but have trouble at a thuggish roof pull immediately afterward.  I'm the opposite: this pocket crux gets me every time, and I've got an ever-present gash on my right index finger from a nasty crimp too, but I've never had problems with the roof pull.  This route is in the shade in the afternoon, so I'm putting it on the back burner while I devote more time to sunny routes that will soon be too hot. [EDIT: SENT!]

The Racist, 5.13b, Endless Wall, New River Gorge:  Not gonna lie, this thing is beyond me right now.  Despite the fairly "low" grade, this climb gets done way less than the popular 5.14a "Proper Soul."  I've gotten gradually more comfortable with the reachy, intricate, small holds of the lower two-thirds, but the upper two cruxes are still going to take a lot of work.  Adding to this, the route gets sun most of the day, and is pretty much a winter route.  I may have to give it a rest until November, but if there is one route I'd love to throw myself at dozens of times, it is this one.

Mercy Seat, 5.13a/b, Colosseum, Summersville, WV: I've disliked this climb for a long time.  It sits right next to Apollo Reed, a route that is without a doubt my all-time favorite pitch of sport climbing, and which I have douchily repeated around 60-70 times.  Unlike Apollo, this route has some very insecure, strange movement, especially a weird twist/pull over a gravel conglomerate roof.  Despite this, just yesterday I hopped on it, and managed to get up to the final crux fairly easily, for the first time ever!  This climb is good throughout the summer, so I won't be aggressively trying it this spring. [EDIT: SENT!]

Titan's Dice, 5.13a, Endless Wall, New River Gorge:  I have not yet gotten on this route this season, but it felt pretty doable last year, and is a good combination of both technique and all-out enduro burliness.  Plus, how often do you get to use full-on offwidth technique on a sport climb? [EDIT: SENT!]

Greatest Show on Earth, 5.13a trad, Meadow River Gorge, WV: This thing is beautiful, and I think that my bouldering over the winter is finally making its core-intensive roof crack sequence a possibility for me.  It is going to take a lot of work, though, and definitely is one of the hardest traditional lines I’ve put time into.

Me on Greatest Show last fall, photo by Pat Goodman

Thundering Herd, 5.13b trad, Beauty Mountain, NRG, WV: I toproped this a couple times in January, and while it gets a higher grade than “Greatest Show,” it is much more straightforward.  No lead attempts yet, but my last time on it while solo toproping, I managed to climb the entire crux sequence, which basically comes down to two v6 boulder problems separated by an all-out glory dyno and great jug rest. Unlike quite a few of Pat Goodman’s other FAs, this one is fairly safe to lead.

BONUS: Glass Menagerie, 5.13a trad, 8 pitches, Looking Glass Rock, NC: This is another longterm lifetime route.  I bailed on it last year due to wetness, bad ropework, and, errr, a few other things I don’t want to talk about, but if I manage to keep my fitness through the spring, this might be fun to go work on in May.

So, there it is, my throwing-down of the gauntlet for Spring 2015.  Stay tuned for my follow-up post where I complain about being weak, and the rock being too slippery.

There is no Golden Age

“What the Founding Fathers actually meant was…”

“Well, if you really read into the Bible, you’ll see that Jesus meant…”

“We’re not meant to eat wheat.  If you knew what our paleo ancestors used to eat…”

We hear remarks like these all the time in casual conversations about politics, religion, and history, to the point of cliché.  They all share a common theme, to put it simply: things aren’t going as well today as they were yesterday, so if we can only figure out what people were doing yesterday, we can get better!

When I used to teach, I would occasionally emphasize three “rules” of history.

1.  There is no Virgin Land

2.  There are no Indigenous Peoples

3.  There is no Golden Age

I’m going to focus here on the last of the three rules (I may get to the other two later).  The quotes I opened with are all examples of “Golden Age Mythology”– a mixture of hero worship, projecting ideals of platonic perfection onto the past, and severe dissatisfaction with the modern world.  Such claims are not necessarily false, but they can be severely misguided.

On the surface, we can all think of obvious “Golden Ages” in our cultural memory that we also know were not quite so Golden.  We think of the virtue of the Roman republic while forgetting the political corruption and assassinations; the high art of the Italian Renaissance while forgetting plagues and religious oppression; the suburban family-friendliness of the 1950s while forgetting institutionalized racism and the looming threat of nuclear annihilation.  We remember the good– the 1960s had the Stones and Beatles, the 1990s had Nirvana and the Pixies!– while forgetting the bad (Herman’s Hermits and Milli Vanilli).

However, despite our acknowledgement that the past has always been less than perfect, we continue to subtly and subconsciously assume that some times, characters, or events are so sacrosanct that they are beyond reproach and criticism by anyone from any part of the sociopolitical spectrum.

***

Deference toward our nation’s origins, particularly the U.S. Constitution and the “Founding Fathers,” is a great example of this irrational reverence for the past.  In the current (and anachronistic) debate over “were the founding fathers Christian?” both sides of the argument fundamentally want and need a bunch of white guys who have been dead for two centuries to agree with them.  Fundies misuse Jefferson’s words about the “creator” and “nature’s God” in the Declaration of Independence as some sort of evidence that he was Christian in the sense of 21st-century evangelicalism.  Secularists like Richard Dawkins or even Bill Maher quote Jefferson’s or Franklin’s words on the inutility or ridiculousness of religion as a way of making the case that these guys were somehow synonymous with modern-day internet-activist atheists.

At this point, you are probably saying, “so what DID the founding fathers really think about religion?”  Rather than spend paragraph upon paragraph on this question (short answer: they varied, and none were what we would call Dawkins-atheists or fundamentalist evangelicals), I would respond with this: “why do you care what they thought?”

The very fact that we want to have the case closed on what Jefferson or Franklin thought of evangelical fundamentalism just shows how deeply our culture is entrenched in this reverence for the past.  These men simply HAVE to agree with us!  To win an argument, we only have to show that some respected figure from the past agreed with our stance, rather than take on the more difficult task of arguing our case on its own merits of logic and reason.

Or, as Thomas Jefferson himself said,

Some men look at constitutions with sanctimonious reverence, and deem them like the ark of the covenant, too sacred to be touched. They ascribe to the men of the preceding age a wisdom more than human, and suppose what they did to be beyond amendment…But I know also, that laws and institutions must go hand in hand with the progress of the human mind. As that becomes more developed, more enlightened, as new discoveries are made, new truths disclosed, and manners and opinions change with the change of circumstances, institutions must advance also, and keep pace with the times.

So yeah, I just used the words of Thomas Jefferson to strengthen my case for why we don’t necessarily need to use the words of Thomas Jefferson to strengthen cases.  This has got to be some sort of paradox.


It’s the Jefferson paradox!  Get in the TARDIS before it explodes! 

But the fact remains that, if we can tie something to a sacrosanct institution, it gives our argument more merit.  Scholars almost unanimously agree that the U.S. Constitution did a terrible job of dealing with the slavery/freedom paradox that has plagued America since before we were a nation.  It swept the issue under the rug and gave special favors and concessions to slaveholders (such as the three-fifths clause) that it did not give to any other special interests, all while being unable even to mention slavery by name within its own words.

And people knew this from day one.  As the slavery debate snowballed toward the Civil War, William Lloyd Garrison called the Constitution a “covenant with death,” and “dripping with blood.”  David Walker, a free black anti-slavery pamphleteer, openly mocked Jefferson’s “declarations” as hypocritical.

However, the anti-slavery activists whom we remember the most are those who refrained from attacking our nation’s origins– those who believed that the Constitution was foundational to the point that, if it were attacked, the house above it would divide and fall (forgive me for riding the metaphor car off the track).  Frederick Douglass, addressing a crowd of white, middle-class women in New York, made sure to temper his angry “What to the Slave is Your 4th of July?” with occasional reassurances that he respected the Constitution and our founding fathers; they were not wrong, maybe just a bit misguided, and the fault was upon us for not really knowing what they meant.

In a lot of ways, this comes down to the difference between what political scientists call “radical” and “conservative” revolutions.  Radical revolutions attack the very roots of the institution that they’re seeking to change (“The Constitution is wrong!”), while conservative revolutions seek change within these roots (“We need to rethink how we’re reading the Constitution!”).  And of course, Abraham Lincoln, the consummate conservative revolutionary, abolished slavery in a rather genius way that preserved the sacrosanctity of the Constitution.  Realizing how important it was to the cultural memory of a nation that was not even a century old, he did not suspend or overturn the Constitution, but rather forced the Southern states to secede, and then used his power as commander-in-chief to seize the human “property” of belligerents in a time of war, all while following the rules of our founding documents.  The Revolutionary Golden Age was thus preserved.


Walter Sobchak was not the only one to give a shit about the rules.

***

Jesus wouldn’t approve of the Westboro Baptist Church!  Jesus wouldn’t bomb an abortion clinic!  Jesus wouldn’t oppose gay marriage!

Even more deeply entrenched than our reverence for the revolutionary period is the deference, from all sides of the political spectrum, to a person about whom we know very little.  It’s interesting: while the Right usually supports its arguments by saying “the Bible clearly says…,” almost everyone on the Left– whether atheist, agnostic, or progressive– is even more guilty of Golden Age hero-worship, trying to point out to conservative Christians that Jesus was some sort of liberal, passively resistant hippy (the long hair and beard probably help this argument).


“Your Christians are so unlike your Christ.”  Gandhi, too, was guilty of Golden Age hero worship.

Once again, I’m not going to jump into the cesspool of selectively quoting scripture to show how Jesus-would-vote-for-my-party-therefore-WIN!  We can find passive quotes and active quotes; quotes supporting collectivism and quotes supporting free enterprise; quotes supporting and opposing statist government action; quotes relating to any other convoluted argument that our society is fixating upon at a given moment.  Yawn.

Scholars such as Bart Ehrman and Reza Aslan have done a great job of peeling away the centuries of spoken and written mythology that surround Jesus, and laying bare the few facts that we know about the historical man.  It basically comes down to this: he was a Jewish mystic who fervently believed that the Roman Empire’s presence in Judea was fulfilling some sort of end-times prophecy, and this was more than enough to get him condemned to death.  That’s it.  We don’t know what the definitive story of his birth was, or why there are conflicting accounts of his last minutes and words.  And there is no passive resistance, no love-everyone hippitude.


Or this, unfortunately.

But, because every story needs a redeeming, sacrificing hero, our entire society has gotten behind the idea that Jesus agrees with THEIR cause, and that any veering from their cause is a result of misinterpretation and misreading.  Not only do secularists attacking Christianity refrain from attacking Christ, they often go out of their way to show that they AGREE with him, to the point of fabricating evidence just as much as the fundamentalists whom they are assaulting.

***

In the end, it is just obvious that the very ways in which we tell stories and make points rely on Golden Age mythology.  Mormon faith is built upon the idea that it is reclaiming “true” Christianity that was corrupted by the Catholic Church.  Wilderness preservationists from Muir on have tried to retain some sort of static, idealized outdoor landscape that has never existed.  1960s Red Power activists emphasized that Indians before Columbus had lived in complete harmony with the earth and each other.  Free love advocates and some feminists write and talk of how all of today’s problems are tied to the hierarchical imposition of monogamy that took us away from our polyamorous, bonobo-like roots.  Cultural anthropologists like Margaret Mead implied that less “modern” societies such as those in Samoa could show us how our own society was before it fell from its golden age.  Paleo fiends tout how much better off we were before the rise of agriculture.  Grumpy trad climbers on supertopo.com forums angrily type away at their keyboards about how rad they were back in the Stonemaster days of Yosemite in the 1970s.  And I am dead convinced that the pinnacle of popular music came during the grunge years, when I was in high school with my Nirvana t-shirt.

None of these myths that different subcultures have built around their ideologies are 100% wrong.  I love listening to grunge on my headphones while climbing trad routes in the wilderness.  But, they all fall short when they try to buttress their cases with what Jill Lepore has termed “historical fundamentalism”– the selective seizing of the broad, expansive past for purposes that fall into narrow chronological and ideological scopes.

Our entire society and Western culture are built around this, as Carolyn Merchant showed in her excellent book Reinventing Eden; in many ways, the Garden of Eden myth is the ultimate Golden Age that we just wish we could get back to.  But we need to realize that history is more complex than any single issue we might enlist it to support, and that asking “What would Jefferson/Jesus/Kurt Cobain/John Bachar do or think?” is anti-intellectual at its heart.

That is all.  I’m going to go listen to 90s music, now.


Blatant Self Promotion


So, I finally got a book published.  It’s quite likely that I’m done with academia for good now; the job search has been dismal, and I’m honestly pretty jaded with the whole process.  However, I’m still pretty proud of getting my revised dissertation published– something most academics do while trying to get tenure– but whatevaaaah.


Anyway, check it out by clicking the link in the heading of this post.  Apparently, there’s going to be an ebook edition due out soon, too.

“The Soul of Black Folk”: Race, Work, and Talent

In a recent interview on NPR’s “Fresh Air,” Neil deGrasse Tyson– astrophysicist, internet icon, heir to Carl Sagan as The Great Public Scientist– made an interesting point when asked to comment on his position as a scientist who happens to be black.  Any listener could tell that he was annoyed that race was even brought up; like any self-respecting scientist (and unlike so many humanities academics, ZING!), Tyson wanted to talk about the soundness of his work, not his racial or ethnic identity.  However, when pressed on the race issue, he opened up, speaking of his teenage years, when he was both obsessed with astronomy and a talented wrestler, and of the many teachers who encouraged him to pursue wrestling rather than science.

This sort of low-level racial stereotyping is both common and unsurprising.  Our white-dominated society has long had fewer problems with successful black entertainers (musicians, actors, athletes– from Scott Joplin to Oprah Winfrey) than with black CEOs, doctors, intellectuals, scientists, attorneys general, or presidents.  I don’t need to go into the historical or cultural reasons for this here; it’s a complex, yet dreadfully obvious phenomenon.  What I do want to get into, however, are a few more comments that Tyson hurriedly made as he was trying to steer the interview away from the topic of race– comments that touch on some very interesting cultural assumptions we hold regarding nurture, nature, talent, work, and race.

In yet another section of the “Fresh Air” segment, the interviewer noted to Tyson that he had a “gift” for interpreting science to the general public; Tyson cut him off, to say, basically, “I don’t think in terms of ‘gifts,’ I think in terms of, ‘wow, I worked hard for this!’”

Now, this comment was not overtly racialized by any means, but it certainly does point toward the dichotomy of natural talent versus work and practice that comes up in any pursuit– sports, music, art, writing, whatever.  We’ve all seen examples of prodigies with “gifts” who become very good at what they do very quickly.  Mozart was a musical virtuoso by the time he was five, and Picasso was painting with the technical proficiency of the Dutch Masters by his early teens.  In my own little bubble world of rock climbing, we’ve heard stories of genetically gifted folks like Dave Graham, who reached the 5.13 level in the first year of his climbing career.  All of these examples are awe-inspiring, and reflect levels of accomplishment that most artists, musicians, and climbers will never reach.  Graham, Picasso, Mozart– all were born with some kind of “gift.”

On the other end of the spectrum, we have people who become very good at what they do by virtue of their own hard work.  They are masters of their craft who nonetheless are not really in danger of being labeled virtuosos or child prodigies.  There are hundreds of musicians on the classical stage, in the Broadway pit, or in the Nashville studio who have practiced their instruments for hours each day for decades and attended the best academies.  They were not necessarily virtuosos at age five, but they can still sight-read anything placed in front of them.  Similarly, my friend Mike Anderson was certainly not climbing 5.13 within the first year of his climbing career like Graham, but through thousands of hours of training and practice, he climbs at or near the level of many professional climbers.

Obviously, for each of these examples, a great degree of natural talent is a necessary (but not sufficient) foundation for thousands of hours of practice.  Not everyone is physically or mentally equipped to be an in-demand studio musician or a 5.14 climber, even after thousands of hours of practice.  Similarly, natural prodigies still have to put in LOTS of work; Mozart was pushed ruthlessly by his father, and Dave Graham spent every spare minute he had in a climbing gym.

***

But, I think it is significant that Tyson was so quick to point out that he got to where he was through work rather than from some sort of innate gift.  I also think it is significant that most prodigies, no matter how obviously and naturally gifted they may be, tend to emphasize the amount of work that they put into their crafts, rather than saying, “yeah, I’m just naturally really good at music, climbing, or art.”  There is a very fuzzy boundary between natural talent and hard work, between people born capable and those who become capable, but it is clear that, for the most part, our national consciousness errs on the side of lauding work over talent.  There is a reason that Tyson dismissed his gifts and emphasized his labor.

There’s a very deep back story to this. Short historical digression:

The whole basis of our American culture of Democratic Capitalism– planted during the Revolutionary era, but brought to full fruition during the Jacksonian 1820s– emphasizes that it is not who you are, but what you do.  In many ways, this was a reaction both against the centuries-old, stratified feudal societies of Europe and against early-republic elitism, exemplified by John Adams’s and others’ ideas about “natural aristocracy.”

The idea of the Natural Aristocrat basically came down to this: although the Revolution had killed the idea of the older, European-styled inherited aristocracy, with its titles of nobility and culture of deference toward lords and kings, the reality was that some people would always be smarter, wiser, stronger, and more capable leaders than others.  Shouldn’t there be some sort of mechanism in place, both in government and society in general, to make sure that the better sorts’ voices were heard a bit more? And wouldn’t the best indicator of WHO these “better sorts,” these “natural aristocrats” were be their wealth, their education, and (dare I say it), their pedigree?

This was dangerous intellectual ground to tread during the era of the American Revolution, which began with Thomas Paine declaring that hereditary leadership resulted in having “An Ass for a Lion,” and wound up with Jefferson declaring that we did not need a king, because “All Men Are Created Equal” and government should derive from “The Consent of the Governed.”  This was democratic, populist stuff, and to suggest in the 1770s that some people were more gifted than others trod dangerously on a slippery slope toward elitism, monarchy, and despotism.

But by the formal end of the Revolutionary War in 1783, it was clear that “pure democracy,” in which every man had an equal voice and was equally capable of serving in government, was not working.  State-level legislatures were issuing their own currencies, proclaiming their own boundaries, defaulting on wartime debts, and violating treaties, and there was no strong federal government to check these “Vices of the Political System,” as James Madison called the crisis.

The U.S. Constitution, obviously, was the prescribed remedy to the ills of pure democracy.  Most of us today see it as a needed concentration of power into a federal government (this is why it is ironic that so many Tea Party types, who are vehemently anti-“big government,” hold the Constitution in such reverence).  However, the Constitution was also just as much a reaction against pure democracy; it injected a large dose of classism and elitism into the populist exuberance of the revolution.  Read into it: any branch of the government that was not subject to direct election– the U.S. Senate, the Supreme Court, the Executive Branch– was intended to give Natural Aristocrats a springboard into power, a stronger voice.

However, through the first two or three decades of the 19th century, America began stepping slowly but surely away from the compromised elitism of Natural Aristocrats. There’s been plenty written on this “Rise of American Democracy,” and in my opinion, American history classes focus on it too much while ignoring the fact that this rise of “democracy” only applied to white males, and in many ways relied on decreasing the rights of all others.  Regardless, here’s what happened: the more elitist, moneyed Federalist Party collapsed; Jefferson’s Democratic-Republican Party continued to emphasize the inherent virtue of individual yeoman farmers; the Louisiana Purchase opened cheap land for individual trapping and farming; the Erie Canal and the Mississippi made it easier for farmers to get their goods to market.  Oh, and alcohol consumption skyrocketed, but that’s another story.

By the election of Andrew Jackson, we had a president who was not Ivy League-educated, did not speak Latin, was an evangelical Christian, and emphasized his frontier, backcountry roots.  From Jackson on, presidents still tended to be very wealthy, but they emphasized their “log cabin” roots over their money.  Patriotic histories and mythologies of “founding fathers” also de-emphasized the aristocratic elements of the Revolutionary generation.  Schoolchildren were (and still are) taught of George Washington as a boy with an ax chopping down the cherry tree, or as a young man working hard as a surveyor– never as the richest U.S. president in history, who insisted that parishioners bow to him in church.  Similarly, The Autobiography of Benjamin Franklin began its stint as a bible for American icons ranging from Davy Crockett to Jay Gatsby, all of whom preferred to focus on Franklin’s hard work as a youngster rather than on the fact that he was one of the wealthiest men in America by the end of his life.  Before Andrew Jackson and the rise of American Democracy, U.S. presidents were proud of being aristocrats who were either born or married into privilege.  After Jackson, regardless of wealth, it was pretty much required that presidents emphasize their hardworking, “log cabin” origins.


Wealthy Virginia Slaveholder


Rich kid from New York City


Yeeeeeah.

Fast-forward into the post-Civil War era of robber barons, monopolies, and Horatio Alger mythology, and even further into the 1980s Reagan revolution, and we can see the deep historical roots of the whole “work hard and you’ll make it” mentality that respected wealth, as well as its more vitriolic counterpart, “the poor are only poor because they’re lazy.”  It is not going too far to say that everything that defined manhood and mastery during Jacksonian democracy came down to individual agency and hard work, much more than noble title or lineage.  Everyone (at least all white males) was theoretically on a level playing field to make it in America, they just had to work hard.

This attitude percolates throughout American pop culture.  Every “underdog” movie, in which an inept sports team bands together and beats its more privileged, serious opponents, essentially says, “hey, in America, ANYONE can make it!  You don’t need elitism, or a classical education, or structured training!”  And of course there’s its corollary: “Why are you poor?  Ben Franklin made it!  The Jamaican bobsled team made it!  The Mighty Ducks made it!”  Next step: cuts in welfare and Medicaid, and more tax breaks for the wealthy.  But I digress.

Anyway, we’ve established that work trumped aristocracy– but does this really fit my earlier dichotomy of natural talent versus hard work?  Not cleanly.  Boosters of upward mobility in America have long used the term “ingenuity,” rather than “gifted” or “prodigy,” in reference to the natural talent that has to accompany hard work in climbing the ladder toward the American dream.  Bill Gates, Henry Ford, Andrew Carnegie, Ben Franklin– they all worked very hard to rise to their wealth, but they were also very smart; though they would have bristled at the label, they most definitely were “natural aristocrats.”  None of these guys were as bumbling as a lovable Hollywood underdog.

Here’s the interesting thing about Tyson’s bristling at the suggestion that he had gotten where he was by some sort of “gift,” rather than his own hard work.  In one sense, he was just following the familiar American impulse of lauding upward mobility, work, and an “if I can do it, you can too” attitude.  But, like so many undercurrents of American culture, we simply can’t get away from race in this mess; Tyson may have been unusually uncomfortable at the interviewer bringing up his “natural gift” as a black man.

Here’s why: as the interview progressed, Tyson made a very good point about the 1990s, when Michael Jordan and Larry Bird dominated professional basketball.  When comparing the two men, sportscasters frequently dwelled upon Jordan’s natural athletic ability, his talent, while emphasizing that Bird was a “student of basketball,” a cerebral intellectual.  Obviously, this dichotomy is overly simplified, and given the colors of the two players, there are definitely some racial stereotypes at work here.  And just like the Mozart/Broadway-pit-musician comparison, it ignores the fact that Jordan worked his ass off, and that Bird had plenty of natural talent.

I’m not really familiar with team sports beyond what I’ve just said, but in the world of music, particularly jazz and blues, we see some similar stereotyping going on that equates black musicality with a natural, “from-the-heart” approach, and white musicality with studious, meticulous, “from-the-brain” work.  It’s a firmly rooted cliché– black musicians have “soul”– some intangible, indefinable “feel” to music that just knows how to incorporate blue notes, bends, syncopation, and swing into just the right parts of any song. If any white person not born in New Orleans ever tries to copy this, they will fall flat, as flat as a white guy clapping off rhythm to a gospel song!

There’s truth to this cliché.  Compare Nina Simone’s version of “Feelin’ Good” with Michael Bublé’s.  And it’s not even a matter of black-and-white, or even of jazz: listen to classical piano virtuoso Daniel Barenboim’s version of “Tico Tico” against that of Mexican trumpet virtuoso Rafael Méndez.  Classical training, the type that Bublé and Barenboim come out of, can be so clean, so sanitized, that it takes away the “soul” of musical interpretation.

Indeed, it’s even arguable that a lot of what makes up “soul” is a little sloppiness.  If you listen to the breakneck tempos of soloists like Fats Waller or Joe Pass, you can hear quite a few little flubs; they are taking risks, not always landing on their feet, but always recovering.  In his book Proust Was a Neuroscientist, Jonah Lehrer actually makes the point that our brains LIKE to hear slight mistakes, or unexpected twists to familiar song patterns– not too many mistakes or flubs, but just enough to keep our brains on edge, not quite knowing what will come next in a song.  Especially in the case of the Barenboim piece, “Tico Tico” just sounds too perfect– not quite human enough.

Ok, so we’ve seen that there is this thing called soul, and our culture seems fixated on this idea that whites don’t have it, and blacks do.  There’s a long history of privileged whites dwelling on this, perhaps more than blacks even have.  In an ugly rumination upon the biology of race in his “Notes on the State of Virginia,” Thomas Jefferson (himself a violinist) went on and on about the musical abilities of blacks.


In music they are more generally gifted than the whites with accurate ears for tune and time, and they have been found capable of imagining a small catch. Whether they will be equal to the composition of a more extensive run of melody, or of complicated harmony, is yet to be proved… Nature has been less bountiful to them in the endowments of the head, [but] I believe that in those of the heart she will be found to have done them justice.

Later, Jefferson would observe that, although blacks were great at imitating others in music, they were not particularly original.  He was not incorrect on the imitation bit.  African American musical culture does tend to borrow and self-reference a LOT of musical themes– certain lyrical phrases and memes pop up over and over in gospel, soul, and blues; 1940s beboppers would take the chord progressions of Broadway showtunes and add their own melodies to them; hip-hop DJs base their whole art form around sampling previous work.  But as far as not being original… I like to imagine that, if there is an afterlife, John Coltrane and Duke Ellington are both giving Jefferson a massive wedgie right now.  To the tune of “Giant Steps.”

Later, around the turn of the century and the rise of industrialism, upper-class Americans worried about their (white) culture becoming too civilized and sanitized, and began to pine for a romanticized primitive existence that never was.  Some, like Theodore Roosevelt, believed that in order to get back to our “barbarian virtues” we needed to hunt, fight, and camp.  I think that the fixation with jazz that followed a few decades later– in which whites would go “slumming” in Harlem to listen to “jungle music”– was part of this same impulse.  From a position of power and privilege, whites could patronizingly say, “Yup! We’ll never be as good at singing and dancing as those colored folk!”  Slumming was a top-down response to concerns about modernity, but it was every bit as weak and temporary as those urbanites who pull off of a rural highway, inhale while saying, “Ahhh, I love the smell of the country,” and then hop right back into their air-conditioned cars.

So, although there is certainly truth to the idea that there is a thing called “soul” in many forms of popular music, and that it is disproportionately represented amongst musicians of color, lauding this “soul” can easily lead one into a world of condescending patronization.

Even more important, when we say a musician has “soul,” or emphasize natural musical ability, it is often accompanied by a dismissive ignorance of the very hard work and intellectual rigor that the musician has put into his or her craft.  Especially in the case of jazz, musicians certainly do not just play what they “feel,” with some magical sense of “soul” determining their every choice of note and phrase.  Most of the first and second generations of jazz musicians, from the 1920s to the 1950s, had some sort of formalized, rigorous training on their instruments.  Progressive-era and later New Deal programs funded community bands that introduced youngsters to music.  Art Tatum managed to get a fine classical music education at the Columbus School for the Blind, to complement his natural talent.  Charlie Parker would occasionally quote Stravinsky in his improvised solos (including one occasion when Stravinsky was in the audience).  Though most jazz icons were obviously gifted prodigies, they also had formal, structured training; they were not relying on soul alone.  Miles Davis even went to Juilliard, with the help of his affluent dentist father!

But still, we cling to the myth that jazz, and particularly improvisation, arises from some sort of tabula rasa, with only innate “soul” to shape it.  I recall a story that my father, a talented musician, used to tell, in which he returned to his undergraduate institution in Utah after completing his graduate studies in Texas.  He excitedly explained to his former jazz professor the complex new world of scales, modes, and chord forms that he had learned studying jazz improvisation in grad school.  His old professor simply dismissed all of this, saying, “I don’t think that Charlie Parker was really thinking about scales.”  If you spend any amount of time really listening to Parker’s incredible solos, you will be able to tell that, yes, he WAS thinking of scales.  But this fact does not fit well with our widespread assumption that jazz comes only from the soul.  We don’t want to think about jazz icons intricately working through scales and other cerebral, technical aspects of their craft; we would rather think of them simply blowing from the soul, the heart– possibly with the help of booze, pot, or heroin– and producing pure brilliance.

Now, I may be oversimplifying, but let’s recap a few of the cultural assumptions I’ve gone over: Charlie Parker was good because he had soul; classical musicians are good because they practice a lot.  Michael Jordan was good because he was a “natural”; Larry Bird was good because he thought a lot.  We can see, then, why Tyson winced at the suggestion that he has a “gift” for science, and immediately jumped into emphasizing his good American work ethic.  For a long time, whenever we’ve assigned a talent to African Americans, it’s probably been some sort of abstract, non-cerebral, “from-the-heart” talent that they’re just born with.

We can take this even further, though.  As I said much earlier, part of the myth of American identity, specifically American masculine identity, was the opportunity to work your way above your station.  It didn’t matter who you were; it was what you did that defined your manhood.  It is pretty well established in the sociology of race and slavery that part of maintaining hierarchies in a slaveholding society is to strip slave males of their masculinity.  We took away black males’ rights to protect their families, to pass their names on to their offspring, to hold property, to keep weapons, and to fight other men of higher social status; ALL of these restrictions said not just “you are a slave,” but “you are not a man.”

Though certainly nobody thinks it when they say “he got SOUL!” about this or that musician or athlete, such statements are just another minor echo of the ways in which we diminished the masculinity of black males through slavery and Jim Crow– patronizing statements implying that not only do they not have a work ethic, they don’t need one.  We are essentially saying: you are good NOT because of what you do, but because of who you are.  It removes blacks from the American dream, from one of the foundational definitions of individualism, manhood, and agency.

Play That Funky Music, Satan

I’m going to examine a short section of a well-known song.  This instrumental break was never innovative or influential– it didn’t brashly proclaim a new genre of jazz like Charlie Parker’s double-time alto sax break in “Night in Tunisia,” and it would never lay a sampled foundation for decades of future hip-hop and electronic music like the “Amen Break.”  However, when we consider the story, genre, artist, and cultural milieu behind Charlie Daniels’s “The Devil Went Down to Georgia” (1979), the 16-bar interlude– I’ll call it the “Devil’s Break,” in which Lucifer rips his bow across his strings with the musical equivalent of Sherman’s March to the Sea– deserves some in-depth discussion.

We’ll start with two points regarding the song’s subject matter and style.  First, the subject matter is a variation on a very old folk tale involving a deal with the Devil.  There are many more retellings of this story than we have time to get into here, from Goethe to Stravinsky to Robert Johnson to Zappa; suffice it to say that a main character wagers his soul with the Devil in exchange for knowledge, talent, a gold violin, or just plain titties’n’beer.

Second, there’s the song’s style and genre: it is a fast-paced bluegrass tune, centered around a sixteenth-note fiddle reel that for the most part follows a descending minor lamento chord progression.  It’s incredibly catchy, and the fiddle reel is obviously enough to defeat the Prince of Darkness himself.  But there is something more, something different, in the Devil’s Break.  You see, when Satan

pulled the bow across the strings and it made an evil hiss.
And a band of demons joined in and it sounded something like…

FUNK.

The tempo kicks back to a half-time feel, the key moves from harmonic minor to the Dorian mode, the guitar begins dead-string comping on its upper strings, and a clavinet that sounds straight out of Stevie Wonder’s “Superstition” appears out of nowhere.  The Devil’s Break actually sounds quite similar to a sped-up version of Kool and the Gang’s “Jungle Boogie.”  I’ve taken the liberty of mashing the two up just to show this.
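If you’d rather see that modal shift than take my word for it, here’s a minimal Python sketch that builds each scale from its interval pattern and prints the notes side by side.  The choice of D as tonic is purely for illustration (no claim about the record’s actual key), and the sharps-only note spelling is a simplification:

# Build scales from semitone patterns to show what changes between the
# fiddle reel's harmonic minor and the Devil's Break's Dorian mode.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def scale(tonic, steps):
    start = NOTES.index(tonic)
    return [NOTES[(start + s) % 12] for s in steps]

harmonic_minor = [0, 2, 3, 5, 7, 8, 11]  # flat 6th, raised 7th: the "evil hiss"
dorian         = [0, 2, 3, 5, 7, 9, 10]  # natural 6th, flat 7th: the funk flavor

print("harmonic minor:", scale("D", harmonic_minor))
print("dorian:        ", scale("D", dorian))

Only the 6th and 7th degrees move, but those two notes carry the whole change in mood.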

So, quite simply, the Devil Plays Funk.  Funk loses to Bluegrass.  We need to take this further.

***

For the rest of this essay, I’m going to look at the historical and social background that shows why the funky Devil’s Break is culturally significant– perhaps more so than Daniels ever intended it to be.  Before we go there, however, we need to address the simple, on-the-surface reasons that Daniels threw this little chunk’o’funk onto his bluegrass lawn.  Several months ago, when I initially posted some of these observations and questions on facebook, my brother-in-law, the talented guitarist Jackson Evans, pointed out that Daniels probably just added the funky Devil’s Break because in the late 1970s, funk sold.  This is true.

The entire period from the mid-60s to the late 70s was one of huge social, cultural, racial, and political upheaval in the U.S., and music was not exempt.  This was one of the most fertile times of musical genres mixing, morphing, and synthesizing with one another, for better or for worse.  Rock groups like Queen, Kansas, Led Zeppelin, and Rush combined operatic themes and complex, epic musical forms into what would become “prog rock.”  Jazz musicians like Miles Davis and George Benson moved away from swing, traded their archtop guitars and double basses for solid-body instruments, and began jazz-rock “fusion.”  Rock groups like Chicago, Cold Blood, and Wild Cherry adopted funk motifs and horn sections.  Established bands such as the Rolling Stones dipped their toes into nearly every musical genre of the decade.  Blues begat soul, soul begat funk, funk begat disco, punk punched disco in the face and begat New Wave, and so on and so forth.  We all know this story.

Charlie Daniels was very much in the midst of this unprecedented musical mestizaje.  He certainly realized the paradox of being a product of The South, the most insular and culturally conservative region of the United States, but also owing his fame to a post-hippie mass interest in diverse musical niches.  In other words, he was in a great position to be an intermediary in the emerging culture wars, another product of the 1970s.

Plenty of his songs predating “The Devil Went Down to Georgia” explicitly stated this: 1973’s “Uneasy Rider” brilliantly lightened up the dark story of hippies and rednecks from the movie Easy Rider and poked fun at Southerners’ paranoia about federal incursion by the FBI and George McGovern.  1974’s “Long Haired Country Boy” breached the stoner/drinker divide and called out Southern television preachers for their hypocrisy.  Even Daniels’s use of “Son of a Bitch” at the end of “The Devil” (changed to “Son of a Gun” for radio play) was a significant breach of Old Southern decorum.

Given Daniels’s position as a long-haired Southern Rocker, and the fertile state of musical mixing in the 1970s, it is not surprising that he would have put some funk into “The Devil Went Down to Georgia.”  It was probably just good business, and it works with the song structure.  Honestly, you’ve got to be impressed by it– funk is easy to fuse with rock, blues, or jazz, but with Appalachian bluegrass?  That’s hard to pull off.  The tinny, nasally Scotch-Irish twang of bluegrass is one of the culturally “whitest” genres there is, as we’ll see later on.  Yet Daniels threw some funk into it.

***

But here’s where it gets more complicated.  As a southerner, Daniels can make some jabs at his own culture, but he is also viciously defensive of it when it is attacked by outsiders.  This is not unusual in itself; the tribal nature of most subcultures allows self-mockery but defends against mockery by others.  Witness how many Jewish comedians make fun of their culture, or how many California rock bands dwell on the superficiality of their home state.  For that matter, as a cultural Mormon, I never miss an opportunity to attack that religion, but I will be the first to defend it when an uninformed outsider makes a quip about it.  Given the history and position of the South, and especially of the Appalachian South, it is not surprising that this region takes this internal-mockery-mixed-with-external-defense incredibly seriously.

We see this in other songs by Daniels, which border on nationalism and jingoism.  “The South’s Gonna Do It Again” seems to be a fairly lighthearted celebration of Southern Rockers, until you consider that the title is a play on the neo-confederate mantra “The South Will Rise Again.”  Later tunes such as “This Ain’t No Rag, It’s a Flag” and his cover of “Our God is an Awesome God” firmly place Daniels on the right of the American political spectrum.  Add others’ pieces such as Merle Haggard’s “Okie from Muskogee,” Lynyrd Skynyrd’s “Sweet Home Alabama,” and Loretta Lynn’s “One’s on the Way,” and we can see that while Southern musicians were happy to accept some elements of 1960s counterculture and diversity, they wished to do so on their own terms, rather than with the patronizing help of draft-card-burning druggies, a Canadian like Neil Young, or Gloria Steinem-styled feminists.


This ain’t no rag, it’s a flag!  Well, and a shirt.

This acceptance of outside elements (though not necessarily outside influence), combined with a more rigid defense of individual heritages, is perhaps THE defining feature of cultural history in the 1970s, as Bruce Schulman has shown brilliantly in his history of the decade.  Political and legal advances of the mid-to-late 1960s ensured more equal rights in the eyes of the law for disenfranchised demographics.  Non-white racial minorities such as blacks, Latinos, and American Indians gained the most; women began more tenuous steps toward equal rights that would come to full fruition in the mid-70s; and even LGBT populations began a very long journey toward social and legal acceptance with the 1969 Stonewall riot.

But here’s the caveat: these movements toward de jure equality under the law in no way brought about de facto social integration; we never became a true melting pot, and MLK’s dream of  “All God’s children joining hands” never really took hold.  If anything, individual cultures became MORE insular and guarded of their identities.  Black Power, Red Power, Chicano activism, militant second-wave feminism, Harvey Milk’s San Francisco gay culture, they all said essentially the same thing: “Thanks to the 1960s, we are living under greater acceptance and freedom than ever before.  We’re going to use this freedom to keep to ourselves and strengthen our identities.”  One of the great films to come out of this era, The Godfather, was essentially an analogy for this complex process: after returning from a stint in the U.S. military, the most integrated, melting-pot, homogenized institution in America, Michael Corleone returns to his unique ethnic family identity (which in this case includes decapitating horses and shooting people in the eye).

Ok, so basically we’ve seen how all of this came together in the 1970s and put Daniels in a position to funk up his bluegrass.  But there’s more to this story.  No amount of increased social, cultural, and musical heterogeneity can explain this: Why does the DEVIL play funk?  In response to Lucifer’s throwing down of the funk gauntlet, the song’s hero Johnny pushes the musical genre even further toward idealized American decency.  The key goes to major, and Daniels starts throwing out song-title couplets that can be described as nothing less than “rural heartland” style– “Mama’s little baby loves shortening bread,” “chicken in the breadpan peckin’ out dough.”  In a sense, Johnny is countering the Devil’s musical virtuosity with the music of his “roots,” just as Daniel San, err, I mean, Ralph Macchio responded to the satanic metal shredder Steve Vai by quoting a Paganini etude in the awesomely bad 80s movie Crossroads.

However, I see all of this as undeniably racialized– the Devil is playing black music, and Johnny is playing white music.  In order to see how profound the pairing of funk and the Devil’s Break is, we need to look at how white Anglo culture, and especially Southern culture, has viewed black music over the course of half a millennium.  Time for some historical musicology.

***

Much more so than music forms like soul, blues, or jazz, the funk of the late 1960s and early 1970s was inseparable from Black Power and the aforementioned cultural entrenchment of the era.  One of the earliest funk songs, by James Brown, was the self-explanatory anthem “Say it Loud– I’m Black and I’m Proud” (1968).  “Jungle Boogie” reclaimed the white establishment’s dismissal of African music as primitive “jungle music,” complete with whoops, grunts, and Tarzan yells.  1970s films with black heroes such as Shaft or Superfly had ultra-funky soundtracks by Isaac Hayes or Curtis Mayfield (no hyperlinks needed).  Hip-hop very quickly based its most important samples on funk beats as well, and proudly owned its heavy percussion; as Chuck D of Public Enemy shouted to kick off his historical anthem “Can’t Truss It”: HERE COME THE DRUMS!

The most defining feature of funk may very well have been this beat.  It came indirectly from the traditional big-band swing beat, which, though originally based upon the “bouncy” rhythm of 19th-century marching pieces such as the Battle Hymn of the Republic, was thoroughly identified with jazz and blues by the 1920s.  Because of its association with black musicians like Ellington, Italian crooners like Sinatra, and Jewish composers like Gershwin, many saw the swing beat as music of the “other,” subversive to WASP America (not to mention Nazi Germany) during the interwar period.  However, by the 1960s swing had become thoroughly sanitized and non-threatening, thanks to white crooners like Bobby Darin, Paul Anka, and, worst of all, Pat Boone.

Funk– and the later hip-hop that sampled funk beats– put swing into a higher-gravity, plodding, half-time feel, in which heavy, consistent bass and snare drums undercut the bouncing swing of a drum set’s ride cymbal or hi-hat.  (It’s easier to explain with an example than with words; check out the very obvious switch from a traditional swing beat to a half-time funk/swing beat in Jurassic 5 and Cut Chemist’s “Swingset.”)  As far as I can tell, funk’s “half-timing” of the swing beat started a popular-music trend that rap metal (yuck) would repeat with fast-paced punk in the late 1990s, and that dubstep (double yuck) would repeat with trance in the early 2010s.
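Since a “half-time feel” is easier to see than to describe, here’s a toy Python sketch of the idea on a sixteen-step grid (one bar of 4/4 in sixteenth notes).  It flattens swing’s triplet bounce into straight sixteenths, so treat it as a diagram, not a transcription:

# Contrast a standard backbeat (snare on beats 2 and 4) with a half-time
# feel (snare only on beat 3): same tempo, but the groove feels half as fast.
def show(name, hits, steps=16):
    print(f"{name:17}" + "".join("x" if i in hits else "." for i in range(steps)))

print(" " * 17 + "1...2...3...4...")  # beat numbers for orientation
show("standard snare", {4, 12})      # backbeat on beats 2 and 4
show("half-time snare", {8})         # backbeat lands only on beat 3
show("kick (both)", {0})             # downbeat kick for reference

The number of hits barely changes; what changes is where the weight lands, and that relocation is the plodding, higher-gravity feel.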

Beyond the technical background of making a beat heavier and half-time (I think kids these days call it “dropping the bass”), the funk/hip-hop beat made black music scary for whites again (it’s carnal!).  Here’s where things get really interesting: for centuries, white Christian fears of other cultures’ music have been centered around percussion.

This percussophobia initially had less to do with blacks or Africans, and more to do with the great Other in early modern history, the Ottoman Empire.  From the fall of Constantinople in 1453 to the Battle of Vienna in 1683 and beyond, Muslim Turks represented everything that the Christian mind feared, and for good reason; they were in a position to destroy Christendom, and nearly did on occasion.  It is impossible to overstate how much fear of this Imperial Islam shaped Western European identity two centuries after the Crusades, and on the eve of the Age of Exploration.  Christopher Columbus actually promoted his western route across the Atlantic as a way for European commerce to reach East Asia while avoiding the Ottomans.  The Portuguese also expanded their trade routes while avoiding Anatolia, and then carried out aggressive naval warfare against Muslims in the Indian Ocean.  Spaniards expelled Muslims from Iberia, forced Sephardic Jews to flee to Ottoman Turkey, and then thanked God for allowing them to reach America before Islam had.

England, the farthest-west European power, was even more wary of Ottoman Turks, and saw the empire’s despotism, slavery, and lascivious harems as diametrically opposed to English notions of individual freedom and prudence.  In promoting English colonization and spreading the Black Legend in the late 1500s, Richard Hakluyt the Younger called Spanish conquistadores “most Turkish” for their atrocities to Indians.  In satirically showing the barbarism of American slavery, Ben Franklin took the position of a Muslim.  Mozart, obviously no Englishman, nonetheless knew that the tragedy of a young woman kidnapped by Turkish pirates would be unusually poignant if she was a freedom-loving Englishwoman.  Even the fantasy world of J.R.R. Tolkien built upon this English fear of percussive hordes coming from the southeast– the armies of Mordor included darker-skinned “southrons,” and Orcs loved their drums (in 5/4, according to Peter Jackson).


Armies of Mordor, Take Five!

The Ottoman Empire also pioneered modern military music through its powerful, percussive Janissary Corps.  This trumpet, drum, and cymbal music was intended to psychologically weaken enemies with fear even before the fighting started, and it worked.  Christian Europe feared the drums of war as an instrument of the Other.  Of course, later in the 19th century, emerging European nation-states from Austria to Prussia to Britain would adopt and “Christianize” Ottoman music styles, drums and all, into their own marching bands, but heavy percussion was still seen as militaristically barbaric.  Both early classical and folk music forms avoided the drums of war.  Also by this time, Americans had found another percussive Other to fear, this one from Africa.


The intersection of Afro-American slavery and music is fascinatingly complex.  I have neither the time nor the knowledge to get into the details here, but it is worthwhile to examine briefly how different percussion forms persisted amongst colonial slaves in different parts of the Americas, under different European powers.

Slaves working in the two giant hubs of the sugar industry, Spanish Cuba and Portuguese Brazil, developed some of the world’s most rhythmically complex music styles– the mambo, rhumba, son, cha cha, and samba are some of the best known.  Drawing off the group dynamics of African tribal music, they are polyrhythmic, with multiple intertwined beats varying and coinciding with one another.  Check out the beginning of Tito Puente’s “Master Timbalero” for some polyrhythms that border on overwhelming.  African polyrhythms could also employ call-and-response, as this samba piece by Sergio Mendes powerfully shows.
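For readers who have never clapped a polyrhythm, the simplest case is easy to lay out on paper: three evenly spaced beats against two over the same span of time, coinciding only on the downbeat.  Here’s a toy Python sketch (the real Afro-Cuban layers are far denser than this):

# 3-against-2, the building block of polyrhythm: two pulses dividing the
# same span of time differently, lining up only at the start of the cycle.
span = 6  # least common multiple of 2 and 3
for name, division in [("two-pulse  ", 2), ("three-pulse", 3)]:
    step = span // division
    print(name, "".join("x" if i % step == 0 else "." for i in range(span)))

Stack a few more divisions on top of each other, each on its own drum, and you start to approach what Puente’s rhythm section is doing.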

The music was also inseparable from the hellish phenomenon of the transatlantic slave trade.  Polyrhythms were complex in the Americas precisely because people from hundreds of different African regions and tribes, each with their own percussive styles, were uprooted, stripped of nearly everything but musical memory, and thrown together.  Even more horrifying was the nature of sugar production specifically.  It was grown in disease-ridden swamps, the sharp edges of sugarcane cut slaves’ hands, and its processing into sugar and rum took place in some of America’s first factories– furnaces and mills called engenhos (literally “engines”) by the Portuguese.  Disease and toil ensured that the average slave lasted only seven years in the engenho, and for this reason, slave populations rarely became self-sustaining.  Well into the 19th century, around 90% of all African slaves wound up on sugar plantations.  This also meant that, compared to North America, African rhythms were more readily preserved.  Polyrhythms were born in hell.


It rots your teeth, too.

The dual-definition “clave” (both a rhythm and an instrument) is even more specifically linked to sugar slavery.  The instrument is a pair of wooden sticks hit together to produce a sharp, clicking sound that carries over the cacophony of polyrhythms; it comes in right after the bass in the above-linked Tito Puente sample.  The clave rhythm– usually played by the clave instrument, but sometimes by a cowbell, the bell of a cymbal, or even handclaps– is a constant pattern that repeats throughout a mambo, rhumba, or son tune; its asymmetry is not dissimilar to the Bo Diddley beat, or even a simple swing hi-hat.  All of these simple, yet slightly asymmetrical, rhythms lay a foundation, a background for more beats to layer upon.  They are the “key” to keeping complex rhythms together– in fact, “clave” is the Spanish word for “key.”  Furthermore, as Ned Sublette’s book on Cuban music shows, the original wooden-stick claves were probably made from pegs used in the construction of ships, and would have been very easy for slaves to obtain.  The clave, then, was the key that held together the ships that took Africans to the Americas, the polyrhythms that these slaves created, and the cohesion and identity of hundreds of disparate groups thrown together to work to death in the engenhos.


It’s a pretty hefty responsibility for a couple of sticks.
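For reference, the widely used 3-2 “son” clave pattern can be laid out on a grid of eighth notes across two bars of 4/4: three hits in the first bar, two in the second.  A minimal Python sketch:

# The 3-2 son clave: beat 1, the "and" of beat 2, and beat 4 in bar one,
# then beats 2 and 3 in bar two (0-indexed eighth-note steps below).
SON_CLAVE = {0, 3, 6, 10, 12}

for bar in range(2):
    cells = ["x" if (bar * 8 + i) in SON_CLAVE else "." for i in range(8)]
    print(f"bar {bar + 1}: " + " ".join(cells))

That lopsided 3+2 grouping is the asymmetry mentioned above: once it locks in, every other layer can be heard against it.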

Ok, but what do the complexities of polyrhythms and the brutality of sugar slavery have to do with the Devil playing funk in Georgia?  It turns out that, especially compared to Latin America, the English-speaking colonial world developed its percussophobia very early on, and in the American South.  In his autobiography To Be, or Not… To Bop, while discussing his own forays into Latin/swing fusion, Dizzy Gillespie made the observation that the reason that African-American music never developed the rhythmic complexity of Afro-Latin music was because “the English took our drums away, unlike the Spanish.”


The early years of Latin jazz: Gillespie performing with Cuban percussionist Chano Pozo in the late 1940s.  Pozo’s story did little to dispel percussophobia– he was obsessed with Voodoo, and would later be fatally shot in a drug deal.

In 1739, just south of Charleston, South Carolina, several dozen slaves (many born in Africa) managed to obtain weapons from an armory, killed their masters, and burned their plantations.  They then headed south along the Stono River toward the freedom of Spanish Florida, recruiting more escapees, killing whites in their path, and marching to the sound of drums.  Eyewitnesses wrote of the horror of the sound of those drums.  Unfortunately for the Africans, they never made it to Florida; a colonial militia defeated them, and the survivors were executed.  The next year, the colonial assembly began passing a series of laws called the “Negro Acts” to ensure that such a rebellion would never happen again.  The acts cut down on the number of slaves imported directly from Africa.  They mandated a more rigidly trained colonial militia (this obsession with militias to prevent slave revolts also led to the Second Amendment about a half century later).  And they banned the use of drums by slaves.  While slave revolts certainly continued for the next 100+ years, their frequency in North America was much lower than in Latin America.

But there’s more to the story than just a single event; it’s not as simple as “Southerners were scared of drums in slave revolts, and that’s why the Devil plays funk!”  We’ve already seen that the English were unusually hostile to and fearful of Ottoman Muslims– much more so than the Spanish were– and this may have contributed to their percussophobia well before Stono.  Even a casual listener can tell that English folk music is less percussive than Spanish folk music.

Furthermore, as slave communities matured in the South following the Stono rebellion, the dozens of factors that historian Philip Morgan laid out in his book Slave Counterpoint probably resulted in a gradual simplification of African American music’s rhythms, to the point that they lost most of their polyrhythmic complexity.  Tobacco, indigo, cotton, and even rice production were less lethal than sugar, resulting in more stable, self-sustaining populations with a diminishing memory of African rhythms.  Especially in more northern slave regions such as Virginia, white overseers were a more constant presence than in the malaria-infested Caribbean, and subversive, secret drumming would have been difficult.  Quite simply, in the Anglo-American world (I’m equating England and the United States for the most part regarding views toward slavery and race), African slaves were Anglicized more quickly.

Once again, we can’t get away from contrasting this with the state of percussion and black culture in Latin America.  The militaristic beating of drums was instrumental (no pun intended) in slave revolts in Brazil as well.  In fact, the Brazilian dance/martial art style of Capoeira, which combined fighting with percussion, began in escaped slave communities called Quilombos that were essentially waging war upon Portuguese settlements through the 17th century.  The guy from Sepultura even wrote a song about it.

Slave dance

Here they come, Here they come, Slave Drums.

Yet, despite the link between percussion and rebellion in the tropics, neither Spain nor Portugal ever enacted laws equivalent to South Carolina’s Negro Acts.  As brutal as the work in tropical sugar engenhos was, the Iberian world had more laws in place to serve as a sort of minimal “safety net” for slaves than the Anglo-American world did.  Due to a tradition of a stronger monarchy going back to King Alfonso X, as well as familiarity with the extensive body of Islamic and even Roman slave law, Spain’s medieval legal code, Las Siete Partidas, placed quite a few regulations on how slave owners could treat their slaves under the Spanish Empire.  Christians could not be enslaved, though slaves were expected to convert once enslaved.  Slaves could marry, and their families were protected.  They were entitled to religious holidays, to save their own money, and to purchase their own freedom.  They could even testify in courts of law– a right that the U.S. Supreme Court’s Dred Scott decision would later so blatantly deny.  Although I’ve not yet found any specifics on percussion regulations in the Spanish colonies, it does not make sense that the modern world’s first global bureaucratic empire, with laws such as these, would ever have allowed its colonies to enact something as capricious as South Carolina’s all-out ban on percussion.  The hell of sugar spawned African polyrhythmic communities, and Las Siete Partidas nurtured them.

Compared to the Ibero-Latin world, England had relatively little experience dealing with non-white, non-Christian peoples during the Middle Ages.  They never shared a peninsula with Muslims, they never saw themselves as heirs to the tradition of the Roman Empire, they caricatured Ottomans as almost cartoonish villains, and they moved very quickly away from slavery and toward a light feudalism that lauded the free, property-holding “yeoman farmer.”

Ironically, this resulted in a more hands-off slave policy as mandated from the English crown, and harsher, more reactionary laws on the level of individual colonies.  Las Siete Partidas saw slaves as humans– infidel, subjugated, captured, degraded, but still human.  England, the birthplace of modern capitalism, saw slaves as simply property, as chattel.  As Edmund Morgan has shown in American Slavery, American Freedom, the tradition of individual liberty that England pioneered and America took to extremes included the freedom for a master to do with slaves as he pleased, with little government interference.

This complex comparison shows, I think, why African American slave music in the States was less percussive.  It shows why swing, while certainly providing a platform for potential polyrhythms (its triplet feel effectively layers a double-time 3/4 over a 4/4 or 2/4 time signature), had much more in common with Anglo-American folk rhythms than mambo or samba had with Iberian folk rhythms.  It shows why swing and jazz had their origins in New Orleans, the port city with the most direct ties to the Latin world, and where slaves from the Caribbean would meet and dance in Congo Square.

At the same time, British fiddle reels in Appalachian communities began gestating into the folk bluegrass that would eventually lead to “The Devil Went Down to Georgia.”  Regarding bluegrass, I have probably been stressing the linear continuity of “English” culture into “American” culture too much in this essay, although we’ve seen that percussophobia was definitely continuous.  Despite the stereotype of Appalachia as being isolated, insular, and even xenophobic, I cannot stress enough the heterogeneous origins of bluegrass.  The original inhabitants of the Appalachian frontier, perhaps mislabeled as “Scotch-Irish,” came from a centuries-long background as borderlanders, familiar with English, highland Scottish, and Irish cultures, as David Hackett Fischer has shown.  And bluegrass drew upon even more diversity.  By the time its name was coined in the late 1930s, its instruments included the Spanish guitar, the African banjo, and the Mediterranean mandolin.  Though not as blatantly as other musical forms, it also incorporated elements of black music– gospel, field hollers, rhythm and blues.

What’s also fascinating is that somehow, as America became more connected technologically, artistically, and culturally in the 1920s, and as more people were exposed to more musical forms than ever before, narrowly regional Appalachian mountain music took the title of “real American music.”  Somehow, this twangy music played by rural folks, with its multicultural origins hidden just enough, could give a white America at its racist and nativist zenith a comfortable view into a whitewashed past of American popular culture.  It is significant that Aaron Copland drew heavily on Appalachian motifs and Shaker hymns while trying to forge some sort of true American high musical form, while Ellington, Gershwin, Goodman, and dozens of other jazz composers were creating high art from immeasurably more pluralistic influences, and receiving less acclaim for it.  It is also significant that even today, when we think of the musical genres of “Americana,” or even just simple “folk,” the associations are almost always white.

And of course, mountain folk music and its Country & Western successors remained percussophobic.  Nashville’s Grand Ole Opry banned drums completely from its inception in the 1920s, gradually allowed snare drums with brushes by the 1950s, and did not allow full drum sets until 1973, in part due to the rise in popularity of the outlaw country and southern rock that Charlie Daniels was a part of.  In a famous standoff, cantankerous Texas bandleader Bob Wills, who was fusing country and jazz into Western Swing as early as the 1930s, ignored the Opry’s drum ban, lifted the curtains on his drummer unexpectedly during a performance, and was not asked to return to the venue. Much rockabilly and early rock eschewed the heavy percussion of full drum sets.  Johnny Cash’s band used a minimal brushed snare drum/cymbal combo, letting the upright bass carry the pulsing beat, and modern traditionalists like Junior Brown carry on this approach.  

Indeed, we can even define the earliest rock’n’roll by whites like Carl Perkins and Elvis partly by its use of full drum sets.  The genre’s critics were percussophobes who worried about white kids being subjected to the “savage beats” of bass drums and tom-toms, and thus put on an irreversible path toward promiscuous sex, satanism, and communism.  I’ve seen this in my own experience: last year we held a funeral service for my grandfather in a Mormon chapel.  One of his longstanding wishes was that we have a Dixieland-style musical rendition at his funeral, but when we brought out the instruments for this, we were informed that drums and trumpets (that barbaric Ottoman combination!) were forbidden in LDS chapels because of their “less worshipful sound.”  We channelled the spirit of Bob Wills, and brought in the devil’s instruments anyway.

Voodoo Devil Drums movie poster, 1944

And this brings us back to the 1970s.  We can see how, in one sense, Charlie Daniels was pushing the limits by even playing a bluegrass-styled song with electric guitars and drum backings, let alone bringing in the 16-bar funk break.  But at the same time, we cannot escape the fact that this guy was tying the music of Black Power to the Devil.  The historical background of racially tinged percussophobia becomes even more profound when we take into account the overall state of the nation in the late 1970s.  Daniels, that long-haired social conservative, released this song fifteen years after the Civil Rights Act, as the white South was still reeling from the trauma of forced integration, at the dawn of the Reagan revolution and the rise of the Sunbelt.  In Daniels’s and Johnny’s Georgia, multicultural, percussive funk loses to heartland bluegrass, and the White South most definitely rises again.

Place, Growth, and Pragmatism in “The Book of Mormon” Musical

*WARNING: I am trying to serve intellectual analysis first and foremost in this post, but it will nonetheless be very offensive to active Mormons, Christians, or frog aficionados for that matter.  Offense is the whole point of the work that I am reviewing here.  You’ve been warned!*

One of the things that struck me about Trey Parker and Matt Stone’s South Park, from the moment I first saw it as an awkward 18-year-old recently returned from a summer job in the rural desert of Southern Utah, was the series’ strong sense of regionalism and place.  South Park the town was an isolated Colorado mountain community; something I could relate to as someone who lived in and loved the rural intermountain West.  There were local quirky characters who were undeniably western (redneck hunter Uncle Jimbo comes to mind).  They had a local “Cow Days” festival (just like Richmond, UT has its Black and White Days!).  The town’s insular but tight sense of community was most evident looking down its Main Street: a small bar, bizarre local businesses like “Tom’s Rhinoplasty,” and a single, local police figure, Officer Barbrady.  This cartoon location was not a vaguely Midwestern Anytown, USA, as the Simpsons’ Springfield is, and certainly not one of the stock northeastern or West Coast locations that most series, animated or live, rely upon.  It was WESTERN.  Intermountain, frontier, autonomous, peripheral, isolated.  Take away the fart jokes, and Stegner or Turner would be proud.

Jimbo and Ned

You see, Ned, our sense of place and rugged frontier individualism is key to preserving American Democracy!

But this changed over time from the late 1990s to the 2010s.  Like so many main streets in the rural West, South Park’s became more like the rest of the United States.  Whether as a conscious commentary, or out of Parker’s and Stone’s need for more material, South Park gradually got strip malls, Walmarts, Apple Stores, racial diversity, franchise restaurants, late-model Subarus, film festivals, and 4G coverage.  The single town policeman was replaced by a full force of gritty, urban, vaguely Irish cops.  South Park as an Island Community was no more.

This transition from the quaintly insular toward mainstream integration is not just evident in small western towns like South Park.  It is one of the broad stories of a strongly regional religion a few mountain ranges to South Park’s west: The Church of Jesus Christ of Latter-day Saints (Mormons).  In a process that continues to fascinate and perplex me, an early 19th century millenarian separatist cult has, via a very rough road at times, made the journey to being a more mainstream, center-right “Family Values”-type religion, one that tries to retain relevance (but also distinction) in the globalized 21st century.

Mormonism has been a recurring topic in the satirical art of Stone and Parker: their first musical, Cannibal! The Musical, briefly featured Mormon settlers; their film Orgazmo told the story of a Mormon missionary-turned-pornstar; and the faith appeared occasionally in South Park (heaven is populated by bike-helmet-wearing missionaries obsessed with arts and crafts, and later an entire episode is devoted to a Mormon family that moves to town).  I wondered early on if Parker or Stone had been raised Mormon; they certainly had enough of a grasp of both theological and cultural quirks to have been.  It turns out that, no, neither had been Mormon, but they had grown up with quite a few.

So, it was unsurprising when the duo created their magnum opus, the Broadway musical The Book of Mormon.  It is touring off-Broadway right now– seriously, go see it.  This Tony-winning work has been analyzed and reviewed into the ground by this point, and, along with the candidacy of Mitt Romney in 2012, has been perhaps the most significant pop-cultural force bringing Mormonism to national attention since the Osmonds.  It is brutal in its satire, and makes no bones about the fact that the premises of Mormonism’s unique scripture, The Book of Mormon, are utterly ridiculous.  But at the same time, the musical manages somehow to be polite, even kind, to the religion.

Newsweek cover on Mormons

The LDS Church responded with civility to the musical’s success: it initiated an ad campaign inviting people to visit its website (there are even LDS ads in the musical’s playbill), and devout members politely made comments about how the musical misunderstood their culture.  But that was about the extent of the controversy.  In today’s age of religious extremism, many media commentators noted that, had a musical about the origins of Islam been written, the religious response would have been much less polite, and probably violent.  Of course this is true.  Fewer people noted that, had a musical taken similar lighthearted aim at, say, Southern Evangelical Christians, the religious response would have also been vitriolic and possibly violent.

The broad underlying theme of the whole musical is the problem of Mormonism’s transition (and attempts at transition) from peripheral to integrated, from regional to international, from insular to all-encompassing; just as South Park moved from Main Street to franchises.  So much of this is tied to the fact that Mormonism is not simply a religion.  In Utah, it is a distinct culture.  What happens when a distinct and very provincial culture that is so tied to place (the intermountain West), originated in a specific and recent time (the Second Great Awakening and Jacksonian America), and was based on some VERY bizarre and specific “prophecies” attempts to become universal, say, by sending Utahn missionaries to early 21st century Uganda?  This is the theme of The Book of Mormon musical.  (Note: throughout this essay, I’ll be referring to both the musical and the Mormon scripture as the italicized Book of Mormon, but I will specify when referring to the musical.  Pay attention.)

I’ll start with some minor points.

The musical certainly contains plenty of jabs at Mormon culture, all of which I thought were quite apt.  One of the opening numbers features naïve 19-year-old males at the MTC (Missionary Training Center), accepting their various mission assignments with blissful joy– “Norway?  Home of Gnomes!”  “Uganda?  Like the Lion King!”  With the odd-couple pairing of the two protagonists– standout golden boy Elder Price and overweight, compulsive liar Elder Cunningham– we even get a great look into how the Mormon social hierarchy works.  Price sings about how the pair is going to do great things, but “mostly me!” while Cunningham happily accepts his inferior status as a sidekick.  This hit close to home for me, as I recalled so many church talks on how every member was valuable and important, but that the Lord had ordained natural leaders who were “blessed” with “the spirit” to call the shots just a bit more than the rest.  These natural leaders, coincidentally enough, were usually the tall, the blond, the beardless, the athletic.  They would be troop patrol leaders in Boy Scouts, seminary council members in the church education system, and go on to serve as Bishops and Stake Presidents in their adulthood.  The “Mostly Me” song nailed this subtle hierarchical paradox, in which all adult males hold “the priesthood,” but only the business-like, clean-shaven, Republican, upper-middle-class types move up the ranks of the lay clergy.  It is clear: Price has it, Cunningham does not.  (Mitt Romney also has it, or had it.)

The high point of the musical’s first half was the song and dance number “Turn It Off,” in which the Greek Chorus of missionaries instructs the newcomers Cunningham and Price to simply ignore the bad stuff of sub-Saharan Africa– “turn it off, like a light switch!”  This starts off in reference to a murder by a warlord that the missionaries witnessed, but we all know what the “turn it off” mentality is really directed at (it’s THE GAY!), as various missionaries sing their testimonials of lusty same-sex attraction (or, for that matter, opposite-sex attraction), and then end by saying “I turned it off!”  The irony of seeing a sharply dressed, tightly choreographed troupe of oh-so-gay actors dressed as missionaries sing this was over-the-top awesome, but again, at the root this stuff hit home.  You want to know how deep this “turn it off” mentality goes?  Just read Apostle Mark E. Petersen’s advice to young boys dealing with the evil temptation of masturbation.

Beyond some of these cultural references to provincial sexual repression, the musical most definitely points at some of the most obvious logical and historical problems with Mormonism’s origins and the Book of Mormon scripture.  Again, Parker and Stone have long been very adept at showcasing the ridiculous nature of Mormonism’s origins as perceived by outsiders– see South Park’s treatment of the Martin and Lucy Harris story for a great pre-musical example.  But, in both South Park and in The Book of Mormon, Parker and Stone are less concerned with the detailed logical problems of early Mormonism (no golden plates were ever seen, Joseph Smith gave multiple versions of his “first vision” story, and so on and so forth), and more with the ways in which ridiculous beliefs can help people do very good things.

It is in this theme that the pathological liar, Elder Cunningham, really shines forth.  Upon thinking (or realizing) that the traditional missionary discussions from his Provo training are simply not engaging his Ugandan flock, Cunningham simply starts making stuff up.  He combines Mormon cosmology and canon into a pop culture mashup of Star Wars, The Lord of the Rings,  and frog-fucking (bear with me here), ultimately empowering the Ugandans to join the Church and overthrow their oppressive warlord.

One exchange between Cunningham and a villager who wishes to copulate with a baby to cure his AIDS goes like this:

Uhhh, Behold! The LORD said to the Mormon prophet Joseph Smith, “You shall NOT have sex with that infant!” And lo Joseph said, “Why not, LORD? Huh? Why not?” And the LORD said, “if you lay with that infant, you shall” [makes an explosive sound] burn in the fiery pits of- Mordor!  … A baby cannot cure your illness, Joseph Smith. I shall give unto you a… a FROG.” And thus, Joseph laid with the frog, and his AIDS was no more!

(full transcript of the musical is available here)

In this sense, Cunningham is very much a mirror of Mormonism’s founder, Joseph Smith.  As the genius Smith put together the epic narrative of the Book of Mormon (this time the Mormon scripture, not the musical), elements of his own time and place made their way into the story.  Smith did not draw upon Tolkien or Star Wars, but he did slip references to paranoia about Freemasonry into his story of the “Gadianton Robbers” and their “secret combination” hand signals (see Fawn Brodie’s No Man Knows My History for more details on this).

More broadly, the entire Book of Mormon responded to a pressing cultural need of Americans that other religions had thus far ignored: if America was the greatest, most God-blessed nation on earth, and if the Bible was the literal and comprehensive word of God, why did the Bible not mention America?  Smith responded with perhaps one of the earliest instances of “fan fiction” in America– “You like the Bible?  Then you’ll LOVE the Book of Mormon!  There are biblical Israelites, but in AMERICA!”  Or, in the slightly more restrained words of Smith biographer and Mormon elder Richard Bushman, Smith gave “America scriptural legitimacy.”

So, by this point in my narrative of amphibious copulation and sociopathic lying, all devout Mormon readers are probably thoroughly disgusted.  But here’s the thing: When Cunningham threatens the villagers with eternal damnation in the pits of Mordor, or makes up a story of Boba Fett turning frog-fuckers into frogs themselves, he departs from traditional Mormonism, but he also responds to the concerns and needs of the Ugandans with sensational, epic stories that are not entirely his, but that are packaged by his raw talent and charisma.  When his missionary leaders find out about his shenanigans, they are horrified at this perversion of their faith and harshly chastise him.

Similarly, Smith’s own scriptures and religion– with later tenets such as Adam-God Doctrine, Blood Atonement, and most of all polygamy– were a radical departure from traditional Christianity as well, which nonetheless appealed to many Americans, not least because of Smith’s creativity and charisma.  But, here is the most important point: mainstream Protestant Christians of the early to mid-19th century were just as horrified at what Smith was saying about polygamy as modern Mormons would be at Cunningham preaching about Smith fucking a frog.

What Parker and Stone are saying is, “yes, Smith may have been a sociopath, sexual predator, and a liar, but he created a faith and culture that helped people feel good about themselves.  So what?”  A variation of this statement also occasionally appears in South Park episodes, both in the aforementioned Mormon show, and in a later one that asserts in its conclusion: “Who cares if Jesus ever existed or not?  He has influenced more people than most other REAL historical characters!”  Ultimately, Parker and Stone have come to the same conclusion that William James came to in his pragmatist work The Varieties of Religious Experience: religion is valuable for its psychological benefit to the individual and the community.  Whether it is “true” or not is irrelevant, and not even really that interesting.

Furthermore, Stone and Parker’s particular brand of satirical humor strives to mock those things in life that are difficult to mock; they do not shoot fish in barrels.  They transcend the tired divides of today’s culture wars– secular vs. religious, conservative vs. liberal.  It would be very easy– too easy for Parker and Stone– to take the smug, Richard Dawkins-styled approach of “haha, these people are stupid for believing that Indians are descended from Hebrews who were cursed with red skin for wickedness!”  Instead, Parker and Stone have long taken it a step further, mocked religion, and then turned around and mocked Dawkins (as well as all other smug liberals).  In this sense, they are very much a continuation of the sort of non-partisan, libertarian satire that Frank Zappa began fifty years ago when he attacked the stupidity of both religious conservatives and stoned-out hippies.

Frank Zappa

There is more stupidity than hydrogen in the universe, and it has a longer shelf life.  

–Frank Zappa

But I digress.  There are problems with Parker and Stone’s pragmatist view of religion being valuable for its individual psychological benefit to its members, especially in the case of Mormonism.  From its beginning, and continuing today, Mormonism (and especially Smith) has been unusually focused on the literal and the material, and averse to the symbolic and the metaphorical.  Smith claimed to have translated his scripture from LITERAL plates of gold, rather than simply having it revealed to him as Allah supposedly did to Muhammad.  The Garden of Eden was LITERALLY located in Missouri.  Heaven was not an otherworldly realm or dimension, but LITERALLY another planet.  Salvation was less spiritual and ethereal– your soul was not redeemed after death, but you LITERALLY regained a body of “flesh and bone.”  When you converted to Mormonism, your blood would LITERALLY turn to that of an Israelite.  God had once LITERALLY been a man, and LITERALLY had sex with his plural wives to create little spirit babies, including Jesus and Lucifer, who were LITERAL brothers.  Do we see a pattern here?

But the problem is that once you hinge your beliefs on the literal and the material, rather than the spiritual or the metaphorical, they become vulnerable to scrutiny and to being disproved.  This has been Mormonism’s major challenge as it has moved from 19th century Utah into the 20th and 21st century globalized world.  Most of the above doctrines, while not officially abandoned (you can’t ever fully abandon something that a prophet has said), have been gradually ignored and de-emphasized.  As former church President Gordon B. Hinckley vaguely said in response to an interview question about Mormon men eventually becoming gods, “I don’t know that we teach that anymore.”

But there is one literal “truth” that no good Mormon can publicly deny or even sweep under the rug (though I suspect that many have their private doubts), despite its obvious archaeological, historical, and now genetic weaknesses, apparent to any casual outsider: the literal and unqualified veracity of the Book of Mormon’s story of Israelites living in the Americas from around 600 BCE to 400 CE.  As all devout Mormons state every time they “bear their testimony”– “I KNOW (“know,” not “believe”) that the Book of Mormon is true.”  Literally true.  Similar to fundamentalist Christians who base their faith on the unquestionable truth of the Bible (which, ironically enough, Mormons only believe “as far as it is translated correctly”), part of being a good Mormon is belief in the 100% truth of the entire Book of Mormon.

This is where the true strength of the Book of Mormon musical lies.  It presents “truths” that are obviously ridiculous and impossible to defend, and rolls with them.  At the end of the musical, the Ugandan villagers state, “of course we know that the stories of Joseph Smith frog-fucking are ridiculous!  What, do you think we are morons?”  It is ridiculous to believe that fucking a frog cures AIDS, just as it is ridiculous to believe that white-skinned Israelites populated the New World, or that your family’s eternal salvation rests upon dressing in silly garments and performing variations of Masonic rituals behind the closed doors of temples.

Yet, they have found strength in these “false” stories.  The movement from the literal to the metaphorical is even seen in the villagers’ realization that they may never make it to the Promised Land of Salt Lake City (or, as the Ugandans call it, “Sal Tlay Ka Siti”).  The land of Milk and Honey is eventually seen less as an actual place and more as a state of mind, not unlike Augustine’s much earlier declaration that the City of God dwells in all of us, rather than being a specific place.

In other words, the ridiculous can be used for the greater good, and The Book of Mormon manages in this way to steer clear of attacking Mormonism for the sake of vitriol or smugness.  As Mormonism, which touts itself as one of the world’s “fastest growing religions” (this may be true for the US, but internationally, Mormon growth pales beside Muslim birthrates), continues to try to become global while retaining its regional roots, it is going to have to keep dismissing, ignoring, and de-emphasizing, but not disavowing or condemning, many of its bizarre theological origins (Mormon beliefs about all African blood being cursed, which were finally changed as LDS missions began moving into Brazil, are the most obvious example).  The greatest paradox, however, is that all religions exist to provide templates of human behavior that are seen as timeless, constant, and unchanging, but they can only stay relevant if they change and evolve according to culture and place.

But let’s all agree not to do this, though, ok?