Walking to Mordor

June 3, 2014

My wife and sometime collaborator Stephanie Burke and I recently completed a 140-mile walk as a performance piece called “Walking to Mordor.”  The walk was based on an Easter egg introduced in Google Maps three years ago:  if you asked it for walking directions from “The Shire” to “Mordor,” instead of the usual “Walking directions are in beta” warning, a pop-up announced, “Caution:  One Does Not Simply Walk Into Mordor.”  The line is Boromir’s, from The Fellowship of the Ring.  Ignoring his naysaying, the two hobbits Sam and Frodo proceed to do exactly that.

The line, as spoken in the 2001 film, spawned an Internet meme consisting of a still image of Boromir, hand in mid-gesture, coupled with a line of text reading, “One does not simply…” followed by whatever the author wished to decry.  Instances date back to at least 2004.  In 2011, Google Maps joined the party by adding the Easter egg to its walking directions.  Along with the warning, however, Google actually did provide a map and directions, from a restaurant called “The Shire,” in Chehalis, Washington, to a tattoo shop called “Mordor Tattoo,” in Arlington, Washington, 138 miles away.
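(For the technically curious, the route is easy to sanity-check.  Below is a rough sketch of my own, not anything from Google’s documentation, using Google’s public Directions API; it assumes you have your own API key, and the origin and destination strings are my guesses at how the geocoder would resolve the two businesses.  The Easter egg itself lived in the Maps web interface, so the API returns only dry JSON, no Boromir.)

    # Rough sketch: ask the Google Maps Directions API for the same
    # walking route.  "YOUR_API_KEY" is a placeholder.
    import requests

    resp = requests.get(
        "https://maps.googleapis.com/maps/api/directions/json",
        params={
            "origin": "The Shire, Chehalis, WA",
            "destination": "Mordor Tattoo, Arlington, WA",
            "mode": "walking",
            "key": "YOUR_API_KEY",
        },
        timeout=10,
    )
    data = resp.json()

    if data.get("status") == "OK":
        # The first route's first leg holds the totals; expect
        # something in the neighborhood of 138 miles.
        leg = data["routes"][0]["legs"][0]
        print(leg["distance"]["text"], "on foot,", leg["duration"]["text"])
    else:
        print("One does not simply query Mordor:", data.get("status"))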

When I showed Stephanie the joke, she mentioned that, coincidentally, she had family in Chehalis, and had spent some time there growing up.  It didn’t take long for us to decide that it would be fun, and funny, to take Google Maps’ directions at face value and walk the route.  Almost immediately thereafter we realized we had to commemorate the journey by getting tattoos at Mordor, and that the tattoos should be of the map of the route.  We documented the project with a series of photographs called “Instagram vs. Holga.”  Stephanie, a trained photographer, shot on the cult-classic crappy medium-format film camera, while I, with no more than a couple of undergraduate photography classes under my belt, used my phone’s camera and the everyman’s favorite app.

As has happened with more than one previous project, we didn’t set out to make art.  More often, our process is that we have an idea for something we’d like to do, and then, almost against our wills, we realize that it is starting to look quite a bit like art.  Or at least like things that other people call art.  And certainly, going for a long walk has quite a history as a form of performance art.  It has spawned books, blogs, and even a society.  Well-known practitioners include Francis Alÿs, Regina José Galindo, and Simon Faithfull.

The history of walking as a form of performance art can never be severed from its history as a form of protest.  Galindo’s 2003 walk from the Congress of Guatemala to the National Palace, her feet dipped in blood to leave red footprints, was intended as a protest against Guatemala’s former dictator, José Efraín Ríos Montt.  Ríos Montt had led a military regime known for widespread human rights abuses, and at the time of Galindo’s performance was running for President in a democratic election.

Not all of those who have walked in protest have identified as artists.  Perhaps the most famous example, internationally, is Gandhi’s Salt March, or Salt Satyagraha.  By directly and pointedly disobeying a British law against domestic salt production in India (a law that forced Indians to buy imported British salt), the march essentially launched the Civil Disobedience Movement, which went on to inspire nonviolent resistance internationally.

Inspired by Gandhi, A. Philip Randolph and Bayard Rustin organized the 1963 March on Washington for Jobs and Freedom.  The march itself covered barely more than a mile, from the Washington Monument to the Lincoln Memorial, though the 250,000 participants (60,000 of them white) had traveled from much farther away by bus, rail, and plane.  Some spent 20 or more hours on buses, traveling as far as 750 miles.  Two years later, voting rights activists marched 54 miles, from Selma, Alabama to the state capitol in Montgomery.  The Selma to Montgomery marches are commemorated by a National Historic Trail.

America’s racial history (obviously still in the making) continues to inspire performance artists.  In 2009 I reviewed Meg Onli’s Underground Railroad project for Art Talk Chicago.  (Five years later, her work holds up better than my early efforts at writing.)  Presented as part of Twelve Galleries Project and curated by Jamilee Polson (who is also this blog’s managing editor), Onli’s project consisted of her retracing, on foot, the route of the Underground Railroad: a 440-mile journey, in Meg’s words, “in search of blackness.”

Exploring another form of blackness entirely, Chicago-based curator Amelia Ishmael co-edits Helvete, a journal of Black Metal theory, whose first issue included David Prescott-Steed’s “Frostbite On My Feet:  Representations of Walking In Black Metal Visual Culture.”  (If you’d like to read the article for yourself, the entire journal is available for free, as a downloadable PDF, at the above link.  A print edition, also available, is well worth the price.)  “Frostbite” tracks a few reference points linking walking with Black Metal culture.  Principally, it finds the common ground between a grueling trek into the Norwegian tundra, led by Gaahl (former Gorgoroth frontman), and the author’s own experience walking the mundane streets of an Australian metropolis while listening to Burzum:

In this case, “blackened walking” is seen to be less about the activity of walking itself and more about the circumstances under which one can move through space—walking not just for the sake of exercise, pleasure, or getting to the shops on time. With the modern world (invested in trains, planes, and automobiles), the slow, simplicity of a walk (Walking? How pedestrian!) seems to have lost some of its value. However, walking is capable of bringing one’s focus back to a fundamental question of what a body physically needs to do in order to transition through, and therefore go on, in the world. Perhaps mourning the forgetting of the existential significance of walking, “blackened walking” pays respects to walking as the chance to explore self-determination and a readiness for the unknown.

We hadn’t initially conceived of the “Walking To Mordor” project in terms of its connection to Black Metal, but as we walked, Prescott-Steed’s phrase “blackened walking” echoed in my mind.  The connection, however ephemeral, clarified itself as I looked over Tolkien’s maps of Middle-earth and researched his languages.  Two of the bands mentioned in “Frostbite” take their names from Tolkien’s writing.  Gorgoroth is an arid plateau in the northwest corner of Mordor, surrounding Mount Doom; the name comes from Sindarin (the Gray Elven tongue) and means “dreadful horror.”  The name of another band, Burzum, means “darkness” in the Black Speech of Mordor.

Far from the tradition of protest marches, whether as performance art or otherwise, “Walking To Mordor” was in some ways a playful exploration of what happens when a joke is taken 138 miles too far.  A linguist became an author.  His book became a movie.  The movie spawned a joke.  The joke became a meme.  The meme became an Easter egg embedded in the principal means by which Americans today navigate their world.  With every breath spitting in the face of Alfred Korzybski, originator of the phrase “the map is not the territory,” most of us today confuse a glance at Google Maps, followed by a drive in the car, with exploration.  We think of distances first in minutes of driving, or hours of flight.  The landmarks we note are gas stations and Starbucks locations.  Google Maps has become the average person’s understanding of the world.  Moreover, our culture is becoming one of remakes and mashups.  References have taken the place of wit:  “that’s clever” has been replaced with “I have heard that before.”  Tolkien has been reduced, in the public imagination, to the origin of nerd-chic Internet memes, and we have tried in our way to be true to his work by dragging a piece of derivative humor, kicking and screaming, into meatspace.

Skin and Bones: Taxidermy as Fine Art

May 6, 2014

Brooke Weston, “Jonas Denver.”

This past weekend was the opening reception for the third biennial taxidermy exhibition at La Luz De Jesus in Los Angeles.  The exhibition included works by Jessica Joslin, Simone Smith, Divya Anantharaman, Emily Binard, Sarina Brewer, Kristin Bunyard, Kevin Clarke, Catherine Coan, Cindy Cronk, Bruce Eichelberger, Ai Honda, Katie Innamorato, Jeremy Johnson, Lauren Kane, Jeffrey R. Kibbe, Dr. Paul Koudounaris, Brian Poor, Emi Slade, Nick Veasey, VegA, and Brooke Weston.

Emi Slade, “Arctic Merfox.”

The works in this exhibition span a variety of approaches.  Jessica Joslin’s constructions bring new personality to animal skulls by adorning them with glass eyes and vintage metal.  The adornments evoke jewelry, ears, wings, etc., and give each skull a new identity.  The titles of the pieces are these new characters’ names:  “Butch,” “Star,” “Annabel.”  The use of vintage metal, as with Chicago’s own Jason Brammer, draws inevitable associations with Steampunk, the subcultural aesthetic William Gibson brilliantly described as “when Goths discovered brown.”  And it is admittedly difficult not to read Butch, with his underbite and spiked helmet, as one of the goblin guards from Labyrinth.  But these superficial associations, besides being inevitably annoying to the artist, are a distraction from the unique characters that are all Joslin’s own.

Jessica Joslin, “Butch.”

The word “charming” comes up a lot in attempting to describe the works throughout the exhibition.  It is nearly universal that, like Joslin, artists working with taxidermy will create characters with vivid personalities.  Some, like Emi Slade, create threatening monsters that evoke the good old days of physical special effects creature features like Jaws or Critters.  But others go in the opposite direction, creating little animal friends with whom you’d be delighted to spend an afternoon.  Simone Smith’s “Dinner Underground” is a perfect example.  Taxidermied moles enjoy a meal of snails in their subterranean parlour.  From the upturned bottles and tilting glasses, it’s hard not to imagine that the little fellows are a few drinks in.  Sometimes art is about big ideas.  Other times, it’s as simple and funny as enjoying the idea of a couple of moles getting shitfaced.

Simone Smith, “Dinner Underground.”

I Hope There’s Drugs In Heaven. (Rest In Peace, Dave Brockie.)

April 7, 2014

Dave Brockie, better known as Oderus Urungus, frontman of the band GWAR, passed away on March 23rd. He was found dead in his Richmond home by a fellow band member.  As of this writing, murder and suicide have been ruled out as causes of Brockie’s death, while drugs are still being considered a possibility. Drugs seem likely.  Drugs featured prominently in the band’s lyrics (which may not be significant, considering that necrophilia, bestiality, and mass murder were common themes as well) and in Brockie’s autobiographical writing.  According to police, there was evidence of drug use at the scene.  While the official autopsy report is yet to be released, it seems probable that Brockie died of a drug overdose.

Much is often made, in the wake of a celebrity’s death, and especially a premature death from drugs or suicide, of what lesson we might learn, of the pressures of fame, the ills of society, and so on.  We are asked what the lesson is, and also (often as we are being asked for a contribution to a foundation) what the celebrity would have wanted.  Of course, on one level it doesn’t matter:  the celebrity is dead, and their wishes are moot.  Funerals are for the living.  I never knew Dave personally, but if you ask me what lesson he’d want us to learn from his death, I’d say, “Not a damned thing.”  He’d want us to steal his corpse from the medical examiner’s office and have sex with it.

GWAR was started in the 1980s by a group of art students at Virginia Commonwealth University.  Hunter Jackson was a VCU student working on a film called Scumdogs of the Universe (later to be used as the title of GWAR’s second album). Brockie was the singer for a punk band called Death Piggy.  Jackson (better known to GWAR fans as Techno Destructo) was using an old warehouse to film his movie; Death Piggy rehearsed in the same warehouse.  The two got to know each other, and GWAR was born.  (Sort of.  As is generally the case, the truth is a lot more complicated, but that’s the short version.)

I haven’t been able to confirm whether or not Dave Brockie was himself enrolled at VCU, but many of the founding members of GWAR were, including Jackson and Chuck Varga (who performs in GWAR as Sexecutioner).  In a 1994 interview with Live Wire Magazine, Varga talked about leaving the fine art path to join GWAR:  “I went to college, I went the fine art route, and it really turned me off.  I was really creative, but at the same time, I wasn’t into fine or commercial art. It seemed like art was really a dead end thing to get into.  I was hanging around with Hunter (Jackson, Techno Destructo when he’s around, “a lowly slave” when he’s not) and Dave (Brockie, Oderus Urungus, the vocalist), who were totally crazy, much like myself. They totally reviled in comic books and movies, and I kind of looked at myself and said, ‘I’ve always been into that! I don’t need a bunch of goddamned museum bullshit!’ So I had a rebirth in a way, forget everything I learned in college, and I started to learn about a totally different science of special effects and props.” (http://spookykids.net/gwar/gwarpage/Unmasked.html)

He could have gone for general. He went for himself instead.

More than any cautionary tale about drugs and the stereotypes of the rock and roll lifestyle, the lesson to take away from Dave Brockie’s death is to look at his life, and the lives of his bandmates, past and present, living and dead.  A nineteen-year-old punk singer from Canada, Brockie met some art students who were tired of trying to make it in what by 1985 they were already seeing as an overly repressive and stagnant art world. Though they would probably have simultaneously shat and vomited at the language, what they did next was a finer piece of interdisciplinary, collaborative, relational aesthetics than most projects to be so called.  They presaged the rough aesthetics of Nathalie Djurberg (http://www.lissongallery.com/artists/nathalie-djurberg-hans-berg/gallery) and the wet, sticky grotesque of Gregory Jacobsen (http://gregoryjacobsen.com/). Under the rotted surface, their work contained a subtle and no-one-is-safe political satire, like an X-rated version of Vermont’s Bread and Puppet.  And it all started when a punk singer and some art students decided that instead of banging their heads against the ceilings in their respective fields, they’d strap on some big rubber dicks and go for broke.

In Which An American President Explains To An Android Why It’s Wrong To Shoot God With A Bazooka

March 3, 2014

There is an episode of Star Trek: The Next Generation in which Lt. Cmdr. Data expresses to the rest of the crew his puzzlement at the human fascination with “old things.” The crew were probably trying to save some ancient ruins or encountering a relic from the past (probably a shoutout to the original series, like the wreck of the old Enterprise or something). It is, if you think about it, an odd notion. Why is something made a thousand years ago more interesting than something made yesterday? (With the penchant for clever, punny titles of panel sessions at CAA, if there hasn’t yet been, there will almost certainly eventually be, an art history panel called “Lascaux to Last Week,” probably about contemporary cave paintings or appropriating ancient imagery.) [Note: Apparently it’s a book. I thought I’d heard that somewhere. http://www.percontra.net/archive/3lascauxtolastweek.htm]

Art History has had a couple of moments in the spotlight recently. The College Art Association conference just took place in Chicago, and for those in studio art fields who attend, it’s maybe more exposure to art history than we get during the rest of the year, unless we actively seek it out. (The conference has a history of some animosity between the two disciplines; from what I’ve gathered it was more art-history-focused in the past, and in recent years studio art has been taking over, affecting everything from the book and trade fair to the location of the conference itself.)

The CAA conference isn’t universally loved, or even respected, by visual artists. My friend and colleague, painter Steve Amos, posted to Facebook: “Beware of the foul smell emanating from the South Loop; the pile of bullshit known as the College Art Association conference is in town.” (Posted February 14th to Facebook: https://www.facebook.com/steveamos/posts/10151952963102919?stream_ref=10.)

I didn’t ask Steve what he meant or why he felt that way, but I’ve heard the sentiment echoed among many of my friends, and may have said something along those lines myself, in a moment of frustration. Some of the hate may come from frustration with the job market, and from treating the conference as synonymous with its Career Services component. The Interview Hall and Candidate Center are certainly geared towards job seekers. I know some people who have gotten jobs through interviews at CAA, and others who have gotten interviews. Personally, I’ve never been interviewed at CAA, though their career services have helped me in other ways: almost every job for which I’ve applied was listed on CAA (other listing sites include Higher Ed Jobs, The Chronicle of Higher Education, and Academic Keys), and their mock interviews and packet reviews helped me prepare for the application and interview process for my current position. (Since August of 2013 I’ve been teaching full time at Northern Arizona University.)

Another recent spotlight on art history was the film The Monuments Men, in which some art experts get drafted into WWII to “tell our boys what they can and can’t blow up.” It was based on a true story (an interview with one of the surviving, original Monuments Men was featured recently on NPR), and a lot of masterpieces in European collections survive today only because of these men. (Others, such as the Italian monastery at Monte Cassino, were bombed out of supposed military necessity.) My friend and colleague, Chicago artist Renee Prisble, asked on Facebook (via Twitter), “Where were ‘The Monuments Men’ when we invaded Iraq?” (Posted to Facebook January 27th, via Twitter: https://www.facebook.com/reneeprisble/posts/10203102149818529?stream_ref=10.)

The Uffizi Gallery in Florence during WWII. Sculptures, including Michelangelo’s David, are encased in brick domes intended to protect them from bomb blasts and fragments.

It’s a fair question, one that was asked plenty at the time (or, rather, immediately after the looting of the museum), although mostly among the NPR set (myself included). There’s an image, I can still see it, of the facade of the museum sporting a hole created by a round from the cannon of a main battle tank. In this case the Americans clearly caused the damage by invading, even though it was primarily locals who did the looting (as opposed to the WWII example, in which invading Nazis themselves were the looters).

Two years earlier, just before 9/11, in the spring of 2001, the Taliban had used rockets and explosives to destroy the Bamiyan Buddhas of Afghanistan, a resurgence of the age-old iconoclastic prohibition. Iconoclasm is based on Mosaic law (i.e. the Old Testament generally, and specifically the Ten Commandments), and thus is common to the history of Islam, Christianity, and Judaism, although within each faith sects vary widely in how literally they interpret this. Islamic Fundamentalism is among the most vehement, its leaders sometimes issuing death threats against people who depict Mohammed. The Taliban followed in this tradition when they chose to destroy the pair of 6th-century monumental sculptures of the Buddha, carved into a cliff face. (Mosaic law can be interpreted as instructing its followers not to make any representational imagery whatsoever, or more narrowly not to represent prophets and deities; in this case it was extended to destroying ancient monuments made by followers of another religion.)

The tragedy of this destruction is central to answering Data’s question: why was it such a big deal? Merely because the statues were old? Or because they were a symbol of a faith different than that of their destroyers, and we in the West have a live-and-let-live, relativist attitude? I don’t have the answer to this, but certainly our fascination with old things, as well as our respect for other cultures, is central to the role of art history.

It would be disingenuous to treat art history as totally synonymous with preservation. Certainly conservation, preservation, and repatriation of lost or stolen works is a role that requires the assistance of an art historian. But the bread and butter of art history is study and interpretation. I described it in my own prediction for what I’d see at the College Art Association conference: “A bunch of new stuff is going to get queered, painting isn’t dead after all, and there’s going to be a hell of a lot of viewing things through the lenses of other things.”

Art History entered the spotlight on a national level very specifically a few weeks ago, when President Barack Obama, speaking at General Electric’s Waukesha Gas Engines plant, said to the audience that “folks can make a lot more potentially with skilled manufacturing or the trades than they might with an art history degree…Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying, you can make a really good living and have a great career without getting a four-year college education, as long as you get the skills and training that you need.” The audience chuckled along, and applauded at the end. But not everybody was amused. While there is no evidence that America’s art history majors are going to start abandoning Obama in droves, he did manage to draw some backlash from the College Art Association’s director, Linda Downs, who issued the following statement in response:

The College Art Association has great respect for President Obama’s initiative to provide all qualified students with an education that can lead to gainful employment. We support all measures that he, Congress, State Legislatures, and colleges and universities can do to increase the opportunities for higher education.

However, when these measures are made by cutting back on, denigrating, or eliminating humanities disciplines such as art history, then America’s future generations will be discouraged from taking advantage of the values, critical and decisive thinking, and creative problem solving offered by the humanities. It is worth remembering that many of the nation’s most important innovators, in fields including high technology, business, and even military service, have degrees in the humanities.

Humanities graduates play leading roles in corporations, engineering, international relations, government, and many other fields where skills and creative thinking play a critical role. Let’s not forget that education across a broad spectrum is essential to develop the skills and imagination that will enable future generations to create and take advantage of new jobs and employment opportunities of all sorts. (http://www.mediaite.com/tv/watch-obama-slights-art-history-majors/)

It’s no surprise that the organization defends its own. But Obama’s remarks have some chilling implications far beyond the validity of an art history degree. Would Obama want his own children to go to a trade school to become skilled in a blue collar trade? Or is class segregation acceptable, with one definition of success for some, and another for others? The idea that an education in the humanities is a luxury implies…comedian Louis C.K. said it very well. Talking about Technical High School, he said, “That’s where dreams are narrowed down. We tell our children you can do anything you want, their whole lives. You can do anything. But at this place, we take kids that are like fifteen years old, they’re young, and we tell them, ‘You can do eight things.’”

Maybe in some communities this beats the alternative. Sure, being a welder beats being a drug dealer. (Well…I know some drug dealers who would disagree. Oh, don’t give me that look. That ‘friend’ you buy your weed and coke from is a drug dealer. But I mean, on the street level, it’s pretty high risk.) But it’s totally antithetical to our ideals of hope, ambition, social mobility, and whatever is left of the American Dream, if that was ever really a thing.

John Adams said (according to Fred Shapiro’s The Yale Book of Quotations), “I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce, and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry, and Porcelaine.”

I’ve frequently heard this quotation used to argue, broadly, that times of scarcity or hardship are not the time to study the humanities. The quotation comes from a letter John Adams wrote to his wife Abigail Adams…on May 12, 1780. Over 230 years ago. Do the math. Okay, I’ll help:

John and Abigail had six children over a ten-year span. Three were daughters, of whom one was stillborn and another died before her second birthday. The third lived long enough to give birth to four children, none of whom seem to have accomplished enough to merit a Wikipedia entry. John and Abigail also had three sons. Charles studied law before dying of alcoholism at the age of 30. Thomas likewise studied law (though apparently without much success), likewise struggled with alcoholism, and died deeply in debt (after fathering seven children). It’s hard to imagine John and Abigail even being able to claim with a straight face that they didn’t have a favorite child in John Quincy Adams. Instead of math and philosophy, he studied classics and practiced law before going into politics like his father.

John Quincy Adams and his wife Louisa had three sons (and a daughter; daughters were still pretty much treated as footnotes back then). Their first two, George and John, were trainwrecks on the level of their uncles Charles and Thomas, dying (one of suicide) in early adulthood. Their third, also named Charles, did somewhat better, carrying on the family tradition of diplomacy and politics. A fine pursuit, certainly making his father proud, but not the study of “Painting, Poetry, Musick, Architecture, Statuary, Tapestry, and Porcelaine” which the original John Adams had said he envisioned for his own grandchildren. (In turn, Charles Francis Adams, with Abigail Brown Brooks, fathered seven children, none of whom, so far as I could find, turned out to be painters, poets, musicians, or anything of the kind.)

The first John Adams was a soldier so that his children could be scientists and his grandchildren could be artists. But none of them were. They were all diplomats, military officers, lawyers, and politicians. I don’t know who their descendants today are. Google it if you’re curious. But I doubt there are many blue collar workers among them. Wealth is, after all, inherited, unless it’s squandered by some suicidal alcoholic like some of the Adams kids. I wonder, though, whether, twelve generations later, any of John Adams’ great-great-great-great-great-great-great-great-great-great-grandchildren are painters, poets, musicians, architects, sculptors, weavers, or ceramicists. And I wonder what he would say to hear our President essentially tell today’s parents (well, the poor ones) that they shouldn’t share the dream he had for his own descendants.

Disney’s Princesses Have Grown Up…A Little

February 3, 2014

Warning: In this piece I talk about movies. I’m not sure what it has to do with art. Also, if you haven’t seen the Disney films Brave and Frozen, and you care about knowing what happens in them, you might go watch them before reading this.

A look at American popular culture suggests that originality is on the decline. We live in the age of the remake, the cover, the mashup. Doesn’t a lot of new music sound like shitty covers of old music? (Or perhaps we’re just getting old; does every generation live its whole life thinking music hit its zenith when they themselves were teenagers?)

The problem appears most acute in cinema, and I’m not talking here about independent or foreign film, but about mainstream Hollywood. Not film, but movies. “Reboot” has become a household word in an entirely different context than restarting a computer; a series of movies is now a “franchise.” Star Trek and Spider-Man have run through enough sequels that they just started over again at the beginning. Total Recall, Judge Dredd, and now RoboCop have been subjected to entirely unnecessary (though in the case of Judge Dredd, interesting; Total Recall not so much) remakes. And even the “new” movies are just combinations of the old: Vampire Academy might as well be titled Twilight Goes To Hogwarts. The Legend of Hercules looks like 300 meets Gladiator, and while that sounds awesome, it’s not. Not at all.

I have been pleasantly surprised, then, to find some original storytelling in an unexpected place: Disney princess movies. I know, I know. I’m as skeptical of Der Maus as the rest of you, and deeply appreciated the humor (with a rich undercurrent of biting satire) in the Charnel House’s recent in-house production of…take a minute to appreciate this title…They Saved Hitler’s Brain…And Put It In Walt Disney. Hilarious play, so perfect. Performance was excellent. And when a company has such a stranglehold on a genre, when fairy tales have become synonymous with the company’s animated version and the originals, compiled from folk legends (mostly German) by the brothers Grimm, almost totally forgotten…Disney is an easy company to hate.

In its princesses, particularly, Disney has a long history of perpetuating harmful stereotypes, and standards of beauty, in its movies (and tie-in merchandise) marketed to young girls. Ariel looks like you could snap her in half at the waist. Jasmine…I’ve never asked a Middle Eastern woman what she thinks of her, but I can imagine it’s similar to how some ethnic Persians responded to seeing their race depicted in 300. Overall, the characters have been overly frail, meek, and utterly dependent on the male characters with whom they were besotted. Romantic love, we are told, is the woman’s…well, the adolescent girl’s…sole reason for existence. (The depictions have generally given us the idea that anyone who isn’t married by seventeen is an old maid.)

I’m making broad generalizations here, and to be sure, there are exceptions. In fact, I make these generalizations specifically to call attention to a couple of these exceptions. While still perhaps imperfect, the last two Disney princesses (that I’ve seen) have been markedly better role models.

The first was Merida, from 2012’s Brave. A female co-director (Brenda Chapman) may have played some role in the film’s treatment of its heroine, whose development included a lot of work on her relationship with her mother. The usual plot, of beautiful (basically skinny) princess meets handsome (muscular with a jaw like the 1998 version of Godzilla: http://www.imdb.com/title/tt0120685/) prince, is totally absent. In fact, while the common trope of an undesired-by-the-princess arranged betrothal is, as is often the case, the starting point of the film, Merida rejects the idea not in favor of a preferable relationship (usually based on superficial attractiveness) but rather to live her own independent life. Of all the Disney princesses, Merida was the first with whom I could really identify: strong, independent, a believable young woman, and with a more realistic body type than the usual sequined Barbie doll…at least until Disney fucked it up by tarting her up like JonBenét Ramsey (http://www.huffingtonpost.com/2013/05/08/merida-brave-makeover_n_3238223.html).

More recently, Frozen (still in theaters as of this writing) put an even more subversive twist on the usual princess-meets-prince story. I’ll warn you again: this plot has some twists and turns, and I’m about to discuss them, so if you haven’t seen it, and would rather not hear what happens, turn back now. While Brave was essentially a mother-daughter story, about a girl who wasn’t ready to settle down yet, Frozen was more of a sister story. And, while the protagonist of Brave wasn’t ready for a relationship, the princess in Frozen (like many young women) was all too eager to settle down.

There are actually two princesses in Frozen: the older, Elsa, who has crazy ice-magic, and the younger, Anna. The movie is essentially a story of the two sisters growing apart, and then the younger sister falling in love, and then everything going to shit. But a few interesting things happen along the way. The first is, when Anna announces that she’s in love, Elsa says what is perhaps the smartest thing any Disney princess has ever said: “You can’t marry someone you just met.” Fucking A. And what’s more, and here’s the spoiler, Elsa’s not just being an unromantic bitch here. She’s absolutely right. The dude, Hans, while apparently quite handsome and charming (the picture of a Disney prince), turns out to be a scheming, murderous prick. Along the way, Anna meets a rough-around-the-edges type, Kristoff, who seems perfectly placed to take Hans’ place as Anna’s beloved. But that’s not quite how it plays out. It’s complicated, but basically the endgame is that the two sisters’ love for each other wins out, and romantic love takes a back seat. I was disappointed, of course, that the movie didn’t end with Hans killing Anna and then Elsa flipping her shit in a Carrie-like rage, impaling everyone present on giant stabby icicles of blood, but then…there’s a reason I don’t write for Disney.

Like Brave, Frozen is ultimately a feel-good kids’ movie, the kind of nepenthe parents administer to shut the kids up for an hour and a half, but that’s inherent to the medium. As kid-fodder goes, Brave and Frozen are better than most of their predecessors. Is there a greater lesson here, for those of us outside the field of making animated films for children? Hell, I don’t know. But I’ll say this: Frozen gets a hell of a lot better once it’s been run through the creative filter of the Internet, which has already yielded two excellent spinoffs: the movie’s “hit single,” Let It Go, performed in a plethora of languages (http://www.youtube.com/watch?v=ALUVJ_tyQ-E), beating Coke’s Super Bowl commercial to the punch, and clips from the film rendered hilarious through the unnecessary censorship of innocuous lines of dialog (http://www.youtube.com/watch?v=q0v7rFSUrGE).