Dave Brockie, better known as Oderus Urungus, frontman of the band GWAR, passed away on March 23rd. He was found dead in his Richmond home by a fellow band member. As of this writing, murder and suicide have been ruled out as causes of Brockie’s death, while drugs are still being considered a possibility. Drugs seem likely: they featured prominently in the band’s lyrics (which may not be significant, considering that necrophilia, bestiality, and mass murder were common themes as well) and in Brockie’s autobiographical writing, and according to police, there was evidence of drug use at the scene. While the official autopsy report has yet to be released, it seems probable that Brockie died of a drug overdose.
Much is often made, in the wake of a celebrity’s death, and especially a premature death from drugs or suicide, of what lesson we might learn, of the pressures of fame, the ills of society, and so on. We are asked what we should take away, and also (often as we are being asked for a contribution to a foundation) what the celebrity would have wanted. Of course, on one level it’s irrelevant: the celebrity is dead, and their wishes died with them. Funerals are for the living. I never knew Dave personally, but if you ask me what lesson he’d want us to learn from his death, I’d say, “Not a damned thing.” He’d want us to steal his corpse from the medical examiner’s office and have sex with it.
GWAR was started in the 1980s by a group of art students at Virginia Commonwealth University. Hunter Jackson was a VCU student working on a film called Scumdogs of the Universe (later to be used as the title of GWAR’s second album). Brockie was the singer for a punk band called Death Piggy. Jackson (better known to GWAR fans as Techno Destructo) was using an old warehouse to film his movie; Death Piggy rehearsed in the same warehouse. The two got to know each other, and GWAR was born. (Sort of. As is generally the case, the truth is a lot more complicated, but that’s the short version.)
I haven’t been able to confirm whether or not Dave Brockie was himself enrolled at VCU, but many of the founding members of GWAR were, including Jackson and Chuck Varga (who performs in GWAR as Sexecutioner). In a 1994 interview with Live Wire Magazine, Varga talked about leaving the fine art path to join GWAR: “I went to college, I went the fine art route, and it really turned me off. I was really creative, but at the same time, I wasn’t into fine or commercial art. It seemed like art was really a dead end thing to get into. I was hanging around with Hunter (Jackson, Techno Destructo when he’s around, “a lowly slave” when he’s not) and Dave (Brockie, Oderus Urungus, the vocalist), who were totally crazy, much like myself. They totally reveled in comic books and movies, and I kind of looked at myself and said, ‘I’ve always been into that! I don’t need a bunch of goddamned museum bullshit!’ So I had a rebirth in a way, forgot everything I learned in college, and I started to learn about a totally different science of special effects and props.” (http://spookykids.net/gwar/gwarpage/Unmasked.html)
More than any cautionary tale about drugs and the stereotypes of the rock and roll lifestyle, the lesson to take away from Dave Brockie’s death is to look at his life, and the lives of his bandmates, past and present, living and dead. A nineteen-year-old punk singer from Canada, Brockie met some art students who were tired of trying to make it in what by 1985 they were already seeing as an overly repressive and stagnant art world. Though they would probably have simultaneously shat and vomited at the language, what they did next was a finer piece of interdisciplinary, collaborative, relational aesthetics than most projects to be so called. They presaged the rough aesthetics of Nathalie Djurberg (http://www.lissongallery.com/artists/nathalie-djurberg-hans-berg/gallery) and the wet, sticky grotesque of Gregory Jacobsen (http://gregoryjacobsen.com/). Under the rotted surface, their work contained a subtle and no-one-is-safe political satire, like an X-rated version of Vermont’s Bread and Puppet. And it all started when a punk singer and some art students decided that instead of banging their heads against the ceilings of their respective fields, they’d strap on some big rubber dicks and go for broke.
There is an episode of Star Trek: The Next Generation in which Lt. Cmdr. Data expresses to the rest of the crew his puzzlement at the human fascination with “old things.” The crew was probably trying to save some ancient ruins or encountering a relic from the past (probably a shoutout to the original series, like the wreck of the old Enterprise or something). It is, if you think about it, an odd notion. Why is something made a thousand years ago more interesting than something made yesterday? (With the penchant for clever, punny titles of panel sessions at CAA, if there hasn’t yet been, there will almost certainly eventually be an art history panel called “Lascaux to Last Week,” probably about contemporary cave paintings or appropriating ancient imagery.) [Note: Apparently it’s a book. I thought I’d heard that somewhere. http://www.percontra.net/archive/3lascauxtolastweek.htm]
Art History has had a couple of moments in the spotlight recently. The College Art Association conference just took place in Chicago, and for those in studio art fields who attend, it’s maybe more exposure to art history than we get, unless we actively seek it out, during the rest of the year. (The conference has a history of some animosity between the two disciplines; from what I’ve gathered it was more art history focused in the past, and in recent years studio art has been taking over, affecting everything from the book and trade fair to the location of the conference itself.)
The CAA conference isn’t universally loved, or even respected, by visual artists. My friend and colleague, painter Steve Amos, posted to Facebook: “Beware of the foul smell emanating from the South Loop; the pile of bullshit known as the College Art Association conference is in town.” (Posted February 14th to Facebook: https://www.facebook.com/steveamos/posts/10151952963102919?stream_ref=10.)
I didn’t ask Steve what he meant or why he felt that way, but I’ve heard the sentiment echoed among many of my friends, and may have said something along those lines myself in a moment of frustration. Some of the hate may come from frustration with the job market, and a tendency to treat the conference as synonymous with the Career Services aspect thereof. The Interview Hall and Candidate Center are certainly geared towards job seekers. I know some people who have gotten jobs through interviews at CAA, and others who have gotten interviews. Personally, I’ve never been interviewed at CAA, though their career services have helped me in other ways: almost every job for which I’ve applied was listed on CAA (other listing sites include Higher Ed Jobs, The Chronicle of Higher Education, and Academic Keys), and their mock interviews and packet reviews helped me prepare for the application and interview process for my current position. (Since August of 2013 I’ve been teaching full time at Northern Arizona University.)
Another recent spotlight on art history was the film The Monuments Men, in which some art experts get drafted into WWII to “tell our boys what they can and can’t blow up.” It’s based on a true story (an interview with one of the surviving, original Monuments Men was featured recently on NPR), and a lot of masterpieces in European collections survive today only because of these men. (Others, such as an Italian monastery, were bombed out of supposed military necessity.) My friend and colleague, Chicago artist Renee Prisble, asked on Facebook (via Twitter), “Where were ‘The Monuments Men’ when we invaded Iraq?” (Posted to Facebook January 27th, via Twitter: https://www.facebook.com/reneeprisble/posts/10203102149818529?stream_ref=10.)
It’s a fair question, one that was asked plenty at the time (or, rather, immediately after the looting of the museum), although mostly among the NPR set (myself included). There’s an image, I can still see it, of the facade of the museum sporting a hole created by a round from the cannon of a main battle tank. In this case the Americans clearly caused the damage by invading, even though it was primarily locals who did the looting (as opposed to the WWII example, in which the invading Nazis themselves were the looters).
Two years earlier, just before 9/11, in the summer of 2001, the Taliban had used rockets and explosives to destroy the Bamiyan Buddhas of Afghanistan, a resurgence of the age-old iconoclastic prohibition. Iconoclasm is based on Mosaic law (i.e. the Old Testament generally, and specifically the Ten Commandments), and thus is common to the history of Islam, Christianity, and Judaism, although within each faith sects vary widely in how literally they interpret this. Islamic fundamentalism is among the most vehement, its leaders sometimes issuing death threats against people who depict Mohammed. The Taliban followed in this tradition when they chose to destroy the pair of 6th Century monumental sculptures of the Buddha, carved into a cliff face. (Mosaic law can be interpreted as instructing its followers not to make any representational imagery whatsoever, or more narrowly not to represent prophets and deities; in this case it was extended to destroying ancient monuments made by followers of another religion.)
The tragedy of this destruction is central to answering Data’s question: why was it such a big deal? Merely because the statues were old? Or because they were a symbol of a faith different than that of their destroyers, and we in the West have a live-and-let-live, relativist attitude? I don’t have the answer to this, but certainly our fascination with old things, as well as our respect for other cultures, is central to the role of art history.
It would be disingenuous to treat art history as totally synonymous with preservation. Certainly the conservation, preservation, and repatriation of lost or stolen works are roles that require the assistance of an art historian. But the bread and butter of art history is study and interpretation. I described it in my own prediction of what I’d see at the College Art Association conference: “A bunch of new stuff is going to get queered, painting isn’t dead after all, and there’s going to be a hell of a lot of viewing things through the lenses of other things.”
Art History entered the spotlight on a national level very specifically a few weeks ago, when President Barack Obama, speaking at General Electric’s Waukesha Gas Engines plant, said to the audience that “folks can make a lot more potentially with skilled manufacturing or the trades than they might with an art history degree… Now, nothing wrong with an art history degree — I love art history. So I don’t want to get a bunch of emails from everybody. I’m just saying, you can make a really good living and have a great career without getting a four-year college education, as long as you get the skills and training that you need.” The audience chuckled along, and applauded at the end. But not everybody was amused. While there is no evidence that America’s art history majors are going to start abandoning Obama in droves, he did manage to draw some backlash from the College Art Association’s director Linda Downs, who issued the following statement in response:
The College Art Association has great respect for President Obama’s initiative to provide all qualified students with an education that can lead to gainful employment. We support all measures that he, Congress, State Legislatures, and colleges and universities can do to increase the opportunities for higher education.
However, when these measures are made by cutting back on, denigrating, or eliminating humanities disciplines such as art history, then America’s future generations will be discouraged from taking advantage of the values, critical and decisive thinking, and creative problem solving offered by the humanities. It is worth remembering that many of the nation’s most important innovators, in fields including high technology, business, and even military service, have degrees in the humanities.
Humanities graduates play leading roles in corporations, engineering, international relations, government, and many other fields where skills and creative thinking play a critical role. Let’s not forget that education across a broad spectrum is essential to develop the skills and imagination that will enable future generations to create and take advantage of new jobs and employment opportunities of all sorts. (http://www.mediaite.com/tv/watch-obama-slights-art-history-majors/)
It’s no surprise that the organization defends its own. But Obama’s remarks have some chilling implications far beyond the validity of an art history degree. Would Obama want his own children to go to a trade school to become skilled in a blue collar trade? Or is class segregation acceptable, with one definition of success for some, and another for others? The idea that an education in the humanities is a luxury implies… comedian Louis C.K. said it very well. Talking about Technical High School, he said, “That’s where dreams are narrowed down. We tell our children you can do anything you want, their whole lives. You can do anything. But at this place, we take kids that are like fifteen years old, they’re young, and we tell them, ‘You can do eight things.’”
Maybe in some communities this beats the alternative. Sure, being a welder beats being a drug dealer. (Well… I know some drug dealers who would disagree. Oh, don’t give me that look. That ‘friend’ you buy your weed and coke from is a drug dealer. But I mean, on the street level, it’s pretty high risk.) But it’s totally antithetical to our ideals of hope, ambition, social mobility, and whatever is left of the American Dream, if that was ever really a thing.
John Adams said (according to Fred Shapiro’s The Yale Book of Quotations), “I must study Politicks and War that my sons may have liberty to study Mathematicks and Philosophy. My sons ought to study Mathematicks and Philosophy, Geography, natural History, Naval Architecture, navigation, Commerce, and Agriculture, in order to give their Children a right to study Painting, Poetry, Musick, Architecture, Statuary, Tapestry, and Porcelaine.”
I’ve frequently heard this quotation used to argue, broadly, that times of scarcity or hardship are not the time to study the humanities. The quotation comes from a letter John Adams wrote to his wife Abigail Adams… on May 12, 1780. Over 230 years ago. Do the math. Okay, I’ll help:
John and Abigail had six children over a ten-year span. Three were daughters, of whom one was stillborn and another died before her second birthday. The third daughter lived long enough to give birth to four children, none of whom seem to have accomplished enough to merit a Wikipedia entry. John and Abigail also had three sons. Charles studied law before dying of alcoholism at the age of 30. Thomas also studied law (though apparently without much success), also struggled with alcoholism, and died deeply in debt (after fathering seven children). It’s hard to imagine John and Abigail even being able to claim with a straight face that they didn’t have a favorite child in John Quincy Adams. Instead of math and philosophy, he studied classics and practiced law before going into politics like his father.
John Quincy Adams and his wife Louisa had three sons (and a daughter; daughters were still pretty much treated as footnotes back then). Their first two, George and John, were trainwrecks on the level of their uncles Charles and Thomas, dying (one by suicide) in early adulthood. Their third, also named Charles, did somewhat better, carrying on the family tradition of diplomacy and politics. A fine pursuit, certainly making his father proud, but not the study of “Painting, Poetry, Musick, Architecture, Statuary, Tapestry, and Porcelaine” which the original John Adams had said he envisioned for his own grandchildren. (In turn, Charles Francis Adams, with Abigail Brown Brooks, fathered seven children, none of whom, so far as I could find, turned out to be painters, poets, musicians, or anything of the kind.)
The first John Adams was a soldier so that his children could be scientists and his grandchildren could be artists. But none of them were. They were all diplomats, military officers, lawyers, and politicians. I don’t know who their descendants today are. Google it if you’re curious. But I doubt there are many blue collar workers among them. Wealth is, after all, inherited, unless it’s squandered by some suicidal alcoholic like some of the Adams kids. I wonder, though, whether, twelve generations later, any of John Adams’ great-great-great-great-great-great-great-great-great-great-grandchildren are painters, poets, musicians, architects, sculptors, weavers, or ceramicists. And I wonder what he would say to hear our President essentially tell today’s parents (well, the poor ones) that they shouldn’t share the dream he had for his own descendants.
Warning: In this piece I talk about movies. I’m not sure what it has to do with art. Also, if you haven’t seen the Disney films Brave and Frozen, and you care about knowing what happens in them, you might go watch them before reading this.
Taking a look at American popular culture, originality looks to be on the decline. We live in the age of the remake, the cover, the mashup. Doesn’t a lot of new music sound like shitty covers of old music? (Or perhaps we’re just getting old; does every generation live its whole life thinking music hit its zenith when they themselves were teenagers, a cycle of criticism that repeats itself with each new generation?)
The problem appears most acute in cinema, and I’m not talking here about independent or foreign film, but about mainstream Hollywood. Not film, but movies. “Reboot” has become a household word in an entirely different context than restarting a computer; a series of movies is now a “franchise.” Star Trek and Spider-Man have run through enough sequels that they just started over again at the beginning. Total Recall, Judge Dredd, and now Robocop have been subjected to entirely unnecessary (though in the case of Judge Dredd, interesting; Total Recall not so much) remakes. And even the “new” movies are just combinations of the old: Vampire Academy might as well be titled Twilight Goes To Hogwarts. The Legend of Hercules looks like 300 meets Gladiator, and while that sounds awesome, it’s not. Not at all.
I have been pleasantly surprised, then, to find some original storytelling in an unexpected place: Disney princess movies. I know, I know. I’m as skeptical of Der Maus as the rest of you, and deeply appreciated the humor (with a rich undercurrent of biting satire) in the Charnel House’s recent in-house production of…take a minute to appreciate this title…They Saved Hitler’s Brain…And Put It In Walt Disney. Hilarious play, so perfect. Performance was excellent. And when a company has such a stranglehold on a genre, when fairy tales have become synonymous with the company’s animated version and the originals, compiled from folk legends (mostly German) by the brothers Grimm, almost totally forgotten…Disney is an easy company to hate.
In its princesses, particularly, Disney has a long history of perpetuating harmful stereotypes, and standards of beauty, in its movies (and tie-in merchandise) marketed to young girls. Ariel looks like you could snap her in half at the waist. Jasmine… I’ve never asked a Middle Eastern woman what she thinks of her, but I can imagine it’s similar to how some ethnic Persians responded to seeing their race depicted in 300. Overall, the characters have been overly frail, meek, and utterly dependent on the male characters with whom they were besotted. Romantic love, we are told, is the woman’s… well, the adolescent girl’s, sole reason for existence. (The depictions have generally given us the idea that anyone who isn’t married by seventeen is an old maid.)
I’m making broad generalizations here, and to be sure, there are exceptions. In fact, I make these generalizations specifically to call attention to a couple of these exceptions. While still perhaps imperfect, the last two Disney princesses (that I’ve seen) have been markedly better role models.
The first was Merida, from 2012’s Brave. A female co-director (Brenda Chapman) may have played some role in the film’s treatment of its heroine, whose development included a lot of work on her relationship with her mother. The usual plot, of beautiful (basically skinny) princess meets handsome (muscular with a jaw like the 1998 version of Godzilla: http://www.imdb.com/title/tt0120685/) prince, is totally absent. In fact, while the common trope of an undesired-by-the-princess arranged betrothal is, as is often the case, the starting point of the film, Merida rejects the idea not in favor of a preferable relationship (usually based on superficial attractiveness) but rather to live her own independent life. Of all the Disney princesses, Merida was the first with whom I could really identify: strong, independent, a believable young woman, and with a more realistic body type than the usual sequined Barbie doll… at least until Disney fucked it up by tarting her up like JonBenét Ramsey (http://www.huffingtonpost.com/2013/05/08/merida-brave-makeover_n_3238223.html).
More recently, Frozen (still in theaters as of this writing) took an even more subversive twist on the usual princess-meets-prince story. I’ll warn you again, this plot has some twists and turns, and I’m about to discuss them, so if you haven’t seen it, and would rather not hear what happens, turn back now. While Brave was essentially a mother-daughter story, about a girl who wasn’t ready to settle down yet, Frozen was more of a sister story. And, while the protagonist of Brave wasn’t ready for a relationship, the princess in Frozen (like many young women) was all too eager to settle down.
There are actually two princesses in Frozen: the older, Elsa, who has crazy ice-magic, and the younger, Anna. The movie is essentially a story of the two sisters growing apart, and then the younger sister falling in love, and then everything going to shit. But a few interesting things happen along the way. The first is, when Anna announces that she’s in love, Elsa says what is perhaps the smartest thing any Disney princess has ever said: “You can’t marry someone you just met.” Fucking A. And what’s more, and here’s the spoiler, Elsa’s not just being an unromantic bitch here. She’s absolutely right. The dude, Hans, while apparently quite handsome and charming (the picture of a Disney prince), turns out to be a scheming, murderous prick. Along the way, Anna meets a rough-around-the-edges type, Kristoff, who seems perfectly placed to take Hans’ place as Anna’s beloved. But that’s not quite how it plays out. It’s complicated, but basically the endgame is that the two sisters’ love for each other wins out, and romantic love takes a back seat. I was disappointed, of course, that the movie didn’t end with Hans killing Anna and then Elsa flipping her shit in a Carrie-like rage, impaling everyone present on giant stabby icicles of blood, but then… there’s a reason I don’t write for Disney.
Like Brave, Frozen is ultimately a feel-good kids movie, the kind of nepenthe parents administer to shut the kids up for an hour and a half, but that’s inherent to the medium. As kid-fodder goes, Brave and Frozen are better than most of their predecessors. Is there a greater lesson here, for those of us outside the field of making animated films for children? Hell, I don’t know. But I’ll say this: Frozen gets a hell of a lot better once it’s been run through the creative filter of the Internet, which has already yielded two excellent spinoffs: the movie’s “hit single,” “Let It Go,” performed in a plethora of languages (http://www.youtube.com/watch?v=ALUVJ_tyQ-E), beating Coke’s Super Bowl commercial to the punch, and clips from the film rendered hilarious through the unnecessary censorship of innocuous lines of dialog (http://www.youtube.com/watch?v=q0v7rFSUrGE).
You’ve only got a few more days to catch Artemisia Gentileschi’s Judith Slaying Holofernes, on display through January 9th at the Art Institute (http://www.artic.edu/exhibition/violence-and-virtue-artemisia-gentileschi-s-judith-slaying-holofernes). The painting is not to be missed on its own merits, but its content, coupled with Gentileschi’s biography, also invites a broader discussion of artists who are also women. I’d like to think that this conversation is over, that the playing field is level and we can all just be artists regardless of what we’ve got under our underwear, but reminders to the contrary are all too common: this month marks the one-year anniversary of Georg Baselitz’s unfortunate remark to Spiegel Online that “women don’t paint very well.”
Of course pretty much everyone with a pulse derided Baselitz for his opinion, and Sarah Nardi wrote an excellent piece for the Chicago Reader excoriating him with a side-by-side comparison of his work with that of some female painters (http://www.chicagoreader.com/Bleader/archives/2013/02/05/women-cant-paint-and-neither-can-georg-baselitz). Baselitz is old news by now, but it’s only a matter of time before someone else says something equally stupid in public, and we’ll have to have this conversation all over again. We could save ourselves a lot of trouble if everybody would just go and take a look at Judith, because it’s pretty much impossible to argue with.
One person I would really like to have seen corner Baselitz in front of Gentileschi’s painting is the late Grace Hartigan, painter and director of the Hoffberger School of Painting when I was a graduate student there. Grace was a female painter in the male-dominated Abstract Expressionist scene, and she certainly held her own with the boys. Grace’s relationship with gender was a bit complicated; she once exhibited her work under the name George Hartigan. We asked her about it, but I never quite understood her reasons for doing so.
Hartigan once said something interesting about why, for a long time, she refused to participate in all-woman shows. Her reasoning was essentially that by participating in a show consisting entirely of women, she would have implied an acceptance that she couldn’t compete with her male counterparts. She seemed to have softened her views before her death in 2008; her work was included in an all-female exhibition curated by Leslie King Hammond which I saw in New York sometime between 2005 and 2007. I’ve curated an all-female show myself, and I believe they can have value: for example, when the work has something in common other than the genitalia of its makers. Nevertheless, her argument has stuck in my memory.
While from time to time a group show of female artists can present something drawn from a commonality of experience they share, or a common concern, it should by now be clear that women need no handicap to stand on their own as painters, or artists in any medium, in Chicago or anywhere else. For most of history women have been treated like a “minority,” albeit one comprising 51% of the population (I think John Lennon had something to say about this), but in today’s Chicago art scene women are well represented in just about any role there is to be played.
It doesn’t take any time at all to think of a female Chicago-based critic (Lori Waxman), gallerist (Linda Warren, Rhona Hoffman, Monique Meloche), or, as we are all increasingly becoming, multi-role cultural facilitator (Michelle Grabner, Shannon Stratton, Claire Molek). Female artists, while I’m not going to do the math on what percentage of gallery rosters they form, certainly account for at least half of my favorite artists in Chicago: Lauren Levato-Coyne, Jenny Kendler, and Deb Sokolow do amazing work; Noelle Mason, although she’s living and working in Florida now, cut her teeth in Chicago and still shows here.
If you’ve been to at least a couple of shows in Chicago in the past year, you’ve probably got your own favorite artists in mind, and odds are that more than a few are women. Some artists make work that isn’t particularly gendered; it could as easily have been made by a man as by a woman. In other cases, though, artists draw on their own gender, and the unique experiences that come with it. This is true of male artists as well as female. A recent example was Chicago painter Julia Haw’s “Pussy Power,” from last year. Artemisia Gentileschi’s “Judith Slaying Holofernes” is another piece that draws its power from its creator’s gender. It is impossible to separate Gentileschi’s biography from the image, especially when one compares it with treatments of the same subject by male painters (most notably Caravaggio). Its presence in Chicago is a rare opportunity to see one of the most important and powerful works of the Seventeenth Century, and there is no excuse not to see it. Wind chill temperatures of fifty degrees below zero come close to an excuse, but bundle up and make it out to the Art Institute in the next couple of days before it’s gone.
I should say now that I have never been to Samarkand (in present-day Uzbekistan), and that my views of it have been shaped almost entirely by its mythical role in Clive Barker’s novel Galilee. A quick bit of slacker research, though, suggests that the essential nature of the city matches Barker’s description pretty well. Situated on the Silk Road, Samarkand was a city of wonders, the ultimate crossroads, a center of commerce as well as of art and culture. People came from thousands of miles away to experience the wonders of the city itself, but more so to meet and trade with one another.
It sounds like the perfect sales pitch for globalization. What city wouldn’t want to model itself after old Samarkand? Open to all, a place where one can find anything, from anywhere, yet possessing its own unique character, its glories and wonders its own, Samarkand strikes in our imagination the perfect cross between melting pot and salad bowl.
Did Samarkand itself ever live up to this ideal? This is probably unknowable. The tendency to romanticize history is undeniable, and certainly our own cosmopolitan cities fall short of this utopia. Diversity is assimilated into a global monoculture which is then exported, and we end up feeding our client states the predigested remains of their own children. (Metaphorically speaking. For now.)
This cynical, CrimeThink version is also incomplete, of course. I’ve eaten Chinese food in Berlin and Ethiopian food in Baltimore. The first time I had a Big Mac was in Tokyo. I haven’t researched Taco Bell penetration in Mexico, because I’m afraid of what I’ll find, but I do remember walking past a bar in San Miguel de Allende and hearing a pretty badass cover of a Metallica song, the lyrics sung in Spanish. (I don’t remember which song, but this was 1996, so most of the shitty ones hadn’t been released yet.) It is impossible not to think of William Gibson in such moments, and they have a surreal magic about them.
On the other hand, there is perhaps a danger in the ubiquity of the other. Is it a disincentive to travel, when so much of our destination has been brought to us on a plate? Does, in fact, this single-serving multiculturalism blend the rest of the world into the homogeneously labeled “World Music” aisle of an obsolescent record store? (And reflect: if one goes into a music store in Beijing, does American pop go in the “World Music” aisle? Most likely not, and the reason is the problem. We have exoticized the others, even to themselves.)
Why travel, then, if anyone, anywhere, can buy a didgeridoo, a foo lion, and a Panang curry? “To see the place itself!” some argue, or “To meet the people!” And this is good, so long as it is remembered. So have fun in Miami, but remember, it’s just another art fair, unless you see the Everglades while you’re there.
Art fairs are a sort of microcosm of the Samarkand ideal in its imperfect manifestation, actually. I’ve written about them before, as have many others, but never in the shadow of the tents of the bazaars of Samarkand. Imagine! An art fair that stirred the senses with the sights and sounds and smells of the exotic! What Tony Fitzpatrick described in his play, of the grand market in Istanbul, a thousand guys chasing him down, shouting, “Pashminas!” And one guy shouting, “Tube socks!”
But we don’t get that, at least not at any art fair I’ve been to. (And to be fair, I need to make it to some international ones.) So far, what I’ve seen at American art fairs is pretty much the same roster of blue chip galleries selling to blue chip collectors, damn the locals, who cower in the shadows of the big boys. Exceptions, sure. I’ve seen great, unexpected work at art fairs. And some Chicago dealers have sold to out of town collectors at Art Chicago and at Expo. Local collectors do buy work (I have been on both ends of this transaction as an artist and as a small-time collector), but far too many of them are like the tourists visiting a Moroccan antiquities dealer I saw on Anthony Bourdain recently. “We call them penguins,” he said, waddling comically. “Their hands can’t reach their pockets.”
Homogeneity is the death of art. If a piece is expected, it’s pointless. Someone, I can’t now recall who, said, “If two artists are doing the same thing, one of them is unnecessary.” There is something to this. The old world of the Twentieth Century, the “Age of -isms,” decade-long proclamations of new world orders, each to be replaced by the next like the procession of coups in a string of Third World dictatorships, really ended with Pop Art. By the 1990s, Art History textbooks pointed to the future with a vague reference to pluralism and a prayer that wherever we were headed, Kenny Scharf wasn’t the one leading the way.
Pluralism, though, can become a homogeneity all its own. The art world embraces diversity not like Tamerlane (once the ruler of Samarkand) but like the Borg: “Your biological and technological distinctiveness will be added to our own.” Less the great bazaar, and more a strip mall that has both a Taco Bell and a Panda Express. It is an arms race in which we each struggle to strip-mine our culture and experience faster than our competition, and we find that global monoculture is a cloud with a lining not of silver but of Strontium 90.
So everybody knows the fight is fixed, but what are you going to do about it? Revolution loses its luster once you’ve seen the sweatshops where they make the Guy Fawkes masks. And the obvious counterpoint to globalization, regionalism, has its own obvious failings. Living here in Flagstaff, Arizona, I see proof enough of that every day. Native crafts, particularly jewelry and ceramics, are strong here, but will always have to sit at the kids’ table of “fine craft,” that is, when they aren’t called “outsider art.” Among the non-Natives, imitations of these styles run strong (as, it must be said, do very good and original creations in these traditional craft media). Photography? Sure, as long as it’s of a mountain. And God help you if you can’t sell a painting of a raven in this town.
I’ve been thinking a lot about the Hairy Who, the Monster Roster, and the Chicago Imagists. Chicago, I know, is sick to some extent of their legacy, if only because they dominated the local scene so heavily for so long. But these three related movements did something unique in their time, diverging both from the Modernist, Greenbergian Ab Ex that was the status quo at the beginning, and from the slick, clean Pop Art going on in New York. Chicago had, for a time, its own thing, as rare and exotic as a screeching monkey, an ivory carving, or a previously unheard-of spice. This kind of regional movement with the teeth to hold its own on the global stage could emerge again, anywhere, in any city, any town, and if it did, might provide the kind of true diversity that could make possible a Silk Road of the art world, a bazaar of the unexpected, a new Samarkand.