This is a guest post by Navarre Megali, a filmmaking student at UNCSA and North Carolina transplant by way of California, where he enjoys a saturation of craft beer and the use of the word “barbecue” as a noun.
While it isn’t uncommon these days to have “auteur” filmmakers behind big Hollywood franchises, it was practically unheard of in the ’90s, which makes it quite a strange circumstance to find Brian De Palma helming a big, splashy action/adventure vehicle for Tom Cruise in 1996. Both Cruise and De Palma were known names in the industry at that point, but their pairing, both then and now, seems something of an anomaly. Indeed, the crafting of the first Mission: Impossible and its subsequent success seems a happy coalescence of talent and timing.
This is a guest post by Dr. Josh Smicker, a Lecturer in the Department of Communications at the University of North Carolina at Chapel Hill. His work focuses on the intersection of new media technologies and new forms of militarization, and he is currently working on a book exploring the history of discourses of “resilience” in the U.S.
In 1996, the year of Mission: Impossible’s release, the Spanish sociologist Manuel Castells published the first volume of his magnum opus The Information Age: Economy, Society and Culture. It was his attempt to make sense of the sweeping transformations and transitions taking place throughout the ’90s, and specifically to link technological changes to the many geopolitical and ideological reconfigurations of the era. The second volume, 1997’s The Power of Identity, elaborated (at length) a dialectic that is now mostly a commonplace: that the combination of the political, economic, and technological changes taking place throughout the 1990s allowed redefinitions of older identities and the emergence of totally new identity formations. However, these shifting or emergent identities also provoked backlash and resistance, and were therefore paralleled by a resurgence of nationalist and fundamentalist movements. The question of which tendency might prevail is largely left open, with Castells more interested in documenting the ways that the “process of techno-economic globalization shaping our world is being challenged, and will eventually be transformed, from a multiplicity of sources, according to different cultures, histories, and geographies.”
In 2004, in an impossibly ornate room in a castle in Sussex, England, the International Association of Scientologists awarded Tom Cruise its first and only Freedom Medal of Valor.
After praising Cruise for several minutes, Scientology leader David Miscavige cues a video montage celebrating him. In a voice straight out of a movie trailer, a narrator exclaims that “every move” Cruise makes amounts to “countless impressions,” and that he is one of a “rare few in history” with his level of influence, before cutting to clips of Jay Leno, Oprah, Barbara Walters, and Ellen introducing Tom Cruise to their audiences, often as “the biggest movie star in the world.” Lights flash, cuing rapid-fire clips of Cruise walking red carpets, waving to fans, talking to photographers.
Eventually, perhaps inevitably, the Mission: Impossible theme begins. Cruise appears dressed in a black turtleneck and speaks passionately and at length about Scientology while the iconic theme continues to play. We are seeing, as the narrator puts it, “Tom Cruise on Tom Cruise, Scientologist.”
Film soundtracks were at their best in the 1990s. The soundtrack was an important genre in the last pre-download era because it provided space for down-list acts. And in the 1990s, record companies’ down-list acts, at various points, included Nick Cave and the Bad Seeds, Mazzy Star, Björk, Pulp, the Flaming Lips, the Cardigans, and other art rock favorites that couldn’t rely on radio play to move units. (Plus lots of great hip-hop and R&B, but like an idiot I tuned that out until Stankonia opened my eyes.) Headline acts used soundtracks to stretch their muscles, trying out new ideas, performing covers — there must’ve been five hundred versions of Harry Nilsson’s “One (Is the Loneliest Number)” on 1990s OSTs — and using up their album leftovers. By the end of the decade soundtracks were stuffed with terrible nu-metal, rap rock, and Smash Mouth, but for a few weird years they were one of the best ways to discover cool new music.
Twenty years later, we talk Ethan Hunt as the American anti-Bond, the oddness of Brian De Palma helming a blockbuster franchise, and how M:I changed Tom Cruise’s career.
Josh Smicker: I have a few questions/comments I’d like to toss to the group.
So, the IMF is a specialized subsection of the CIA? Is that actually directly stated (rather than strongly implied, as in the NOC list, etc.)? Because I do think implication v. definition matters quite a bit here. If it is, does it stay that way in the future films? My memories of it are more of a Rainbow Six/SHIELD non-governmental group, but my memories of MI movies are pretty non-specific.
After the “botched” first mission, how long does Tom Cruise stay in their HQ? Are we to think that the second IMF group, specifically sent there for a mole-hunt mission, is unaware of where this HQ is, and/or is totally inept at tracking communications from it? More generally, the film sits at such an interesting inflection point in media technologies/infrastructures/cultures (and is about a bunch of incipient digital technologies literally framed by the analog; apparently it was the last major studio release on Beta, too). I found the representations of hacking/the Internet to be super-hilarious even given the context (e.g. typing “max.com” into Usenet to look for Max; the “jam all signals now” command on the train).
Given the themes of the film, especially around identity, I think De Palma makes a lot of sense as the director.
That’s it for now. I’m curious both about response to any of those, and also people’s general reactions upon returning to the film…
In a recent post I suggested that one could argue that the entire increase in per capita income over the past 50 years was pure inflation (and hence that real GDP per capita didn’t rise at all). But one could equally well argue that there has been no inflation over the past 50 years. The official government figures show real GDP per person rising slightly more than 150% since 1964, whereas the PCE deflator is up about 6-fold. …
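The two official figures combine multiplicatively: nominal income growth is real growth times the change in the price level. A back-of-the-envelope sketch (treating “slightly more than 150%” as a growth factor of 2.5):

```python
# Nominal growth decomposes into real growth times the price-level change:
#   nominal_factor = real_factor * deflator_factor
real_factor = 2.5      # real GDP per capita up ~150% since 1964, i.e. a factor of 2.5
deflator_factor = 6.0  # PCE deflator up about 6-fold over the same period
nominal_factor = real_factor * deflator_factor
print(nominal_factor)  # 15.0
```

So the same official data imply that nominal income per person is roughly fifteen times its 1964 level; the whole dispute is over how much of that 15x to call inflation and how much to call real growth.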
Here’s one thought experiment. Get a department store catalog from today, and compare it to a catalog from 1964. (I recently saw Don Boudreaux do something similar at a conference.) Almost any millennial would rather shop out of the modern catalog, even with the same nominal amount of money to spend. Of course that’s just goods; there are also services, which have risen much faster in price. OK, so ask a millennial whether they’d rather live today on $100,000/year, or back in 1964 with the same nominal income. Recall the rotary phones and bulky cameras. The cars that rusted out frequently, and that you couldn’t count on to start on a cold morning. I recall getting cavities filled in 1964, without Novocaine. Not fun. No internet. Crappy TVs, where you had to constantly adjust the rabbit ears on top to get a decent picture. Lame black-and-white sitcoms, with 3 channels to choose from. Shorter life expectancy, even for the affluent. No Thai restaurants, sushi places, or Starbucks. It’s steak and potatoes. Against all that is the fact that someone making $100,000/year in 1964 was pretty rich, so your social standing was much higher than it would be on that income today. So it’s a close call; maybe living standards have risen for people making $100,000/year, maybe not. Zero inflation in the past 50 years may not be right, but it’s a reasonable estimate for a millennial, grounded in utility theory. In which period does $100,000 buy more happiness? We don’t know.
I think if we really don’t know the answer to this question, it’s only because happiness is subjective. To me it’s obvious that a $100,000/year salary is worth more today than it used to be. For one thing, in 1964 tax rates in basically every Western economy were absurdly high, so that $100,000 would really be somewhere from $10,000 to $30,000. George Harrison wasn’t exaggerating; how would you like to live in a country where your best artists and creators were forced into (or simply chose) tax exile?
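The after-tax range quoted above corresponds to average tax rates of roughly 70–90% on a $100,000 salary. A quick illustrative check (the rates here are assumptions chosen to match the stated range, not any country’s actual 1964 tax schedule):

```python
# After-tax income at a given average tax rate (illustrative rates only).
gross = 100_000
for avg_rate in (0.70, 0.80, 0.90):
    net = gross * (1 - avg_rate)
    print(f"{avg_rate:.0%} average rate -> ${net:,.0f} after tax")
```

At a 70% average rate you keep $30,000; at 90%, just $10,000 — which is the $10,000–$30,000 range above.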
But let’s leave that aside for now. In 1964 a $100,000 salary would make you an elite, but your real income would actually be much smaller than that because of all of the 2014 goods you could not purchase at any price. Sumner runs many of them down, but the point is that $100,000 is still enough to live quite well in this country — even in the expensive cities — but the range of choice has exploded, and many of the modern choices now come at very low cost.
Let’s not forget that politics was quite different in 1964 as well: segregation persisted, the Cold War was raging, and even in the U.S. the “elite” were defined as much by their pedigree as income. We weren’t far removed from McCarthy, and were in the midst of a succession of assassinations of American political leaders and overt revolutionary threats in many Western societies. No birth control, no abortion, few rights for women and homosexuals in general. Being an elite in that world would likely feel very uncomfortable, and of course this blog (and essentially all media I consume) wouldn’t exist. So for me 2014 is the obvious choice.
But here’s the catch: would you rather have a net nominal $20k today or in 1964? I would opt for 1964, when you would be quite prosperous and could track the career of Miles Davis and hear the Horowitz comeback concert at Carnegie Hall. (To push along the scale a bit, a nominal $5 in 1964 is clearly worth much more than a nominal $5 today. Back then you might eat the world’s best piece of fish for that much.)
I’m still not sure. $20k/year back then wouldn’t be enough to make you very well off, and the marginal cost of culture consumption today has sunk almost to zero. Was Miles Davis really so much better than anyone working today? For everyone in the world who does not live in NYC, is it better to be able to watch his concerts on YouTube now, and on demand, than not to have seen them at all? Lenny Bruce was still active in 1964 but almost no one ever saw him (for both technological and political reasons). I might still take the $20k today, and I’ve lived on less than that for my entire adult life until last year, so this is an informed choice. But I agree that it’s a much more difficult decision.
It is an interesting question, mostly because it reveals what people value most. It’s a mutation of the “veil of ignorance”. So what would you choose?
After reading hundreds of articles about the trials, appeals, and executions of criminals for my research assistantship, I’ve become depressingly familiar with the tradition of reporting on an individual’s last meal. In the US, most states offer individuals on death row the opportunity to choose their last meal. The details of these requests appear in almost every article covering an execution, sometimes incorporated into the article and surprisingly often as an afterthought: “He was pronounced dead at 12:17 am, following 15 years of appeals and an unwavering assertion of his innocence. In his final words he expressed his love and gratitude to his family. Oh, and his last meal was pecan pie.” I knew this practice existed, and I’d seen it in news coverage before, but reading mentions of last meals back to back to back was different. It made me realize just how weird, contradictory, and depressing the practice is.
Brent Cunningham has a great essay on last meals in which (among many other things) he traces the tradition back to ancient Greece and Rome, specifically to Roman gladiators who were fed lavish meals before their day in the Colosseum. The public obsession with last meals is much more recent, and probably stems from the shift away from public executions in the US, which has left the public with less opportunity to view executions but no less interest in them. And the media are well aware of this interest. CBS News coverage of last meals describes them pretty accurately as “an enduring, if morbid, source of fascination.” The Huffington Post, covering a website dedicated to last meals, describes them as “fascinating yet creepy.”
Blog and crime TV website coverage of last meals trends towards morbid curiosity and frivolity. TruTV’s slideshow features mugshots and, below them, urges viewers to “also check out: hot celebs pretending to eat.” Headline News’ gallery is titled “Gatorade to Lobsters: Serial Killer’s Last Meals” and, more disturbingly, features a smug Nancy Grace staring out from the page banner.
There is also work on last meals that is reverent and striking, including Celia A. Shapiro and Mat Collishaw’s photo essays, featured in Mother Jones and Time respectively. Recognizing the power of the idea (and images) of the last meal, Amnesty International recently commissioned artist James Reynolds to recreate the last meals of men who were later proven innocent. The meals were featured in an anti-death penalty campaign alongside the dates the individuals were executed and presumed or proven innocent.
The Last Last Meal
In 2011, Texas, the state with by far the highest number of executions, ended this tradition following the execution of a man who did not eat any of the enormous meal he had requested (it included over ten items, one of which was a pound of barbecue). Notably, the inmate in question was Lawrence Brewer, a white supremacist sentenced to death for the gruesome, racially motivated murder of James Byrd Jr., a murder which motivated the passage of a Texas hate crime law and the Federal Hate Crimes Prevention Act. Not surprisingly, Brewer’s final act outraged many, including State Senator John Whitmire, who called on the executive director of the Texas prison agency to end the practice of last meals. Within hours, the prison agency’s executive director had terminated the policy, effective immediately. The New York Times spoke to Whitmire about his opposition, which he said had little to do with cost and state budgets:
“He never gave his victim an opportunity for a last meal…Why in the world are you going to treat him like a celebrity two hours before you execute him? It’s wrong to treat a vicious murderer in this fashion. Let him eat the same meal on the chow line as the others.”
Whitmire was right not to worry about cost, since last meals are rarely as extravagant as they seem. In fact, the last meals published are generally what was requested, not what prisoners actually got. In most states there are limitations on what can be provided. In Florida, last meals can cost no more than $40 and all ingredients must be local. California provides last meals costing up to $50, and Oklahoma (the state with the third-most executions) budgets just $15 for last meal provisions. Following the change in Texas policy, Timothy Williams of the Times interviewed Brian D. Price, a former Texas death row chef whose description of his efforts to fulfill last meal wishes is worth quoting in full:
“The Texas Department of Corrections has a policy that no matter what the request, it has to be prepared from items that’s in the prison kitchen commissary. And, like if they requested lobster, they’d get a piece of frozen pollock. Just like they would normally get on a Friday, but what I’d do is wash the breading off, cut it diagonally and dip it in a batter so that it looked something like at Long John Silver’s — something from the free world, something they thought they were getting, but it wasn’t. They quit serving steaks in 1994, so whenever anyone would request a steak, I would do a hamburger steak with brown gravy and grilled onions, you know, stuff like that. The press would get it as they requested it, but I would get their handwritten last meal request about three days ahead of time and I’d take it to my captain and say, “Well, what do you want me to do?” And she’d lay it out for me. I tried to do the best I could with what I had. Amazingly, we did pretty well with what we did have. They are served two hours before they are executed and it is no longer a burger and fries or a bacon, lettuce and tomato sandwich or whatever they requested. All it is, two hours later, is stomach content on an autopsy report.”
As Price’s experience suggests, the tradition of the last meal is often misrepresented and inherently counterintuitive. The “choice” of steak or lobster in reality amounts to a choice of reimagined prison staples. And two hours later, the privilege of a personalized and (we imagine) comforting last meal is “stomach content on an autopsy report.”