Happy ever after

“The miracle of the cinema is how rarely the convention of the happy ending is broken. The bigger miracle is that the convention of the ending is never broken at all.” – Zadie Smith

I’m not sure where writer Zadie Smith (The Autograph Man) is going with the second part of her assertion. Surely it’s not all that surprising that most films we see tend to conclude rather than simply stop. Finish rather than break off mid-sentence, or mid-scene.

But she is certainly right in her contention that the majority of the films produced, particularly studio films, end happily ever after.

There’s even a name for it: a Hollywood ending. As opposed to, say, a Bollywood ending, where cast members are inclined to break out into a final elaborate song-and-dance sequence, or an English “kitchen sink” drama finish, where the camera might pan from a grim housing estate to an ash-grey sky.

In a Hollywood ending, boy and girl from opposite sides of the tracks, who might not even have liked each other at first, end up together, in love. The racecar driver and the fetching brain surgeon (Days of Thunder, Tony Scott, 1990). The baker and the accountant (Moonstruck, Norman Jewison, 1987). The kickboxer and the class brain (Say Anything, Cameron Crowe, 1989).

In Hollywood endings, differences are straightened out, crimes solved, villains (psychopaths, aliens, etc.) apprehended or eliminated.

Wrongs are righted, epiphanies experienced, virginity lost, championships won.

Sometimes there are casualties en route to the happy denouement. Sacrifices are made for the greater good. For mankind.

If, for instance, as in Armageddon (Michael Bay, 1998), an asteroid is hurtling towards Earth and something goes wrong (something always goes wrong) when a crack team of drillers ventures forth to obliterate it, then someone – Harry Stamper (Bruce Willis) perhaps – has to stay behind to set off the bomb by hand.

If anything, here is the Hollywood ending writ large. Tears, laughter and joy (sob) tempered – nay, enriched – by loss. Cue Aerosmith with a big corporate power ballad and our heroes returning to the waiting arms of their loving ladies.

Change the spaceship to a special vehicle made from Unobtainium and the asteroid to puzzling geophysical catastrophes and you’ve got The Core (Jon Amiel, 2003).

In Hollywood scriptwriting manuals, a killer ending features as a key component.

A spectacular film can fail the audience in the last 10 minutes. Yet a decent conclusion – not easy to write and even more difficult to film – can make viewers forgive a lot in an otherwise so-so film.

In Adaptation (Spike Jonze, 2002) Charlie Kaufman (Nic Cage) seeks advice from screenwriting guru Robert McKee (Brian Cox) when he experiences enormous difficulty turning Susan Orlean’s work of non-fiction, The Orchid Thief, into a workable script.

The advice?

“Wow them with the ending, and make your characters change.”

This is what Kaufman, who has written himself into the script, proceeds to do, providing a final implausible and incongruous reel that includes sex, drugs, guns and alligators. It’s the ultimate spoof on silly, inappropriate Hollywood endings.

Is it a miracle, though, that the practice of happy endings has become a standard part of the mainstream movie formula?

The top brass of the studios now preside over companies that are part of massive multinational, multimedia, multi-industry conglomerates. These studio heads are answerable not only to a board, but also to a raft of shareholders.

It’s little wonder that in a sphere that straddles the domains of both art and commerce, it’s the latter that more frequently influences the manner in which films are made.

The bottom line is the bottom line in the film business. That’s why risks are so seldom taken when vast budgets are at stake. It’s why so many films adhere to a cookie-cutter formula, including the fairy-tale ending.

There are, of course, some exceptions to the happy ending rule. Made-for-TV “disease of the week” films are generally not likely to end well.

Historical dramas such as Braveheart (Mel Gibson, 1995) or Malcolm X (Spike Lee, 1992) usually play out with a certain fidelity to the facts as they are popularly known.

And the middle films of trilogies are known for being a touch darker than the other instalments.

Consider, for instance, that in both The Empire Strikes Back (Irvin Kershner, 1980) and The Matrix Reloaded (Larry [now Lana] and Andy [now Lilly] Wachowski, 2003) our heroes are out for the count, and the bad guys seemingly on top, poised for victory.

Lately I’ve been returning to a few films made in what many consider to be the high point of creative studio filmmaking, the 1970s. It’s interesting that films such as Chinatown (Roman Polanski, 1974), The Deer Hunter (Michael Cimino, 1978), Taxi Driver (Martin Scorsese, 1976) and One Flew Over the Cuckoo’s Nest (Milos Forman, 1975) are so different from most contemporary films.

The pacing is slower, scenes are longer, the editing is less intrusive, characters are more fleshed out, stories are more complex. Also, the soundtracks are more evocative, and none of the four have what could be construed as standard plots, or indeed endings.

Why?

Some critics have suggested that America’s involvement in the Vietnam War and the exposure of government shenanigans in the Watergate scandal made audiences cynical about previously respected institutions. Filmgoers were open to art that explored themes of corruption, betrayal and power.

Yet conflicts are taking place now as much as ever, and most of the films that show in multiplexes are pretty much made to standard specs.

Perhaps, in the intervening 40 years, improper, dastardly or deceitful government conduct has simply become standard operating procedure.

Maybe now audiences predominantly seek escapism in the cinema, not engagement.

You could argue that there are a few unhappy endings in more modern Hollywood films such as The Hours (Stephen Daldry, 2002) or Far From Heaven (Todd Haynes, 2002). While critically acclaimed, neither was very popular with mass audiences.

Yet even a popular film from the 1970s such as Rocky (John Avildsen, 1976) does not depict its punchy eponymous hero winning the bout and holding the title belt aloft, as it almost certainly would if made today.

Could there be something fundamental and inherent within us that needs happy endings in our texts, to fuel our optimism? To keep us going?

I read recently that there are certain things hardwired into us as humans to ensure the survival of the race. For instance, and this could well be apocryphal, women’s brains are wired so that their pupils dilate involuntarily at the sight of babies.

I asked my friend Clare O’Farrell, a lecturer in Cultural Studies at the Queensland University of Technology, whether she thought the concept of the happy ending was something influenced by cultural factors.

Clare wrote back and said there’s a whole discussion in France about the Hollywood happy ending.

In fact, the French term for a happy ending is “la happy end” – a borrowing straight from English – which indicates how alien the concept is to the culture.

“In French film of the 30s, 40s and 50s the convention was a tragic end,” Clare explains. “And in some films I have seen this is just as arbitrary as the Hollywood happy end. A film I really like with Erich von Stroheim, called Macao Gaming Hell, has a warplane bomb the hero and heroine out of the water in their boat just as they have discovered their love for each other. That’s it – the end!

“The concept of the happy end is definitely both a cultural and historical thing. I’ve noticed the French idea of the perfect romance is when people die so that the perfection of the short moment can be preserved – such intensity cannot survive the attrition of the everyday.

“The Hollywood end is marriage and happily ever after, which in my view reflects the USA’s history of pioneers and the utter reliance on socially isolated family groups. It’s the pioneer idea of paradise – the eternal middle-class family with the white picket fence who take up arms to kill anybody who poses a threat to the family.

“There is also the American ideology of ‘making it’, always winning, the land of opportunity and success, and leaving behind the old unhappy decadent European ways.”

Clare also pointed me to an article by Edgar Morin called “La Happy End”, which unfortunately was in French.

Basically, Clare explained, Morin argues that the all-pervasive happy end of contemporary times breaks with an age-old tradition that dates back to Greek tragedy and takes in Elizabethan drama and the novels of Balzac, Stendhal and Zola.

Another European, Roman Polanski, argued with producer Robert Evans over the ending of Chinatown. Evans wanted a happy ending, while Polanski insisted that this would rob the film of all its meaning. Polanski prevailed.

Perhaps film conclusions in the 1970s were affected by the influence of the European narrative traditions and ways of seeing as much as what was happening in the geo-political sphere.

Several of the defining studio films of the 70s, like Chinatown or One Flew Over the Cuckoo’s Nest, were directed by Europeans.

Others were helmed by the first wave of film school graduates such as Francis Ford Coppola and Martin Scorsese, who were greatly influenced by European films of the 50s and 60s.

The films of the French New Wave, like their progenitors from pre-war times, often concluded in an ambiguous or decidedly downbeat manner.

By the end of the 1970s, however, the trend for films that sent audiences out of the cinema melancholy, or even contemplative, had all but evaporated.

As the 1980s progressed, the decade became increasingly associated with both materialism and American cultural imperialism.

Having put the disgrace of Nixon well behind them and elected the gung-ho Ronald Reagan to office, Americans considered it poor form, for the most part, to be disenchanted with such a kick-ass president.

As a young B-grade studio film actor, Reagan had helped institutionalise a particular form of Hollywood end: the classic weepy.

He’d starred in Knute Rockne All American (Lloyd Bacon, 1940) as George Gipp, a champion sportsman who, knowing he was not long for this world, exhorted his teammates to “win just one for the Gipper” when their backs were against the wall.

With Reagan in the White House at the height of the Cold War, it was Arnold Schwarzenegger, Bruce Willis and Sylvester Stallone who were blowing or clubbing celluloid commies and/or nefarious types away at an astonishing rate.

Luke Skywalker et al were meanwhile dispatching Imperial scum.

In a variety of guises Harrison Ford took care of business. A range of humorously mismatched (young/old, black/white, dog/man) teams solved crimes and dealt summarily with wrongdoers.

Thus a template was born, and nearly without exception it involved a “happy” ending.

These successful films spawned franchises.

Increasingly, mainstream Hollywood fare became less bohemian, less experimental and, with ever more sophisticated special effects, costlier.

Marketing campaigns for films required enormous budgets and often involved synergistic tie-ins with fast food and merchandising.

By the start of the 1990s Hollywood studios could no longer afford to take major gambles with “the product”.

Hence the standard three-act formula, and the remorseless quest for safe, sure-fire hits: remakes, comic book and computer game adaptations, sequels.

Is a happy end – a Hollywood ending – a cop-out? Is it, as Adaptation’s Robert McKee suggests, a facile and formulaic way to impress an audience? Is it a method of giving us what we want and sending us on our way out of the multiplexes so that we continue being happy little consumers?

There is a school of thought that turns its collective nose up at happy endings.

Vladimir Nabokov wrote in Pnin: “There are people – amongst whom I would include myself – who detest happy endings.”

The implication is that somehow a happy ending is lowbrow, that its lack of verisimilitude is unliterary, unintellectual even.

There is an argument that a happy ending lacks edge.

Certainly that was the case mounted by Quentin Tarantino during the production of True Romance (Tony Scott, 1993).

Tarantino, who wrote the script but didn’t direct, wanted the film to conclude with its hero, Clarence (Christian Slater), being gunned down in a sanguinary hail of bullets.

Scott’s preference (and no doubt the studio’s as well) was for something a little more upbeat.

Indeed, the final scenes of the film show an eye-patch-wearing Clarence frolicking on the beach with his young son, with voiceover provided by his new bride Alabama (Patricia Arquette). Not only has he survived the cross-country flight from scary gangster Vincenzo Coccotti (Christopher Walken), but a good and fulfilling life lies ahead.

It’s the classic feel-good finish. But is it a cliché?

Scott’s argument was that in lovers-on-the-lam movies such as Breathless (Jean-Luc Godard, 1960), the convention is for the hero to meet his (usually it’s a he) maker before the credits roll. Tarantino’s preference would in fact have meant adhering to a well-worn, even hoary, generic convention.

By allowing Clarence to live, Scott actually defied expectation and provided an unusual denouement for the genre: an upbeat one.

I asked film critic Adrian Martin for his thoughts about Hollywood endings.

“I have nothing against happy endings as such,” he replied via email. “Happy is just as valid as unhappy. Unhappy endings can also be cliché, trite; ‘bleak’ and ‘dark’ do not necessarily mean more serious, more authentic.

“What I really dislike is what I call the ‘unearned happy ending’ – where the grounds for the happiness have not really been prepared, where it feels imposed, contrived, strictly obligatory.”

But Martin also pointed out the theory of the positive “unhappy happy ending”, which is ironic and suggests the opposite of what it literally shows. The melodramas of Douglas Sirk are an example of this.

“I believe every film should be given the ‘first shot/last shot’ test,” Martin said. “Does it start well, does it end well?

“The problem with many contemporary films (especially Australian ones) is that they do not end with a bang, they ‘dribble out’: the kind of ending where people embrace or frolic or walk away into the city crowd, as the camera soars into the sky and a song plays: usually simply a cue that says to the average moviegoer: OK, the film’s over, now you can head for the back door of the cinema! These are often ‘coda’ endings (‘six months later…’), the scourge of modern cinema.”

Here I have to admit a guilty fondness for this form of conclusion, but I can certainly see Martin’s point. Consider, for instance, the conclusion of Minority Report (Steven Spielberg, 2002), an otherwise engaging film with a tacked-on “sometime later” coda described by one magazine as one of the 50 cheesiest of all time.

“I like DEFINITE endings that are either thrilling or ‘visionary’, where the film builds to a revelation or a definite moment of closure,” Martin said.

“Iranian cinema is brilliant at endings: look at (Abbas) Kiarostami classics like Through the Olive Trees (1994), or the freeze-frame smile that ends Once Upon a Time in America (Leone, 1984).”

As Martin sees it, endings are all about how filmmakers “handle” the spectator in his or her passage from the film back out into reality, and modern commercial films try to make that transition too smooth, easy, indistinct and non-demanding.

Says Martin: “I like an ending that implicitly says: OK, you have been within the special fantasy that is cinema, bang, the lights are now up, now you have to work out what there is to take with you into the street, into your life!”

When I think about the kinds of film endings I like, there’s certainly a place for the happy conclusion. I didn’t want to see Ripley (Sigourney Weaver) jump into a vat of molten metal at the end of Alien 3 (David Fincher, 1992).

Dinner Rush (Bob Giraldi, 2000) is one of those movies where the last 30 minutes does much to overcome some lapses in the first part of the film.

A good balance is when an ending can match the tone of the rest of the film and still surprise, confront, or at least leave me thinking.

I’ve come to really dislike the “and then I woke up” ending of films such as Vanilla Sky (Cameron Crowe, 2001), or films that just don’t know when to end, such as A.I. Artificial Intelligence (Steven Spielberg, 2001).

Most of the time I prefer to see movies with other people, and I am fond of the kinds of films where the ending can prompt discussion about the fate of the protagonists well after the credits roll.

Lost in Translation (Sofia Coppola, 2003) is a classic example.

It’s the type of ending that is really another kind of beginning.

 

This article first appeared in issue 35 of Australian Screen Education.

Working titles

Is a rose by another name still a rose?

How important is a film’s title to its success at the box office? Often the name of a film can be determined relatively late in the movie-making process. While the movie is in production, and sometimes even while it’s being shot, an entirely different name, called a working title, is used.

No doubt many films have benefited from a change away from their respective working titles.

Among other targets, Team America: World Police satirises US arrogance and military aggression. Yet the working title American Heroes sounds a touch earnest for a comedy and doesn’t give a sense of the film’s scope.

Snatch is more dynamic than its working title Diamonds.

You really have to see the film, though, to understand how the working title for Adaptation could be The Orchid Thief.

Everybody Comes to Rick’s just does not have the same ineffable cachet as Casablanca, and Discoland: Where the Music Never Ends is not as punchy as Can’t Stop the Music.

American Pie is more effective than East Great Falls High, while I Was a Teenage Teenager (Clueless) is rather, er, clueless.

And Scream Again and Scream Louder (Scream 2) sound more like unfunny comedies (Look Who’s Talking Too, anyone?), than slasher flicks.

While writing a review of Danny Deckchair quite some time ago, I discovered that its working title was Larry Lawnchair.

Neither name is very good. The latter lacks an Australian touch and the former is so ordinary that I was dreading sitting down to watch it.

So when I eventually did see the film, I was relieved to find it wasn’t bad at all.

I asked around and elicited some alternative titles that might have made the film more alluring.

Barry Bananalounge sounds as Australian as lamingtons. Paddy Pouffe, Alby Armchair, and Bernie Barstool also emerged from the brainstorming session. Clearly though, writer/director Jeff Balsmeyer missed gold by not opting for Jason Recliner.

In some instances it’s obvious why a working title was discarded.

You can understand why Cameron Crowe opted for Almost Famous over Something Real, Stillwater or The Uncool.

Almost Famous is much more inviting to an audience, more open-ended, and more evocative of the film’s engaging plot (based on Crowe’s own experiences), which deals with a teenager’s adventures on the road – and brush with celebrity – as a young music reporter.

It also becomes apparent why writer/director Todd Solondz’s grim opus was ultimately called Welcome to the Dollhouse.

It’s hard to imagine too many people excited at the prospect of seeing Faggots and Retards, as accurate as the film’s working title might be.

It seems honesty in a film title is desirable only to the point that it puts backsides on cinema seats.

This article first appeared in J-Mag.

Adapting novels for the screen

 

Books are an enduring source of material for filmmakers, but what makes for a successful transition to film?

It’s doubtful author Raymond Chandler had actor Humphrey Bogart in mind when he envisaged the character of gumshoe Philip Marlowe. If he had, Chandler would have described the private dick as a short, funny-looking guy with stilted delivery and a toupee. As it was, Chandler wrote of his sleuth hero, in novels such as The Big Sleep, as a tall, slender, potentially menacing presence.

Nevertheless, when asked what he thought Hollywood had done to his novels, Chandler replied that Hollywood had done nothing to his novels.

“Look,” he said, “they’re sitting right over there on the shelf.”

In language as precise and economical as the dialogue he placed in the mouths of his characters (assuming the story isn’t apocryphal), Chandler managed to convey what should be a simple message: novels and films are different media. We negotiate and interpret them differently. They offer different pleasures, require the use of different senses, affect us in different ways.

And yet we persist in comparing and contrasting the adaptation of a book into film, usually unfavourably.

We rate how faithful the rendering of the tome has been, list the important episodes absent from the big-screen version, and debate whether the actors are worthy or appropriate to deliver lines originally written for the page, not the multiplex.

Writing in the Seattle Post-Intelligencer, William Arnold clearly considered the film version of Snow Falling on Cedars inferior to the novel.

“The book’s poetically precise prose, bold structural devices, riveting delineation of character and heartbreaking tale of anti-Japanese prejudice in 1940s Washington state established (David) Guterson as a major novelist. The film version … goes after these qualities. (It) is visually poetic, non-linear in structure and relatively uncompromising. Even though it’s a big-budget studio release, it’s very much an ‘art’ film. At the same time, it has – perhaps inevitably – lost much of the novel’s drive and originality, and its characters have, to a large extent, been reduced to movie stereotypes. As good as it is in many ways, the film is not as emotionally gripping as it should be, and comes off as rather a predictable liberal statement.”

Not content to view a film adapted from a novel as a text in and of itself, we feel compelled to contrast it with the book with which it shares its name – even though the reading of each renders its own, separate rewards.

Consider the example of Alfonso Cuarón’s Great Expectations. In the film, starring Ethan Hawke, Gwyneth Paltrow and Robert De Niro, the story is shifted from 19th century England to modern-day Florida and New York. Several of the characters’ names are changed, including that of the main protagonist, from Pip to Finn. In fact, it could be argued that so little do some elements resemble the 19th century novel, including the ending, that Charles Dickens would have a tough time connecting the film with his work.

John Updike has written of experiencing just that when he struggled to recognise any similarity between the film called The Witches of Eastwick and the novel he wrote of the same name.

Similarly, were Philip K. Dick still alive, he might have marvelled at the spectacular visuals of the movie Blade Runner, but apart from a few characters whose names he created, would have struggled to connect it to the novel he wrote called Do Androids Dream of Electric Sheep?

Obviously readers of books are going to make a connection when a film is made of a particular work, which might explain why John Irving insisted the adaptation of his book A Prayer for Owen Meany alter its name after the makers of the movie departed radically from the plot of his novel.

Consequently Simon Birch was released, more or less without fanfare.

Irving subsequently wrote the screenplay for the recently released film version of his novel The Cider House Rules himself.

US-based film reviewer Paul Tatara says films based on Irving’s books “often feel like two or three different stories sewn together like Frankenstein’s monster,” regardless of who writes the screenplay. Which is not to say that Irving lacks talent; rather, the phenomenon is a reflection of the nature of his writing, which often weaves wildly disparate parts into a cohesive whole. In his novels such a style doesn’t seem out of place. When we’re reading books, we expect them to be meandering, descriptive and elliptical. But in film, story is all, and things that work on the page often seem incongruous or unnecessary in a script.

So what makes a film a commendable adaptation, and is such an occurrence desirable, or even possible? Perhaps the adaptation of The Name of the Rose gives us the most helpful example of a way to understand the relationship between book and film.

The writer of the novel, historian and academic Umberto Eco, said there was no relationship at all. None. One was a book; one was a movie that happened to share the same name. Indeed, the two texts are rather different, and after an attempt at Eco’s dense, labyrinthine work, one might wonder how anyone could even contemplate filming it. No such attempt was made. Rather, director Jean-Jacques Annaud made what he called a “palimpsest” of Eco’s book. Kind of like a medieval Etch A Sketch, a palimpsest was a piece of parchment used over and over again.

What Annaud meant was his film offered resonance, lines, traces, characters and plot elements of the novel, but was by no means an attempt to film exactly what Eco had written. Given the scope of Eco’s novel, and the sophisticated ideas and language, such an exercise would simply have been impossible.

In the case of Dickens, one of the complications in adapting for modern audiences is the episodic nature and arch style of the writing. Dickens was writing for an audience bereft of television, radio and internet. His novels were originally penned in serial form for periodicals, with intricate plots and characterisations. And while Dickens was a prodigious writer and prolific in the extreme, his style, like that of many 18th and 19th century novelists, is not readily converted to big-screen dialogue.

For example, describing a character grinning from ear to ear in The Pickwick Papers, Dickens wrote that he “exhibited a grin that agitated his countenance from one auricular organ to the other.”

As with Irving, the very nature of Dickens’ writing – its “writerliness” and its convoluted, episodic form – makes conversion to film problematic.

Jane Campion was criticised for her film version of Henry James’ The Portrait of a Lady, for the most part, it seemed, because she dared, like Cuarón, to attempt (not altogether successfully) to make a film relevant to contemporary audiences. Reading James can provide a rewarding experience, but it’s not easy going. He uses long sentences, contrived plots and dense passages replete with internal dialogue. In short, like Eco, he’s a writer who wouldn’t appear to be a natural for film.

Like Annaud, Campion didn’t even try to “adapt” James for the screen. Rather, she presented something of a palimpsest of her own, featuring elements such as contemporary Australian schoolgirls talking about relationships at the start of the film. It was as if to say, right from the beginning, “This is not a faithful adaptation.”

And where James was notoriously circumspect about his characters’ sexual exploits (he refrained from writing about the topic altogether), Campion shows Isabel Archer’s (Nicole Kidman) inability to choose between two lovers by having her share a bed with both of them.

The best indication of whether a film adaptation has succeeded might therefore be if the film contains something of the “spirit” of the novel, and whether it has entertained, engaged or provoked – rather than how closely it resembles its source material.

Decent films have been made of relatively ordinary books (Gone with the Wind) and vice versa (Catch-22, and many more). Also, adaptations of some novels – Mario Puzo’s The Godfather and James Ellroy’s LA Confidential, for instance – have made excellent films only partly because they were based on decent books. Mostly they appeared to work because excellent directors (Francis Ford Coppola and Curtis Hanson, respectively) worked with fine writers, actors, cinematographers, technicians, editors and so on to produce highly regarded movies.

What Puzo, Irving and others must have discovered preparing novels for the screen is how vastly different the roles of novelist and scriptwriter are. In penning a book, the novelist is screenwriter, cameraman, director, costumier and musician.

But in writing a script the writer is but one contributor in what is a decidedly collaborative process. Further, a scriptwriter provides just enough for actors and directors to work with, not an elaborate set of instructions. And when scriptwriters consider a novel something of a holy text that must be adhered to as much as possible, the final product often doesn’t work.

Speaking at the Melbourne Film Festival following a screening of his film Brown’s Requiem, director Jason Freeland described the process of turning James Ellroy’s considerably flawed debut novel into a film script. The first step was to go through the book and highlight all the parts (passages, characters, plot devices, dialogue) that initially appealed to him. As he fashioned these elements into a workable film script, he cut out the pieces – part by unwieldy part – that couldn’t, or wouldn’t, be made to fit. Characters were lost, the plot modified, chunks of dialogue discarded.

Then as the financial constraints really started to kick in (probably about the time he decided to use a significant portion of the budget on an early 60s convertible), Freeland decided he’d ditch one of lead character Fritz Brown’s defining characteristics: his love of classical music.

Yet in attempting to include too many scenes and characters, Freeland made a movie that was both sluggish and confusing. Sure, the meagre budget can’t have helped, but ultimately the result of Freeland’s approach was a film that didn’t come across as particularly Ellroy-esque, or engaging.

It’s to be expected that some fans of certain novelists are going to have their noses put out of joint by films that don’t live up to their expectations of how so-called great literary works should be represented. When Canadian director Patricia Rozema’s film version of Mansfield Park was criticised by fans of the Jane Austen book for making the heroine Fanny Price (Frances O’Connor) somewhat raunchier, Rozema responded thus: “I enjoy Jane Austen very much as an author, but it all felt vaguely twee to me.”

It’s the flipside to Chandler’s comment. What Rozema is basically saying is: if you want to read Austen’s novels, they’re sitting over there on the shelf. A film is something else altogether. Deal with it.

This article originally appeared over at The Urban Cinefile.

 

This avuncular life

There is a lot of upside to being an uncle.

Written by George Vaillant, Ageing Well, as the title suggests, explores the process of getting older with dignity, health and happiness.

Drawing upon an extensive Harvard study that examined the lives of 800 or so individuals over a 50-year span, the book attempts to unlock the secrets to ensuring a good and meaningful life into the twilight years.

One of the questions Vaillant asks those who participated in the study is, “What have you learned from your children?”

Some people don’t get it, thinking the question should be phrased the other way around.

Vaillant was writing about adults learning from their grown-up children, but in applying the theory to my two-year-old niece Billee, I can see where he’s coming from.

Billee is usually in a good mood, and loves to laugh.

She’s very curious, with almost everything in her life an adventure.

In two years she’s learned the rudiments of a language, and is constantly improving her vocabulary.

“Bye Baz,” she said to her grandad recently. “See you tomorrow.” It was the first time she’d strung so many words together in a sentence.

That’s one of the remarkable things about Billee: every time you see her she’s grown and changed from the time before. Literally, of course: she’s adding kilos and centimetres at an astonishing rate.

Indeed, in the first few months of her life all Billee seemed to be doing was eating (well, you know, absorbing sustenance), sleeping and soiling nappies. All her energy, it appeared, was dedicated to growing physically and intellectually.

Then at about the four or five-month mark it all started to happen. Suddenly here was a little person, who had her own personality and foibles. Well, with visits separated by one or two-week intervals, it certainly seemed a very sudden change.

When, at about nine months, Billee started walking, there was no stopping her. There was so much to discover.

No wonder little kids take naps so often.

There’s all that activity to recover from; all that roving, playing, and mental inventory taking must really exact a toll.

Billee doesn’t have regrets, or rue lost opportunities, or fret about the future. All her energy is focused on getting the most out of the present moment, or the fun things tomorrow might bring.

She likes to sing and dance – in public or private, it doesn’t really matter. She’s not self-conscious and hasn’t learned to be embarrassed. Haven’t got the lyrics quite right? No worries.

Making friends is easy for Billee, and she’s fond of public displays of affection for those she’s closest to.

Billee is very change-ready. Sure, she has her routines, her rituals and favourite things, but she also readily learns and takes on new skills and adjusts to changing circumstances.

Past failures don’t faze her or hold her back. It’s as though she’s forgotten them completely!

There’s a regular flow of fresh stuff to be learned about, played with, observed, or admittedly in some instances, destroyed.

She’s in touch with her playful inner child.

Now I’m aware that I’m taking a somewhat idealistic approach here. I usually see my niece when she’s at her best, and I’m not required to discipline her, deal with her teething, or change nappies.

Young parents would also no doubt say that the average two-year-old has plenty to teach about throwing a tantrum, staying awake when they should be asleep, or wandering into places they shouldn’t.

As an uncle, it’s all upside. There’s play and hugs, maybe a little book reading, and then you get to say goodbye and take a breather.

It’s hardly her fault, but Billee has also stoked my competitive avuncular instincts.

When I see a toddler these days, I can’t help but think my niece is cuter, bigger, more advanced, healthier, smarter, happier or just an all-round better little kid.

Yet such trifling and petty things matter little to Billee; she’s got so much else going on.

There’s playgroup, swimming lessons, dolls, handbags (she LOVES handbags), vegemite (MITE!), mini maestros, the backyard and beyond.

Little brother Ned, for instance, is a whole new source of amusement, potential play companion and partner in crime.

But he’s another story.

 

I originally wrote this piece 12 years ago. It’s hard to account for time passing so quickly. Billee and Ned are now both in high school, and are turning into delightful young adults.

The write stuff

Many successful writers have a trait in common: tenacious productivity.

 

In the foreword to Charlie Martz and Other Stories: The Unpublished Stories of Elmore Leonard, Peter Leonard recalls his renowned father’s writing routine.

This was in the time before Elmore was able to devote himself to novel-writing full-time. Rising at 5am, Elmore set a rule for himself: fill two pages of writing before going to work at his day job producing copy at a Detroit advertising agency.

Astonishingly, the writer of Rum Punch (Jackie Brown), Get Shorty, and Out of Sight – which were all adapted into screenplays – kept up this stringent quotidian routine for almost 10 years.

So disciplined was Leonard during his morning ritual that he didn’t permit himself to turn on the water for his coffee until he had filled a page with his hand-written script.

I have been thinking about writers and their routines lately.

When I started blogging earlier this year I did so with the aim of producing one post per week. Alas, I’ve fallen short of that modest aspiration.

So how do writers – real writers – do it? By what elusive alchemy, by what kind of graft, do productive, successful writers link words into phrases, paragraphs, chapters and books that touch emotions, transport, engage, or at least entertain?

With some it obviously comes very naturally. JK Rowling has no problem cranking out sentence after virtuoso sentence. (I haven’t read the Harry Potter series, but have found the Robert Galbraith books to be enormously addictive).

Other writers overcome formidable hurdles to tell their stories.

Consider the example of Jean-Dominique Bauby. The French magazine editor wrote a single, slender book, but it’s a masterpiece.

The Diving Bell and the Butterfly was written after Bauby suffered a massive stroke. Following his emergence from a three-week coma, Bauby was left with locked-in syndrome. Speechless, immobile and bed-ridden, the only part of his body he had control over was his left eyelid.

Yet using a convoluted and laborious system, Bauby was able to communicate with the outside world, and write his book. With the help of an assistant reciting the alphabet while taking dictation, Bauby would blink when the letter he wished to use was uttered.

Imagine the force of will required to write a sentence or paragraph – let alone a whole book – this way. It must have taken incredible concentration, fortitude and rigour, as well as skill, for Bauby to persist in this manner.

The result, though, is a treasure – an amazing story beautifully told, every sentence a lovely bagatelle.

Fact: I am not going to reach that standard (even with most of my faculties intact) – few do, including scribes such as Leonard, whose stock-in-trade was meticulously structured plots, well-drawn characters, and snappy dialogue.

There is a useful lesson for us workaday bob-a-job writers to keep in mind: don’t let perfection – or the pursuit of it – get in the way of tapping out a good yarn.

“If we make writing mystical, we place it out of our control, we give ourselves another reason not to do it,” says novelist Laura Lippman.

“If we hold our ideas to the standard of blinding love at first sight, then they will be few and far between.”

Ideas, she means.

It’s advice that nicely complements that given by Dr Nick Baylis, a specialist in positive psychology from Cambridge University.

Baylis says that to combat the paralysing effects of perfectionism, a powerful principle to keep in mind is that the more fully formed attempts we make at any task (not just writing), the greater the probability of scoring a recognised success.

In basketball parlance: you miss 100 per cent of the shots you don’t take.

The idea is not so much that quantity trumps quality; it’s more that quantity could, and just might (and probably will) – inevitably, inexorably, or simply hopefully – lead to quality of some sort.

Theorising about, studying, discussing, benchmarking – these are no substitute for actually making, for doing.

Baylis quotes Dean Keith Simonton from the University of California, whose analysis of accomplished lives demonstrates that the basic rule is consistent across all disciplines, and applies at any age of life.

Simonton provides the example of William Shakespeare, who penned Hamlet one year, Troilus and Cressida the next, and close to 40 major works in all. About a quarter of these are celebrated as part of the Western Canon of great works; others have slipped into obscurity and are rarely performed.

The Bard didn’t know which of his plays would succeed and which would eventually fade, and it wasn’t a concern; productivity was king. The Globe Theatre required material, and he was the content provider. So he sat his backside down, pulled out his quill and filled parchment on demand.[1]

“History shows how every high achiever relies on this same brand of tenacious productivity to eventually make progress,” Baylis says. “Their most prized accomplishments are invariably surrounded by a vast number of missed shots.”

To overcome perfectionism’s pernicious influence, we should aim to be as productive as possible, not as perfect as possible.[2]

Think: by producing a lot of stuff, you might just come up with the occasional pearl.

“Productivity,” says Baylis, the author of Learning from Wonderful Lives, “brings a profound pleasure.”

One has a sense that this was certainly the case with Leonard, who according to his son, could get lost in his writing regardless of what was going on around him.

His was a writing style that evolved over time, and with considerable practice.

By the time he’d earned the title of “master crime writer” – at least a few decades into his career – Leonard had developed his famous “10 rules of writing”.[3]

It took trial and error for him to hone these, and to find his self-styled voice.

When he was ready to make the move into full-time novel writing, Leonard had accumulated about 7,300 hours of practice in his early-morning sessions alone – roughly two hours a day, every day, for a decade.

Old “Dutch” Leonard had no truck with flowery prose. He didn’t sit around waiting for the writing muses to visit; he just got it done. And in at least one major category Leonard even out-Shakespeared the Bard: 45 major works to 37.

 

Notes

  1. Interestingly, Shakespeare didn’t often get distracted by his Twitter feed, and wasn’t overly concerned with how his productions trended on Facebook. His Instagram regularly went neglected, and his LinkedIn profile was an afterthought at best.
  2. Perhaps there is a parallel between Baylis’s “Shakespeare principle” and Malcolm Gladwell’s assertion that success at the highest levels of anything requires 10,000 hours of practice.
  3. According to Leonard, writers should never start their stories with descriptions of the weather, only use the verb “said” to carry dialogue, and stay away from adverbs to modify “said”. He didn’t like exclamation marks, and advised against detailed descriptions of characters. His most important rule: If it sounded like writing, Leonard rewrote it.


Neglected gems #1

They might be short on budget, star power, special effects and kudos, but these films still deliver.

Bottle Rocket (Wes Anderson, 1996)
The first feature from Wes (Rushmore [1998]) Anderson, Bottle Rocket is the very funny tale of two likeable losers, Dignan and Anthony (real life brothers Owen and Luke Wilson), who think that in order to experience adventure and excitement, they must become embroiled in a life of crime and danger. The film’s strength is its gentle humour and Anderson’s genuine affection for its characters. No less an authority than Martin Scorsese is a fan of Anderson’s work: “Wes Anderson … has a very special kind of talent”, said Scorsese. “He knows how to convey the simple joys and interactions between people so well and with such richness. This kind of sensibility is rare in movies.”

Trees Lounge (Steve Buscemi, 1996)
Taking a break from the bit parts and walk-ons that have constituted the bulk of his career, Steve Buscemi is front and centre in Trees Lounge, a slice-of-life drama set on Long Island that the fast-talking actor wrote and directed. Buscemi is Tommy, a barfly and screw-up who spends his considerable spare time drinking at the Trees or, when things really hit rock bottom, driving his uncle’s ice-cream truck. Funny in a bittersweet way, it’s an original and rewarding addition to the Buscemi canon.

Say Anything… (Cameron Crowe, 1989)
The last and best of the John Cusack teen films, with The Sure Thing (Rob Reiner, 1985) and Better Off Dead (Savage Steve Holland, 1985) also worth a look by fans of the genre. In this superbly penned Cameron Crowe film, Cusack plays Lloyd Dobler, a likeable kickboxer who falls for the class brain and beauty. Say Anything… is an endearing film full of keenly observed characters and some winning dialogue.

Liebestraum (Mike Figgis, 1991)
Before Mike Figgis went all experimental with Timecode (2000) and The Loss of Sexual Innocence (1999), his raison d’être was making striking, sumptuous films populated by beautiful thespians. Liebestraum is certainly of that ilk. Kind of a neo-noir working across two time frames, it’s the story of a brace of affairs, a stately building and the connection between them.

The Boxer (Jim Sheridan, 1997)
From Knute Rockne All American through Rocky and on to The Replacements, the classic sports film formula has been one of teams or individuals using the arena as a place of redemption. The Boxer, though, is a searing drama set amid ‘the Troubles’ of Northern Ireland that both redefines and avoids the clichés of the genre. Danny Flynn (Daniel Day-Lewis) is a talented boxer who has spent the better part of his adult life in jail. Upon release, his attempts to build a new life in the ring and outside it are thwarted by the same forces he protected by serving hard time. Delivering his lines as convincingly as his uppercuts, Day-Lewis’ performance as a pugilist is the best since Robert De Niro’s as Jake La Motta in Raging Bull (1980).

The Final Combat (Luc Besson, 1983)
Occasionally turning up at odd hours on television, The Final Combat was made by Luc Besson before he went on to make the more familiar The Big Blue (1988), Nikita (1990) and The Professional (1994). Reminiscent of Mad Max 2 (George Miller, 1981) – there’s even a gyrocopter – the film is set in a post-apocalyptic wasteland where some unknown catastrophe has caused the destruction of society as we know it, fish to fall from the sky, and all the characters to lose their voices. While there’s music and incidental noise, there’s no dialogue. Shot in black and white and featuring the work of Besson regulars, including the charismatic Jean Reno as a villain (The Brute), The Final Combat is an intriguing late-night delight.

Mediterraneo (Gabriele Salvatores, 1991)
Not a whole lot happens in Mediterraneo, a calmly paced, lyrical film directed by Gabriele Salvatores that won the Academy Award for Best Foreign Film. It’s the story of an Italian army unit that’s sent to guard a picturesque Greek island during World War 2. Cut off from their superiors and with their radio destroyed, the bumbling unit settle gradually into daily life, spending their time with the pulchritudinous prostitute Vasilissa (Vanna Barba), playing soccer or, in the case of the lieutenant, painting the local chapel. It’s an easy film to watch. Predicated on character and atmosphere, it gives its beautiful setting as big a role as any of the actors.

Bound (Andy & Larry Wachowski, 1996)
A moody heist film with a sapphic flavour from the Wachowski brothers, the team that also delivered The Matrix (1999), Bound is an engaging neo-noir that’s as stylish as it is unpredictable. Violet (Jennifer Tilly) is your classic femme fatale just waiting for a chance to betray the coarse Caesar (Joe Pantoliano). Corky (Gina Gershon) is the insouciant ex-con who gets dragged into a complex web.

Big Night (Campbell Scott & Stanley Tucci, 1996)
Set in ’50s suburban New Jersey, Big Night is the story of two brothers, Primo (Tony Shalhoub) and Secondo (Stanley Tucci, who directed with Campbell Scott), and their attempts to run a restaurant that is ahead of its time, culinarily speaking. As might be expected, the food is an integral and evocative component of the movie, which is also served well by a jumping soundtrack, some sterling acting, a witty script and an unconventionally moderate pace. The final scene is a truly inspired piece of filmmaking.

Down by Law (Jim Jarmusch, 1986)
A prison yarn for the most part, Down by Law is the quirky (what would you expect from Jim Jarmusch?) tale of three jailbirds. Zack (Tom Waits), a hip but low-rent DJ, Jack (John Lurie), a smooth pimp, and Roberto (Roberto Benigni), a poetry-reading Italian immigrant convicted of murder, find themselves trapped in the same cell in a Louisiana bayou prison. Neither prison time nor the film moves particularly quickly, but gradually layers are added and the whole mood of the picture alters. Worth a viewing for the “I scream, you scream, we all scream for ice cream” scene.

The unkindest cut: directors and audiences

Filmmakers don’t always get their own way.

Given that he has helmed such critical and commercial successes as Thelma and Louise, Black Hawk Down and Alien, it seems remarkable now that director Ridley Scott didn’t have things all his own way with Blade Runner (1982).

After all, we are talking about an artist who has earned a formidable reputation as a creator of big-canvas filmic epics such as Gladiator. And it’s a given he now has the prestige and the power to hold sway on any project he undertakes.

Yet early in his career, while he was still establishing his name as a feature film director, Scott reluctantly made changes to a film many regard as one of the finest of the 1980s.

Even today, 30 years after it was made, the film’s look and tone, depicting a dystopian future in a dark, menacing and polyglot LA, still seem futuristic (and bleak).

After the film was shown to test audiences, however, the producers of the project were unsatisfied with the manner in which it was received.

Too confusing, test audiences groused.

The end result was that a couple of changes were ordered.

The revised ending, for instance, didn’t gel with Scott’s vision, a key scene was omitted, and the voiceover supplied by Deckard (Harrison Ford) was not to Scott’s liking either. The film was released, met with a mediocre response, and subsequently earned wider appreciation in arthouses and upon its release to video.

It was only much later, after Scott had underscored his considerable reputation as both a stylist and a maker of commercially successful movies, that he was finally able to release Blade Runner in the manner he originally intended it to be seen.

The director’s cut, as the name implies, is the version of the film as the director – rather than the studio, the producers or anyone else – wished it to be made.

Of course, long before Blade Runner: The Director’s Cut was released, various versions of films were available.

Sometimes different versions of a film are released in different countries to meet certain ratings standards, scenes are added or lost for television, and so on. For instance, the Australian version of Any Given Sunday had about 10 minutes less gridiron footage than the US version.

And by 1992, a full 10 years had passed since Blade Runner had originally been released, time for a whole new audience to have grown up, ready to join admirers of the original.

So as much as Scott’s new cut represented the unveiling of an artist’s uncorrupted vision, it was doubtless a whole new revenue stream for a product that didn’t reach its potential upon release.

That would appear to be the category into which most directors’ cuts fall. Consider the examples of Steven Spielberg’s Close Encounters of the Third Kind: The Special Edition, James Cameron’s director’s cuts of Aliens, The Abyss and Terminator 2: Judgment Day, and the anniversary editions of George Lucas’ original Star Wars trilogy.

It’s hard to imagine these titans of film not getting their way when their movies were first made. Rather, the cuts of their films are akin to an “added extras” version of a car.

Sure, viewers get a longer movie, but is that necessarily better? Is more more?

Often, sadly, no.

Take the example of T2: Judgment Day – Director’s Cut. Quite a few scenes run longer and several new ones have been woven into the fabric of the film. True, some add to our understanding of the characters. The downside is that a compelling film is slowed somewhat.

The same could be said for the director’s cut of Aliens, which seems sluggish when compared to the viscerally exciting original. We gain little from the additions, and those gains (for instance, learning Ripley had a daughter who died) are offset by the deleterious effect on pacing.

The latter cut of Blade Runner, however, takes on a whole different complexion following the changes made. Though the differences between the two versions are quite subtle, they make for two quite different films.

Doubtless the same would be true of the cuts made by directors who were sacked from films they were working on – that is, if they ever had the chance to make them.

One interesting example might be a director’s cut of American History X prepared by the controversial Tony Kaye.

Kaye’s original vision for the film was wildly different from that of the producers and star Edward Norton, whom Kaye later referred to as “lice.”

In an artistic battle fought out in the Hollywood press, the studio claimed Kaye’s original cut was too short. Kaye countered by spending more than US$1 million editing the film and taking out advertisements outlining his stand.

It didn’t wash, and his relationship with Norton deteriorated.

“Norton’s ego and narcissism kind of manoeuvred it (the structure of the film) totally to his wants, really,” Kaye says. “Not so much through the shooting, because I got everything that I needed to get, but when it came to the editing process, he manoeuvred himself into the cutting room and really caused me considerable grief.”

So disenchanted did he become with the editing process that Kaye wanted to remove his name from the credits and replace it with Alan Smithee, the sobriquet customarily used when directors no longer want to be associated with a project.

When that wasn’t allowed, Kaye tried to get his name replaced with that of Humpty Dumpty, also to no avail. For Kaye, this act summarised his relationship with the producers, with Norton and with Hollywood.

“Actually when I pulled the word Humpty Dumpty out of the air, I didn’t realise that Humpty Dumpty is basically a metaphor for mankind,” Kaye says. “And to me, that’s not far away from this whole scenario, because truth and honesty and integrity and respect are not words that any of these people live by. And I think Hollywood right now, maybe it’s always been like this, but it’s really lost a sense of what reality is.

“And I believe that when you make a film or when you put a story in pictures and sound on a screen in a theatre, it has to be real. And if the filmmakers have lost the notion of what reality is and authenticity is, then that work can never ever be good. Because they’ve lost the intuitive sense of how to judge the work.”

 

Final say in the editing process is obviously the most important factor in determining how closely the film that’s released resembles what the director originally had in mind.

Historically it wasn’t unusual for the bean counters on a project to insist on changes, and Orson Welles was one auteur who constantly fought (and lost) artistic battles over his projects.

Even as long ago as when Welles was making films, a potentially powerful element was already part of the editing process: that of the test audience.

Australian director Richard Franklin (FX2, Psycho 2, Hotel Sorrento) describes Welles’ The Magnificent Ambersons as “probably the major casualty of audience testing in film history.”

(See Franklin’s seminal article, Cinema Papers 95, October 1993).

One test audience member wrote on their card, “As bad if not worse than Citizen Kane.”

Another wrote, “Audiences want a laff,” and this response was given three times the weighting of another comment: “possibly the greatest piece of cinema ever.”

A test audience wanted the song Over the Rainbow cut from The Wizard of Oz (1939) because the scene in which it appeared was considered to slow the film down too much.

Supposedly a group selected to represent a film’s target demographic, a test audience can give a serious high five or thumbs down to the characters, plot elements, music or denouement of a movie.

A test screening audience didn’t like that Samuel L. Jackson’s sartorially resplendent character in Renny Harlin’s The Long Kiss Goodnight was rubbed out before the final credits rolled.

So even though earlier scenes had shown Jackson taking enough lead to kill his character several times over, the actors were ordered back to the set and new scenes showing Jackson’s character bravely surviving were shot.

A test audience was also responsible for having Glenn Close’s character in Fatal Attraction killed off.

In essence, test audience members are no longer regarded as people who get to experience a finished piece of art, but rather as consumers who, like diners in a restaurant, are able to send back what has been served to them if it’s not to their liking.

In Franklin’s opinion, studios using test audience results to ride roughshod over directors “is about power and not about art.”

In his view, allowing people who don’t understand the movie-making process to give their opinion on how a picture might be changed is like asking folks in off the street to try some amateur brain surgery.

On the other hand, Franklin is in favour of previewing that is meaningful to him.

“That can be as simple as showing it to two or three trusted friends or as complex as screening it for an audience of 50 people in which half are known to me and half are not,” explains Franklin, who has even been known to stop the projector and ask questions during a screening.

“But never do I use the anonymous process. That is, someone has to give me a reason why someone won’t put their name on the form. If I don’t understand their comment or I think they’ve got a point, I actually follow them up and talk to them.”

Would Franklin make changes to his films based on comments from audience members?

“Of course, why would you have a screening otherwise?” Franklin says. “You don’t do it to make changes, you do it to learn about how your ideas are communicating. And sometimes you can achieve that best by just showing it to one friend.”

 

Australian director Phillip Noyce (The Bone Collector, Patriot Games, Clear and Present Danger) is not necessarily in favour of the testing process as it is used in the US.

“In the past 10 years, the studios used the test screening process as a baseball bat to knock the filmmakers over the head and beat them into homogeneity,” Noyce told The Age.

“Now it’s arguable, at least until recently, that a few hundred teenagers in the Valley in Los Angeles have as much power as the studio heads do in terms of finally affecting the movies that actually appear on screen.

“The test-screening process in America has been taken to the nth degree – and that’s n for nitwit.”

For directors without control over the final cut – and in a very competitive business it would seem there are many who fall into that category – testing looms as a potentially intrusive element in the creative process.

Sure, filmmakers must be accustomed to their art form being a collaborative one, with an often eclectic group contributing to the end product. But when a test audience via a studio demands changes be made, it could hardly be considered a constructive element. Not for the director, anyway.

“It’s much easier to embrace the whole testing process when you know that you ultimately control the final cut on your movies,” director Ron Howard (Parenthood, Backdraft) once explained.

“But it’s frightening if you’re in a position where you’re going to show the movie at a preview and somebody else is going to take the results of that preview and re-cut the film based on that, maybe consulting you or maybe not. That’s terrifying.”

 

It’s easy to understand why a studio would want to control as much of the filmmaking process as possible, of course. So much money is invested in major Hollywood productions that the failure of even a single big-budget feature can have a devastating impact on a studio’s bottom line.

Little wonder, then, that every tool at a studio’s disposal is utilised to make a film “work”.

Would audiences have flocked to Pretty Woman in the numbers they did had Richard Gere and Julia Roberts parted at the end of the film before a test audience effectively changed the conclusion? We’ll never know.

On the other hand, countless changes to films have been made at the behest of test audiences that we’ll also likely never know about, for better or worse.

So is there some way to avoid the Hollywood process Noyce describes, in which the testing system has an unhealthy presence?

How about director’s cuts for every director unhappy with the version that hits our screens?

Didn’t think so.

Perhaps the emergence of DVDs will go some way to helping viewers understand the difficulty directors (and studios) face in editing films.

When Paul Thomas Anderson lost final cut on Hard Eight (aka Sydney) in 1996 and the film was re-edited, the director was a virtual unknown.

But with Anderson having since enjoyed a measure of critical and commercial success with Boogie Nights and Magnolia, a director’s cut DVD of Hard Eight is now available, showing the film as it was originally meant to be seen.

Yet perhaps the onus for taking clout away from those California teenagers Noyce was referring to belongs to us.

Every time we stay away from a piece of formulaic, derivative dross, and on every occasion we embrace those films that defy convention, mess with the standard template and resist categorisation, we have a say in the type of films that will be made in the future.

 

This article first appeared in issue no. 127/128, Autumn/Winter 2001, of Metro magazine.

Postscript:

You don’t hear so much about either audience testing or director’s cuts of movies these days. Perhaps Hollywood has become so risk-averse that the kinds of films that might elicit an unfavourable response just can’t be made inside the studio system anymore. Ridley Scott continued to fiddle-faddle around with Blade Runner until he produced The Final Cut in 2007. There is talk of a sequel in 2017 starring Harrison Ford and Ryan Gosling.

My life as a MiHLF

Observations on life in a new demographic.

There is really no point denying it any longer, so I may as well fess up: I am a MiHLF. That’s right, a MiHLF, or Man in His Late Forties.

A month or so ago, before January 16, things felt different. Back then, when I was (at 46) still in my mid 40s, well, I was full of (relative) youthful exuberance, and loving life. Now I am acutely conscious that my next birthday of note is the big five-oh. You can’t argue with a number like that. A half-century on this planet is no small integer. Five decades. Two score and 10 years.

It’s not the only big milestone of note, either, with 2016 the 30th anniversary of my high school graduation. Thirty years – how is that even possible?

It’s funny (but not ha-ha funny) how the passage of a few weeks, or even days, changes one’s perspective. And it’s not that I feel bad – far from it. I am in a stable relationship, am buoyed by a close-knit family and group of friends, and am fortunate that it doesn’t take too much to keep me happy. A good workout, nice cup of tea, or an excellently crafted sentence to read will often do the trick. There is a roof over my head, and I don’t want for sustenance.

Still, I can’t deny that my entry into the (hardly exclusive) realm of the MiHLFs has been a little unsettling.

There are world leaders, corporate titans and successful sports coaches who are my age and younger. MiHLF musicians are conducting revival tours, playing in out-of-the-way venues and shamelessly topping up their superannuation with one more run through the old song list.

To some degree I agree with the adage that you’re as young as you feel. There are some individuals whose verve for life, dynamism and self-care have them acting and appearing far younger than their chronological age would suggest.

Others are old before their time: young fuddy-duddies whose inflexible attitudes, unhealthy habits and perhaps life circumstances have taken a physical and psychological toll.

Yet aging needn’t be all bad. With age comes wisdom, even if the epiphanies we experience are often bittersweet.

“Age is an enemy in some ways but a friend in others,” says writer Mike Sager. “People who rely on their minds should get better and smarter with age.”

Of course, turning 47 wasn’t a complete surprise. It has been a case of “gradually then suddenly”, like a frog being boiled alive.

The signs have been evident for a while. The 5kg I put on in my early 40s has stayed stubbornly adhered to my midsection, like the detritus of a conjoined twin. I’ve been wearing glasses for a while, but lately my eyes have been noticeably worse. My knees create their own sound effects, a cacophony of grinding noises known as crepitus (perhaps from the same Latin root as “decrepit”?).

My cholesterol has been creeping up for a while.

Grey hairs are becoming increasingly prominent, although evidently this is something of an optical illusion; hairs are either their natural colour or white, with the shade of grey determined by the number of whites in a particular patch. Still, I am starting to sport the fluffy George Negus “senior statesman” white-sideburns look.

As if to accompany this, I’ve developed a fondness for cardigans. Can sports coats with elbow patches be far behind?

Some years ago I determined that whatever levels of athleticism and rate of recovery I had enjoyed in previous decades had diminished (kind of a no-brainer that one), and that I had to respond accordingly. This has led to my patented 45-minute Warm-Up Routine, a series of exercises that enables me to have a decent sweat without tearing a calf or groin muscle. I do my “warm-up” and then go home in one piece. Clearly, however, things must be stepped up if that 5kg is to be dislodged.

The key, as all those ageing studies have told us, is to keep moving. The Okinawans, renowned for their longevity, don’t even have a word for “retirement”. They just keep on keeping on, through manual labour on their farms or martial arts.

It’s important to differentiate here between the MiHLF and the MAMiL (middle-aged man in lycra). Don’t get me wrong, I applaud the efforts of my cycling contemporaries to be fit and spry – good on them. But sometimes those chaps have an air of desperation and obsession regarding their exertions – that if they ride hard or often enough, in the right suburban peloton, wearing the best gear and riding an inordinately expensive bike, they can somehow, Rupert Murdoch-like, stave off the grim reaper for ever.

But, really, what’s with the completely matching lycra ensemble? Wearing a head-to-toe authentic Tour de France or manufacturer’s riding kit is the equivalent of me heading down to my local court to shoot hoops entirely in Boston Celtics gear. No one wants to see this, least of all me.

Then again, if the passing of years tells us anything it’s that we may as well wear what we like. Our time allotment on this planet is small, and time marches on inexorably.

We MiHLFs need to stick together.

Catchphrases and cunning plans

During a recent televised game of English soccer, a routine play was described thus: “They had a cunning plan.”

In fact there wasn’t anything especially crafty about the tactics on display – one player passed to another who immediately attempted to kick a straightforward goal. And this was precisely the commentator’s point.

In the comedy Blackadder, when servant Baldrick declared to his master that he had a cunning plan, it invariably transpired that the strategy he had in mind was nothing of the sort.

That the phrase would have been recognised by many soccer fans reflects both the show’s popularity and the tendency for our everyday language to absorb words and phrases from the texts we consume.

Our written culture has long influenced the way we communicate, phrases from books and plays seeping into the landscape of contemporary language, which changes constantly as new words and expressions are added and others fall out of use.

Take the expression “sour grapes,” which dates from Aesop’s fable about a fox who, unable to reach a bunch of the fruit perched high on the vine, declares them not to his taste.

Outside of its profound historical and religious influence, the Bible has offered plenty to the texture of the way we speak.

Expressions such as “woe is me,” “by the skin of your teeth,” “living off the fat of the land,” “no rest for the wicked,” “bite the dust,” “the writing is on the wall” and “the powers that be” originate in the Good Book.

A slew of expressions either written or popularised by William Shakespeare have also filtered into our lingo.

“Laughing stock,” “sea change,” “bloody minded,” “cold comfort,” “foul play” and “good riddance” all featured in plays written by the Bard.

Though he was long gone by the time the expression “to steal one’s thunder” was coined, Shakespeare is also partly responsible for its genesis.

The story goes that John Dennis, something of a theatrical all-rounder who wrote and directed plays and managed theatre companies in the 1700s, invented a device that made a nifty stage thunder effect. It was used in a play Dennis wrote called Appius and Virginia.

The play, however, was not a success and was soon taken off in favour of a production of Macbeth, a sure-fire hit.

Dennis went to the opening night of Macbeth and was shocked to hear his thunder machine being used.

He leapt to his feet and shouted, “That is my thunder, by God; the villains will play my thunder but not my play!”

Like many sayings that have stood the test of time, it has been refined and made punchier, but has remained in use.

So will popular culture prove as prolific in its contribution to the spoken and written word?

It seems unlikely, at least in the long term.

In the first place, regardless of how powerful or popular a film, TV program, radio broadcast or even Internet site is, it’s not going to have the cachet or reach of the Bible or the works of that chap from Stratford-upon-Avon.

Of course, popular culture indubitably shapes the way we communicate, and you only have to listen to a bunch of schoolkids talking to, like, totally realise that, like, television so does have an influence.

Yet The Oxford Dictionary of Catchphrases suggests that although many grabs from the big and small screen seize the popular imagination, most have an ephemeral existence, and are rarely used beyond a program’s run.

And that’s probably why you don’t hear folks exclaim, “She goes, she goes … she just goes,” “correctamundo,” or “schwing,” much these days. They are all horribly dated. Of their time.

Once a ubiquitous replacement for harsher language, even Homer Simpson’s “D’oh” looks like it might be going the way of the Fred Flintstone line, “Yabba-dabba-do!”

Seinfeld’s substitute for blah blah blah – “Yada, yada, yada” – has likewise fallen out of use.

Similarly, you don’t hear about folks working on their Penske files much these days. But perhaps this will change following the opening of a bar in Fitzroy named in honour of the show’s George Costanza character.

The word “muggle” from JK Rowling’s extraordinarily popular Harry Potter book series was added to the Oxford English Dictionary early in the new millennium. In the books a muggle is someone who can’t practise magic, but the word can also be used to describe anyone who is accident-prone or unable to master a skill. Even so, it has quickly fallen out of use.

Many expressions that do have a longer shelf life, say, “Go ahead, make my day,” “I’ll be back,” or “Missed it by that much,” are better described as quotes, and are usually executed in very poor imitations of Dirty Harry, Arnold Schwarzenegger’s Terminator and secret agent Maxwell Smart, respectively.

Yet when we say “one fell swoop,” we don’t use a Scottish burr just because the phrase originated in Macbeth, any more than we demand our “pound of flesh” in a Venetian accent to honour The Merchant of Venice. They’ve become a fully-fledged part of the language.

Perhaps it’s simply a case that if an expression is in use long enough, its origin is forgotten, and knowledge of it therefore unnecessary.

A case in point is the expression “keeping up with the Joneses”, which was the title of a comic strip first published in New York in 1913. One doesn’t have to be familiar with its origins to understand that it means striving to match the lifestyle and possessions of one’s neighbours.

It seems there is no rhyme or reason (a phrase first recorded by John Russell in The Boke of Nurture, circa 1460) for how language evolves.

Predicting which phrases have legs (so to speak) is therefore problematic. Guesswork, at best.

The signature catchphrase from Jerry Maguire (Cameron Crowe, 1996), “Show me the money,” a demand for employers to cough up large amounts of cash, looks like it might have durability.

Writer and director Cameron Crowe put it in the film after hearing a real-life football player use it.

In the movie the phrase was shouted by gridiron star Rod Tidwell (Cuba Gooding Jnr) to his agent, the eponymous character played by Tom Cruise.

Another phrase that originated in the arena of sports, “The opera ain’t over ‘til the fat lady sings,” was coined in 1978 by sports broadcaster Dan Cook after the first game of a seven-game basketball series.

Like the “stealing thunder” phrase, it’s been refined over the years to become simply, “It ain’t over till the fat lady sings.”

You also occasionally hear about an offer that can’t be refused – a Godfather offer – named for Marlon Brando’s menacing character in the 1970s gangster flick.

Should a character in that film have been silly enough not to accept a Don Corleone proposal, he would be rubbed out, whacked, clipped or be found swimming with the fishes.

And one doesn’t have to have read the novel by Joseph Heller or seen the Mike Nichols film (1970) to know that “a Catch-22 situation” is a paradox or predicament in which seeming alternatives cancel each other out.

Yet another way for catchphrases to stick around is for them to be appropriated by other programs.

The expression “Cowabunga,” for instance, was coined on The Howdy Doody Show (1947–1960) and featured in Gidget, Sesame Street, The Teenage Mutant Ninja Turtles, and most recently, The Simpsons.

As the Good Book says, there really is nothing new under the sun.

 

A version of this article first appeared in issue 32 of Australian Screen Education.

Non-human characters

It’s awards season once again, but there will be no gongs for the thespians who aren’t people.

Not all the characters we see in movies are flesh and blood.

In 2003’s Japanese Story, for instance, one of the more prominent parts was played by the Australian outback.

Menacing, disquieting, inscrutable, beautiful, the Pilbara was an intrinsic component of the movie.

While actor Toni Collette and the film itself earned praise and awards, it seemed almost unfair not to present the outback with its own Australian Film Institute gong.

This might, of course, create some logistical problems. What would the desert wear to the ceremony? What would it say in its acceptance speech?

(“I’d like to thank God …”).

Likewise, in Lawrence of Arabia, the shifting golden sands of the Middle East form as big a part of the film’s fabric as the performances of Peter O’Toole, Omar Sharif or indeed any of the human players.

Ubiquitous, enigmatic and unrelenting, the desert in the film is like a stalking villain that never lets up or gives in; survival requires a constant effort.

That film’s musical score is also such an indelible and recognisable part of the movie that it too is like an unseen character, albeit an important and evocative one.

As the score builds, accompanying a scene where Lawrence leads a caravan of camels across a seemingly unnavigable part of the desert, you can feel your throat becoming increasingly parched. Will they ever reach the other side?

You would expect songs to play a key part in musicals – films like Singin’ in the Rain or Oklahoma! would hardly make sense without them.

Yet there are other films, too, where song plays a key role.

Imagine Saturday Night Fever without the pulsing disco tunes of the Bee Gees. The scene featuring John Travolta strutting along the pavement to the accompaniment of Stayin’ Alive is a defining moment of 70s pop culture.

Although the Coen brothers billed O Brother, Where Art Thou? as an adaptation of The Odyssey (even giving Homer a screenwriting credit), it is really a series of comical vignettes tied together by an eclectic range of bluegrass tunes.

It is a film with a rather exaggerated, contrived (but humorous) plot. The pleasure to be derived from viewing it comes from its beautiful craftsmanship – the sumptuous cinematography and mise en scène – and the performances from its character actors.

Apart from George Clooney, who plays the excessively loquacious Ulysses Everett McGill, I don’t think there’s a more important “character” than the music.

Indeed, it almost seems at times as if scenes were written to accompany the music rather than vice versa.

Many films are set in the urban environment where most of us live, and cities can also sometimes seem like characters, even playing major parts.

New York is as familiar to us as many Hollywood stars.

Its dangerous streets and ornate buildings, its parks and skyscrapers are used so often in film they are in danger of being typecast.

It’s a good thing it is such a versatile performer, sometimes sophisticated and alluring (Manhattan, Wall Street, Moonstruck), sometimes dangerous and dishevelled (The French Connection, King of New York).

The city saddled with playing desperate, depressed or just plain worn-out characters has got to be Detroit.

Once described as the place where the American Dream broke down in the rain and rusted, the Motor City is usually depicted as some kind of hell.

In 2002’s Narc, its grey, drab, joyless streets play a rather malevolent role. The human equivalent might be portrayed by Christopher Walken.

It is a ghoul, a dark-hearted thug from which its denizens are incapable of breaking free.

In Robocop, Detroit is depicted as something of a lawless playground for various pimps, thieves and other assorted criminals ripe for the intervention of the strait-laced cyborg policeman. Yet in 2002’s 8 Mile not even the gritty Motown streets can hold Eminem back from his rhyming destiny.

A city doesn’t have to be real to play a character, either.

Consider the metropolis of Dark City in the film of the same name, Gotham in Batman, the hamlet in Sleepy Hollow, or the unnamed city in Se7en.

None are in fact genuine cities, but if anything this only heightens their status as characters rather than mere living spaces. Their shadowy nooks, gothic spires and spooky lanes seem to be inhabited by a nasty sense of foreboding – as if the streets know something the human characters in the films are yet to learn.

The cinema landscape, in fact, features many non-flesh-and-blood characters in a range of guises.

A boat, a building and a computer played important roles in Titanic, The Towering Inferno and 2001: A Space Odyssey, respectively.

On the elemental side, the ocean stole the thunder in Master and Commander: The Far Side of the World, The Perfect Storm, and White Squall.

Backdraft may have featured Robert De Niro, Billy Baldwin and Kurt Russell, but the real stars of the film were its many fiery tongues of flame.

In Twister a series of powerful gusts was the headline act.

Of course for these films to achieve any sort of realism, the special effects have to be top shelf.

In the past few years computer-generated effects have become such an intrinsic component of filmmaking that in movies such as The Mummy or The Matrix Reloaded they could be said to constitute a character all their own.

A character does not have to be corporeal.

In the romantic comedy Serendipity, a sense of fate, of destiny, plays perhaps the main character (certainly the title role).

The other side of fortune is bad luck, and in Intacto it is depicted as a palpable object that can be transferred from one person to another, like a disease, or a dread talisman.

One of the main characters in Final Destination is Death. Unlike the hooded Reaper of Bill and Ted’s Bogus Journey, or the spectral figure of Ingmar Bergman’s The Seventh Seal, it has no human guise.

Rather it is an unseen force that is not assuaged until its intended victims are taken.

And there’s a very big upside to this cast of characters.

They don’t have managers, make unreasonable demands, stay out late partying, want to direct, or ask, “What’s my motivation?” There are no tantrums, no bad days.

They are method actors, and always in character.