Filmmakers don’t always get their own way.
Having helmed such critical and commercial successes as Thelma and Louise, Black Hawk Down and Alien, it seems remarkable now that director Ridley Scott didn’t have things all his own way with Blade Runner (1982).
After all, we are talking about an artist who has earned a formidable reputation as a creator of big-canvas filmic epics such as Gladiator. And it’s a given that he now has the prestige and the power to hold sway over any project he undertakes.
Yet early in his career while he was still establishing his name in feature film directing, Scott reluctantly made changes to a film many regard as one of the finest of the 1980s.
Even today, 30 years after it was made, the film’s look and tone, depicting a dystopian future in a dark, menacing and polyglot LA, still seem futuristic (and bleak).
After the film was shown to test audiences, however, the producers of the project were unsatisfied with the manner in which it was received.
Too confusing, test audiences groused.
The end result was that a couple of changes were ordered.
The original ending, for instance, didn’t gel with Scott’s vision; a key scene was omitted; and the voiceover supplied by Deckard (Harrison Ford) was also not to Scott’s liking. The film was released, met with a mediocre response and subsequently earned wider appreciation in arthouses and upon its release to video.
It was only much later, after Scott had underscored his considerable reputation as both a stylist and a maker of commercially successful movies, that he was finally able to release Blade Runner in the manner he originally intended it to be seen.
The director’s cut, as the name implies, is the version of a film that the director, rather than the studio, the producers or anyone else, wished to make.
Of course, long before Blade Runner Director’s Cut was released, various versions of films were available.
Sometimes different versions of a film are released in different countries to meet certain ratings standards, scenes are added or lost for television and so on. For instance, the Australian version of Any Given Sunday had about 10 minutes less gridiron footage than the US version.
And by 1992, a full 10 years had passed since Blade Runner had originally been released, time for a whole new audience to have grown up, ready to join admirers of the original.
So as much as Scott’s new cut represented the unveiling of an artist’s uncorrupted vision, it was doubtless a whole new revenue stream for a product that didn’t reach its potential upon release.
That would appear to be the category into which most directors’ cuts fall. Consider Steven Spielberg’s Close Encounters Special Edition, James Cameron’s director’s cuts of Aliens, The Abyss and Terminator 2: Judgment Day, and the anniversary editions of George Lucas’s original Star Wars trilogy.
It’s hard to imagine these titans of film not getting their way when their movies were first made. Rather, the cuts of their films are akin to an “added extras” version of a car.
Sure, viewers get a longer movie, but is that necessarily better? Is more more?
Often, sadly, no.
Take the director’s cut of Terminator 2: Judgment Day. Quite a few scenes run longer and several new ones have been woven into the fabric of the film. True, some add to our understanding of the characters. The downside is that a compelling film is slowed somewhat.
The same could be said for the director’s cut of Aliens, which seems ponderous when compared to the viscerally exciting original. We gain little from the additions, and those gains (for instance, learning Ripley had a daughter who died) are offset by the deleterious effect on pacing.
The latter cut of Blade Runner, however, takes on a whole different complexion following the changes made. Though the differences between the two versions are quite subtle, we’re clearly talking about two quite different films.
Doubtless the same would be true of cuts made by directors who were sacked from films they were working on – that is, if they ever had the chance to make them.
One interesting example might be a director’s cut of American History X prepared by the controversial Tony Kaye.
Kaye’s original vision for the film was wildly different from that of the producers and star Edward Norton, whom Kaye later referred to as “lice.”
In an artistic battle fought out in the Hollywood press, the studio claimed Kaye’s original cut was too short. Kaye countered by spending more than $US1 million editing the film and taking out advertisements outlining his stand.
It didn’t wash, and his relationship with Norton deteriorated.
“Norton’s ego and narcissism kind of manoeuvred it (the structure of the film) totally to his wants, really,” Kaye says. “Not so much through the shooting, because I got everything that I needed to get, but when it came to the editing process, he manoeuvred himself into the cutting room and really caused me considerable grief.”
So disenchanted did he become with the editing process that Kaye wanted to remove his name from the credits and replace it with Alan Smithee, the sobriquet customarily used when directors no longer want to be associated with a project.
When that wasn’t allowed, Kaye tried to get his name replaced with that of Humpty Dumpty, also to no avail. For Kaye, this act summarised his relationship with the producers, with Norton and with Hollywood.
“Actually when I pulled the word Humpty Dumpty out of the air, I didn’t realise that Humpty Dumpty is basically a metaphor for mankind,” Kaye says. “And to me, that’s not far away from this whole scenario, because truth and honesty and integrity and respect are not words that any of these people live by. And I think Hollywood right now, maybe it’s always been like this, but it’s really lost a sense of what reality is.
“And I believe that when you make a film or when you put a story in pictures and sound on a screen in a theatre, it has to be real. And if the filmmakers have lost the notion of what reality is and authenticity is, then that work can never ever be good. Because they’ve lost the intuitive sense of how to judge the work.”
Final say in the editing process is obviously the most important factor in determining how closely the film that’s released resembles what the director originally had in mind.
Historically it wasn’t unusual for the bean counters on a project to insist on changes, and Orson Welles was one auteur who constantly fought (and lost) artistic battles over his projects.
Even as long ago as when Welles was making films, a potentially powerful element was already part of the editing process: that of the test audience.
Australian director Richard Franklin (FX2, Psycho II, Hotel Sorrento) describes Welles’ The Magnificent Ambersons as “probably the major casualty of audience testing in film history.”
(See Franklin’s seminal article, Cinema Papers 95, October 1993).
One test audience member wrote on their card, “As bad if not worse than Citizen Kane.”
Another wrote, “Audiences want a laff,” and this response was given three times the weighting of another comment (saying) “possibly the greatest piece of cinema ever.”
A test audience wanted the song Over the Rainbow cut from The Wizard of Oz (1939) because the scene it appeared in was considered to slow the film down too much.
Supposedly a group selected to represent a film’s target demographic, a test audience can give a serious high five or thumbs down to characters, plot elements, music, or denouement of a movie.
A test screening audience didn’t like that Samuel L. Jackson’s sartorially resplendent character in Renny Harlin’s The Long Kiss Goodnight was rubbed out before the final credits rolled.
So even though earlier scenes had shown Jackson taking enough lead to kill his character several times over, the actors were ordered back to the set and new scenes showing Jackson’s character bravely surviving were shot.
A test audience was also responsible for having Glenn Close’s character in Fatal Attraction killed off.
In essence, test audience members are no longer regarded as people who get to experience a finished piece of art, but rather as consumers who, like diners in a restaurant, are able to send back what has been served to them if it’s not to their liking.
In Franklin’s opinion, studios using test audience results to ride roughshod over directors “is about power and not about art.”
He considers that allowing people who don’t understand the movie-making process to give their opinion on how a picture might be changed is like asking folks in off the street to try some amateur brain surgery.
On the other hand, Franklin is in favour of previewing that has meaning for him.
“That can be as simple as showing it to two or three trusted friends or as complex as screening it for an audience of 50 people in which half are known to me and half are not,” explains Franklin, who has even been known to stop the projector and ask questions during a screening.
“But never do I use the anonymous process. That is, someone has to give me a reason why someone won’t put their name on the form. If I don’t understand their comment or I think they’ve got a point, I actually follow them up and talk to them.”
Would Franklin make changes to his films based on comments from audience members?
“Of course, why would you have a screening otherwise?” Franklin says. “You don’t do it to make changes, you do it to learn about how your ideas are communicating. And sometimes you can achieve that best by just showing it to one friend.”
Australian director Phillip Noyce (The Bone Collector, Patriot Games, Clear and Present Danger) is one director not necessarily in favour of the testing process as it is used in the US.
“In the past 10 years, the studios used the test screening process as a baseball bat to knock the filmmakers over the head and beat them into homogeneity,” Noyce told The Age.
“Now it’s arguable, at least until recently, that a few hundred teenagers in the Valley in Los Angeles have as much power as the studio heads do in terms of finally affecting the movies that actually appear on screen.
“The test-screening process in America has been taken to the nth degree – and that’s n for nitwit.”
For directors without control over the final cut – and in a very competitive business it would seem there are many who fall into that category – the testing process could loom as a potentially intrusive element in the creative process.
Sure, filmmakers must be accustomed to their art form being a collaborative one, with an often eclectic group contributing to the end product. But when a test audience via a studio demands changes be made, it could hardly be considered a constructive element. Not for the director, anyway.
“It’s much easier to embrace the whole testing process when you know that you ultimately control the final cut on your movies,” director Ron Howard (Parenthood, Backdraft) once explained.
“But it’s frightening if you’re in a position where you’re going to show the movie at a preview and somebody else is going to take the results of that preview and re-cut the film based on that, maybe consulting you or maybe not. That’s terrifying.”
It’s easy to understand why a studio would want to control as much of the filmmaking process as possible, of course. So much money is invested in major Hollywood productions that the failure of even a single big-budget feature can have a devastating impact on a studio’s bottom line.
Little wonder, then, that every tool at a studio’s disposal is utilised to make a film “work”.
Would audiences have flocked to Pretty Woman in the numbers they did had Richard Gere and Julia Roberts parted at the end of the film before a test audience effectively changed the conclusion? We’ll never know.
On the other hand, countless changes to films have been made at the behest of test audiences that we’ll also likely never know about, for better or worse.
So is there some way to avoid Noyce’s description of the Hollywood process, where the testing system has an unhealthy presence?
How about director’s cuts for every director unhappy with the version that hits our screens?
Didn’t think so.
Perhaps the emergence of DVDs will go some way to helping viewers understand the difficulty directors (and studios) face in editing films.
When final cut of Hard Eight (aka Sydney) was taken away from Paul Thomas Anderson in 1996 and the film re-edited, the director was a virtual unknown.
But Anderson has since enjoyed a measure of critical and commercial success with Boogie Nights and Magnolia, and a director’s cut DVD of Hard Eight is now available, showing the film as it was originally meant to be seen.
Yet perhaps the onus for taking clout away from those California teenagers Noyce was referring to belongs to us.
Every time we stay away from a piece of formulaic, derivative dross, and on every occasion we embrace those films that defy convention, mess with the standard template and resist categorisation, we have a say in the type of films that will be made in the future.
This article first appeared in issue no. 127/128 Autumn/winter 2001 of Metro magazine.
You don’t hear so much about either audience testing or director’s cuts of movies these days. Perhaps Hollywood has become so risk-averse that the kinds of films that might elicit an unfavourable response just can’t be made inside the studio system anymore. Ridley Scott continued to fiddle-faddle around with Blade Runner until he produced The Final Cut in 2007. There is talk of a sequel in 2017 starring Harrison Ford and Ryan Gosling.