The Year of a Bunch of Totally Solid Movies

It’s a sentiment that has been expressed enough to seem worthy of note:

2014 perhaps wasn’t the best year for movies.

Compared to other years, and in particular to last year, which had both Gravity and 12 Years a Slave, should we feel shortchanged?

Well, first off, we need to distinguish two things when deciding our definition of a “good” year for film.

1. The actual quality of films released. Were there fewer good films released this year? How do the “best of the best” stack up against previous years (Oscar contention)? Was there an abundance of bad films? Was it a year of more mediocre movies?

2. The perception of the quality of films released. This takes into account the same variables above, except we need to add an element of how well publicized, distributed, and/or patronized the good movies were compared to the bad movies. Collectively, were bad movies more present than good ones? Were people simply unaware of the good films?

The Best of the Best

In our collective memory years from now, we’ll probably only remember the greats. So it’s worth exploring, independently of all the junk, the most highly-rated films of the year. This won’t tell us about the overall quality of films, but it will tell us whether this year produced films that will filter into our future canon.

For this analysis, I’ll be using Metacritic as a proxy for quality of film.

Why use Metacritic over Rotten Tomatoes? Basically, Metacritic answers more granularly “how good” the movie was, not just what percent of critics liked it. It also favors more established, well-regarded critics. There’s a long explanation you can check out, and you can also reference Metacritic’s own explanation of how they calculate the score. I’m not saying it’s better; it just tells you something slightly different.

Why use Metacritic instead of anything else? Well, mostly because it already exists, it broadly tracks movies, and we’ve all heard of it, so we have a common reference point.

Let’s just look at the Metacritic scores of the top 10 films by year, regardless of how widely distributed they were.

2005 2006 2007 2008 2009 2010 2011 2012 2013 2014
#1 100 99 96 97 94 99 95 95 100 100
#2 93 98 94 94 94 95 94 94 100 95
#3 92 91 92 92 92 94 90 92 97 95
#4 92 91 92 91 89 92 89 92 97 92
#5 90 90 91 89 89 91 89 90 96 91
#6 89 89 90 86 88 90 88 88 94 91
#7 88 89 89 86 88 90 87 87 94 90
#8 88 89 88 85 87 88 87 87 93 90
#9 87 89 88 85 86 88 87 87 93 89
#10 87 88 88 84 86 88 87 87 92 89
Average 90.6 91.3 90.8 88.9 89.3 91.5 89.3 89.9 95.6 92.2


So looking at the “best of the best,” this year was actually the second highest scoring year in the last 10 years, right behind 2013. That sounds like a pretty strong year for film! 2013 was an exceptionally strong year (the average is 3.4 points higher than 2014). Coming off 2013 probably skews our perspective a little about the quality of 2014.

As a side note, Boyhood has 100 on Metacritic. Both Gravity and 12 Years a Slave were less favorably reviewed at 97.

All the Rest

While it looks like some really great films were released this year, what about everything else? Maybe these best films were drowned out by a sea of terrible cinema. Below we can see the composition of films by Metacritic score each year (best films on the bottom).

[Chart: count of films by Metacritic score range, by year]


We could look at the quality of films at different thresholds. For the sake of choosing one, let’s use a threshold of above 70 on Metacritic for a good film (a score of 70 puts a film in the top quartile over the past ten years).
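As a rough illustration of how a quartile cutoff like that is derived, here’s a sketch using Python’s statistics module. The scores below are fabricated for the example; the real threshold would come from every Metacritic score across the ten years:

```python
import statistics

# Fabricated Metacritic scores for illustration only; the real
# calculation would use every film score from 2005-2014.
scores = [35, 48, 52, 60, 63, 67, 71, 74, 82, 91]

# The third cut point from quantiles(n=4) is the 75th percentile:
# films scoring above it sit in the top quartile.
q3 = statistics.quantiles(scores, n=4)[2]
print(q3)  # 76.0 for these made-up scores
```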

As a side note and reference point, the variability by year isn’t that huge, as you can see in the spread chart below (the middle blue section covers the middle 50 percent of scores; the gray bars extend to the top and bottom 1%):

[Chart: spread of Metacritic scores by year]

In fact, 2014 had the most films released above that score, even more than 2013. And the highest percentage of films with >70 score was this year.

Year  # of Films >70  % of Films >70
2005 89 24.5%
2006 88 22.2%
2007 94 23.9%
2008 60 17.4%
2009 86 26.2%
2010 76 24.1%
2011 93 25.6%
2012 98 26.6%
2013 100 27.1%
2014 104 28.7%


So 2014 has been the standout year for solidly reviewed films if we look at the top quartile on Metacritic.

Perception of Great Films

Another factor that may be contributing to the sense that 2014 wasn’t the best year for film is the amount of attention the good films got. There are different ways to measure this, but a simple way is to look at the number of theaters films were released in. For reference, here are some films released in 2014 and how many theaters they were released in:

The Amazing Spider-Man 2 4,324
Interstellar 3,561
Million Dollar Arm 3,019
John Wick 2,589
The Quiet Ones 2,027
The Theory of Everything 1,220
The Skeleton Twins 461
Foxcatcher 315
Under the Skin 176
Only Lovers Left Alive 95


For broad public consumption, anything under 1,000 theaters is a fairly limited release. With close to 40,000 theaters in the U.S., it represents only 2.5% of screens. If you live in New York, Los Angeles, or another major city, you’ll probably have access to those films; everyone else probably won’t be able to see them in a theater. (Of course, this barrier is being broken down by the advent of VOD.)

But as a rule of thumb, the wider the release, the more marketing, and hence the more public awareness. In the chart below, we can see the composition by distribution over the years for films that were reviewed above 70 on Metacritic.

[Chart: films rated above 70 on Metacritic, by number of release theaters, 2005–2014]


Looking at the dark green bars (that’s releases over 3,000 theaters), 2014 had the highest number of wide releases of solid films (tied with 2005) — many more, in fact, than 2013. Additionally, if you look at the entire bar (>100 theaters), which admittedly includes some really small releases, it’s higher than any other year. So you’d imagine the perception would be that there were a lot of good films available.

Maybe it’s that people didn’t patronize the good movies and thus have a sense that they paid for bad movies.

What if we looked at some measure of how much people spent on good movies versus bad movies?

In a sense, we’re looking at a dollar-weighted Metacritic score by year (each film’s share of the year’s box office revenue multiplied by its Metacritic score, summed across all films).

2005 2006 2007 2008 2009 2010 2011 2012 2013 2014
57.5 56.3 57.5 56.5 55.6 55.9 56.1 59.8 56.3 57.7


The table shows that in 2014, for every dollar moviegoers spent, they got a higher Metacritic score than in any other year except 2012.

In fact, in 2013, per dollar spent, people got a lower-quality film than in 2005, 2007, 2008, 2012, and 2014.
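The dollar-weighting described above can be sketched in a few lines. The grosses and scores here are made up for illustration:

```python
# Dollar-weighted Metacritic score for a single hypothetical year:
# each film's share of the year's total box office, multiplied by
# its Metacritic score, summed across all films.
films = [
    {"gross": 300_000_000, "score": 64},
    {"gross": 150_000_000, "score": 81},
    {"gross": 50_000_000, "score": 92},
]

total_gross = sum(f["gross"] for f in films)
weighted_score = sum(f["gross"] / total_gross * f["score"] for f in films)

print(round(weighted_score, 1))  # 71.9
```

Note that a big hit with a mediocre score drags the year’s number down, which is exactly why this metric tracks what people actually paid for rather than what was merely released.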

Circling Back

So why are people saying 2014 wasn’t a great year for film? Well, let’s review what we’ve already discovered:

  • There were more top-quartile films released than in any other year
  • A higher percentage of the films released were top-quartile than in any other year
  • The average score of the top 10 films was lower only than 2013’s
  • The number of top-quartile films distributed in over 3,000 theaters was higher than in any other year

So what gives? The answer lies more in a combination of breadth of distribution and critical scores. If we look at the number of films at incremental Metacritic scores, we notice something. Once we raise the threshold to about 80, this past year starts looking like not such a great year compared to others.

It also sort of explains the perception of 2013 as the standout year, which remains a standout above a threshold of 85.

So you can see that until we hit about 80 on Metacritic, 2014 looks like a pretty good year. Beyond that, it becomes just another year in film. Certainly, there were great movies that came out in 2014. And in fact, broadly there were a lot of really solid movies this year.

But perhaps we came out of the year lacking an overwhelming sense of having been given a selection of extremely high-quality films, because the great films weren’t released widely enough.

Characterizing 2014’s year in film

In aggregate, it was a very strong year, with well-reviewed films released to a wide audience.

Selectively, a handful of really amazing films were released in limited distribution.

Broadly, the truly amazing films weren’t distributed widely enough for people to see them, so as an audience, we were left feeling that we didn’t see the best of filmmaking this year.

How Much Is That Oscar Gold Actually Worth?

There’s no doubt that Oscars bring benefits: prestige, vintage Gucci dresses…Ryan Seacrest. But is there actually a financial gain from an Oscar at the box office? I’ve seen several attempts to quantify the value of an Oscar. Largely, the studies tend to agree there is a value to it in box office revenue and they come at it from a few angles.

The problem is that I believe there are some weaknesses in the analyses I’ve seen. These are the ones I’ve identified:

  1. Good movies will make more money. Many of the studies compare Oscar nominees and winners to all other movies. But this is flawed because that’s the equivalent of comparing really good movies to all movies, some good and others not. It doesn’t prove Oscar nominees make more money, only that good movies make more money.
  2. Marketing effects aren’t included. Studies don’t take into account the additional marketing costs that studios incur to promote Oscar movies. It’s hard to attribute financial value to the actual Oscar or the Oscar campaign.
  3. Pre- and post-theatrical revenues are misleading. Most Oscar nominees are released in the last quarter, which means they’re still in their theatrical run during the nomination process. Studies that compare the money coming in after the Oscar announcement don’t accurately reflect what it would have been otherwise.

Okay, so I’ll start off by saying, it’s pretty difficult to eliminate numero dos. Determining how much money is being pumped into a marketing campaign during awards season is a guess at best. But let’s proceed from here anyway. This analysis will look at the effect of a Best Picture Oscar.

Keeping Quality Constant

The first thing I wanted to do was hold quality of film constant (numero uno). The way I did this was to look at the Oscar nominees for the year and select a peer group of films from that year which met two criteria:

  1. Rotten Tomatoes score was at least as high as the lowest rated nominee (reviewed as positively).
  2. Number of critical reviews on Rotten Tomatoes was at least as large as the nominee with the fewest reviews (reviewed as broadly).
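Those two criteria amount to a simple filter. Here’s a minimal sketch with hypothetical records (field names like rt_score are illustrative, not from any real Rotten Tomatoes data source):

```python
def peer_group(films, nominees):
    """Select un-nominated films reviewed at least as positively, and at
    least as broadly, as the weakest nominee. Records are hypothetical."""
    min_score = min(f["rt_score"] for f in nominees)
    min_reviews = min(f["rt_review_count"] for f in nominees)
    return [
        f for f in films
        if f not in nominees
        and f["rt_score"] >= min_score
        and f["rt_review_count"] >= min_reviews
    ]

nominees = [
    {"title": "Nominee A", "rt_score": 88, "rt_review_count": 150},
    {"title": "Nominee B", "rt_score": 95, "rt_review_count": 210},
]
films = nominees + [
    {"title": "Peer", "rt_score": 90, "rt_review_count": 180},
    {"title": "Too few reviews", "rt_score": 92, "rt_review_count": 40},
    {"title": "Too low score", "rt_score": 70, "rt_review_count": 300},
]

print([f["title"] for f in peer_group(films, nominees)])  # ['Peer']
```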

Not comparing pre and post revenue and only looking at total revenue should also eliminate numero tres. I plotted those movies on a scatter plot and it looked like this:

[Chart: Oscar nominees vs. un-nominated peer films; bubble size = box office revenue]

The size of the bubble represents the box office revenue. You can see that on average nominees aren’t necessarily much bigger than un-nominated films in the peer group (in fact, the winner was relatively average on the graph).

Further, when I looked at all peer groups from 2000–2013, the average revenue for the un-nominated peer group was $257.5 million, compared to $257.1 million for nominees. That pretty much tells us there’s no difference between a “good movie” and a “good movie that gets nominated for an Oscar,” right? Well, not so fast.

Taking Into Account Budgets

The problem with this analysis is that films vary in size. 12 Years a Slave and The Avengers aren’t really in the same movie ballpark: different distribution, different budgets, different audiences. So I wanted a way to take that into account, and I did the same analysis looking at return on investment (box office revenue divided by production budget) instead of raw dollars grossed.
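The metric itself is trivial, just gross divided by budget, but to be explicit about what an “8x return” means (the figures here are made up):

```python
def roi_multiple(gross, budget):
    """Return on investment as a multiple of the production budget."""
    return gross / budget

# A hypothetical $18M film grossing $144M returns 8x its budget (800%).
print(roi_multiple(144_000_000, 18_000_000))  # 8.0
```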

The new graph looked like this:

[Chart: Oscar nominees vs. un-nominated peer films; bubble size = return on investment]

A little more compelling, I guess — the blue circles are starting to look bigger on average. From the graph alone, I’d say it’s inconclusive, but when I looked at the peer groups (starting in 2000), in fact, on average Oscar nominees have returned 6.6x compared to 4.9x for un-nominated movies. So over the longer term, it seems that Oscar nominations do help on a percentage return basis.

Breakdown by Budget Sizes

This got me thinking more about big budget films and Oscar nominations, so I looked at ROI by budget size among the different groups:

No Nom Nom Win Overall Number of Films
($1mm,$5mm] 840% 1650% N/A 1245% 10
($5mm,$10mm] 709% 915% 1385% 799% 28
($10mm,$20mm] 691% 853% 1548% 851% 37
($20mm,$50mm] 382% 528% 578% 446% 73
($50mm,$100mm] 388% 391% 648% 404% 50
($100mm,$200mm] 449% 398% 444% 434% 38
>$200mm 411% 593% N/A 451% 9
Total 493% 648% 987% 570% 245

It looks like movies above $100 million don’t necessarily gain financially from an Oscar nomination or win. If we take a step back, that makes sense. Those movies have been marketed on the side of every bus from here to Kansas City. They’ve probably hit their demographic already. An Oscar nomination isn’t going to convince that many more people to watch them.

Looking at the numbers though, it’s pretty clear that low and mid-size movies tend to gain a significant increase, especially below $50 million budgets. Oscars help tiny films, and that makes sense for the converse reasons that they don’t help big budget movies.

Even though I couldn’t remove numero dos from the equation, I think I answered another question:

Does it make sense to put that money into Oscar marketing?

Well, it seems if you’re a big budget film, probably not. You’ve probably already marketed the shit out of it, and you’re probably looking at diminishing returns on the marketing spend. But if you’re in the under $50 million budget arena, yeah, market the hell out of it for Oscar season. Get that win.

This is likely why distributors like The Weinstein Company place so much emphasis on the Oscars. Their films fall into the lower/mid budget range and get significantly more bump from Oscar nominations than movies like Avatar, for instance.

So how much is that gold actually worth?

So, strictly speaking, there isn’t a single dollar value that it brings. But at its highest, it can bump your movie from a 7x to a 15x return, which on a $10 million movie is an additional $80 million. That ain’t chump change!
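That back-of-the-envelope math, spelled out:

```python
budget = 10_000_000
nomination_gross = 7 * budget    # a 7x return: $70M
win_gross = 15 * budget          # a 15x return: $150M

# The win is worth the difference between the two outcomes.
print(win_gross - nomination_gross)  # 80000000, i.e. an extra $80 million
```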

The Nominees + Some Wild Speculation

So here’s the breakdown of nominees. Who should actually put money into getting the award? What do they each stand to gain? If we just look by budget and global gross to date, we can get an idea.

The Grand Budapest Hotel (Fox Searchlight)

Gross: $174,600,318; Budget: $27,000,000

A moderately priced movie, it’s already done about as well as movies in a similar position years past. I’d guess this may get a marginal bump. A win would probably give it an extra $15 million though.

Bottom Line: Worth keeping release active, but probably not worth spending too much campaigning. It may hit $200 million if it extracts everything out of the nomination.

The Imitation Game (Weinstein Company)

Gross: $83,660,241; Budget: $14,000,000

A moderately priced movie, the nomination will likely give it another $25 million. A win would probably give it an extra $15 million on top of that.

Bottom Line: Worth keeping release active or even going wider, but depending on how much Weinstein thinks it’s positioned to win best picture on merit, probably not worth spending too much campaigning. It should break $100 million.

Birdman (Fox Searchlight)

Gross: $34,492,455; Budget: $18,000,000

A moderately priced movie, the nomination should bring it to 8x, let’s say around $150 million. A win would mean 15x multiple, so around $300 million.

Bottom Line: Definitely worth going wider. It should reach an 8x multiple and is only at 2x. Probably worth spending money on campaigning and advertising a continued run. If it gets the win, it should mean a huge payoff. It should get to $180 million globally if it performs in line with past nominees.

The Theory of Everything (Focus Features)

Gross: $46,502,405; Budget: $15,000,000

In the same range as Birdman, it should gross around 8x, so let’s say around $120 million. Though it’s unlikely to win, it’s worth going wider to extract that extra nomination value.

Bottom Line: Definitely worth keeping release active or going wider. It should reach an 8x multiple and is only at 4x. Probably worth spending some money advertising a continued run alongside the nomination. It could get to $120 million if it performs in line with past nominees.

Boyhood (IFC)

Gross: $43,482,423; Budget: $4,000,000

A cheap movie, the nomination alone should give it another $25 million. This may be the first film in recent memory to win gold with a budget under $5 million. With the right distribution around the awards ceremony, it probably stands to gain from a win.

Bottom Line: Definitely worth keeping release active or going wider. It should reach a 16x multiple with the nomination alone and is only at 10x. Definitely seems worth spending money on campaigning and pushing a wide release run. I’d guess it’d get to $80 million. But who knows? It’s a wild card since no film this small has won Best Picture in recent memory.

Selma (Paramount)

Gross: $16,548,467; Budget: $20,000,000

A mid-priced movie, the nomination alone should give it a $20 million boost. A win would give it only a marginal extra $10 million. By the end of its run, it should gross about $100 million if it’s in line with previous years.

Bottom Line: Still in its run, the Oscar campaign will probably overlap with the existing marketing campaign. It should give good reason to go international. When all is said and done, it should get to $100 million if it performs in line with previous years.

Whiplash (Sony Classics)

Gross: $7,055,092; Budget: $3,300,000

An extremely cheap movie, the nomination alone is likely to bring it to a 16x multiple. A win is probably an extreme long shot, but a nomination is reason enough to go for a wide release.

Bottom Line: Definitely worth going wide (and global since it’s barely made any internationally) with it and putting some money into a campaign. It stands to gross around $50 million if it’s in line with past performers.

American Sniper (Warner Bros.)

Gross: $17,972,722; Budget: $60,000,000

A fairly expensive movie, but still early in its run. The nomination alone isn’t gonna give it a boost in all likelihood, but if it wins it could give it a 50% bump. When all is said and done, the film should gross around $250 million if it’s in line with other past nominees.

Bottom Line: Still in its run, the campaign for Oscar will probably overlap with its existing marketing campaign. Should hit $250 million if it performs in line with past years.

Black List events – in LA and beyond!

Hey guys, I’m Megan – Director of Events here at the Black List. An East Coast native, I moved to LA last spring after 10 years in NYC… so in other words, you don’t want to drive behind me on the freeway (unless you’re a grandma – in that case, we’ll get along great).

I run all of the Black List’s events, so if you haven’t met me out in LA already, you probably at least get my emails. We have a lot of great things in the works for 2015, so keep an eye out for our event schedule (coming soon!). We did just announce our kick-off happy hour on February 4th at Melrose Umbrella Company. Come and say hello to me and the team! Details are here. Come prepared to talk Academy Awards, because we’ll be hard at work on our Oscar pool…

And while you’re here, check out our new events page! It’s where we’ll be announcing all upcoming events and posting updates and photos from past ones. If you missed our 2014 live reads, not to worry – we’ll be announcing our next live read soon, so get excited. It’s going to be a good one! And for all you non-Angelenos, not to worry. We have some great programming in the works all over the country. We might be in your city sooner than you think!

Turning Words Into Numbers

Finally, my Excel skills are coming in handy!

Terry here. Among other things, I do data stuff at the Black List, so I’m gonna start adding fun posts using data (I’ll tag them under Words & Numbers). I’m pulling whatever data I can find (some stuff we have, some stuff that’s just floating out there) to get a better understanding of everything film and TV related. I’ll continue to search the internet for whatever I can find and create fun charts that everyone will love!

Like bar charts!


And pie charts!


And scatter plots!


I’ll take requests, so if you have ideas for movie and TV data studies, post some suggestions! Also feel free to recommend data sets that might be worth exploring. I’ll write up whatever I find from the data. And I should say this: I’ll publish what I find, but these won’t be academic studies, so excuse any errors or misattributions!

Since it’s Oscar season, my next post will be about the monetary value of an Oscar at the box office. Keep an eye out!