Since my site began, I have tried to cover every precursor there is, from the guilds to the critics. This hasn’t always been easy, as new groups are founded all the time that I don’t hear about until years after they’ve started giving out awards.

Currently, there are a total of 54 groups within the U.S., two from the United Kingdom, and two from Canada that give out awards that in one way or another have an impact on the Oscars. The London Critics and BAFTA use similar release windows, and Canada receives almost identical release patterns for U.S. releases. That’s 58 groups that can impact the Oscars, a number that continues to increase each year, with the youngest group, the Atlanta Film Critics, starting out just this year. Back when I started my site, there were only 29 active groups, making today’s total exactly double what it was in 1996.

Best Picture has by far the best representation, as it’s the one category that every group gives out and that two guilds typically predict (one of these is the Screen Actors Guild Cast award, which has a debatable and frequently dubious connection to Best Picture, but I use it anyway). All of the rest are represented less evenly. Forty-four groups recognize Director and the four acting categories, while 43 give out documentary prizes. Forty give out awards for foreign language films as well as some form of screenwriting award.

On the low end, only one group (BAFTA) gives out a live-action short film award, two give out animated short film awards (BAFTA and the Annies), two give out a Sound Editing prize (MPSE & OFTA), and four honor work in Sound Mixing and Makeup. Costume Design, represented by nine groups, is the last category prized by fewer than ten.

Looking over these groups is sometimes instructive, especially when you compare results. While building statistical models requires a far greater level of detail, it’s always interesting to look at basic statistics, such as which Oscar winners won the most precursor prizes or which groups had the best predictive record. That’s what this article covers. My data only goes back about twenty years so far, but I continue working on entering new information. With that in mind, here are some notable figures for this year.
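The arithmetic behind these figures is straightforward: a winner’s share is the number of precursor prizes taken divided by the number of groups giving out that category, and a group’s predictive rate is its Oscar matches divided by its comparable categories. Here is a minimal Python sketch of both calculations; the function names are my own and the inputs are simply the two examples cited in the lists below, not the full dataset.

```python
# Minimal sketch of the two basic statistics discussed above.
# Function names are placeholders; the inputs come from the examples in the lists below.

def winner_share(prizes_won, groups_awarding):
    """Percentage of a category's precursor prizes taken by the eventual Oscar winner."""
    return 100 * prizes_won / groups_awarding

def predictor_rate(matches, comparable_categories):
    """Percentage of a group's comparable categories that matched the Oscar winner."""
    return 100 * matches / comparable_categories

# Casey Affleck took 38 of the 44 Best Actor precursor prizes ...
print(f"{winner_share(38, 44):.2f}%")   # 86.36%
# ... and the Southeastern Critics matched Oscar in 9 of 12 comparable categories.
print(f"{predictor_rate(9, 12):.2f}%")  # 75.00%
```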

Most Honored Oscar Winners (large representation categories)

Casey Affleck as Best Actor for Manchester by the Sea won 86.36% of categories (38 out of 44)
La La Land for Best Original Score won 68.97% of categories (20 out of 29)
Mahershala Ali as Best Supporting Actor for Moonlight won 68.18% of categories (30 out of 44)
La La Land for Best Cinematography won 60.61% of categories (20 out of 33)
Viola Davis as Best Supporting Actress for Fences won 54.55% of categories (24 out of 44)
O.J.: Made in America for Best Documentary Feature won 53.49% of categories (23 out of 43)
Damien Chazelle as Best Director for La La Land won 52.27% of categories (23 out of 44)

Least Honored Oscar Winners (large representation categories)

The Salesman for Best Foreign Language Film won 5% of categories (2 out of 40)
Hacksaw Ridge for Best Film Editing won 15% of categories (3 out of 20)
Emma Stone as Best Actress for La La Land won 15.91% of categories (7 out of 44)
Moonlight for Best Adapted Screenplay won 34.48% of categories (10 out of 29)
Moonlight for Best Picture won 37.78% of categories (17 out of 45)

Most Honored Oscar Winners (small representation categories)

“City of Stars” as Best Original Song for La La Land won 66.67% of categories (8 out of 12)
La La Land for Best Production Design won 52.94% of categories (9 out of 17)

Least Honored Oscar Winners (small representation categories)

Arrival for Best Sound Editing won 0% of categories (0 out of 2)
Fantastic Beasts for Best Costume Design won 22.22% of categories (2 out of 9)
Hacksaw Ridge for Best Sound Mixing won 25% of categories (1 out of 4)
Suicide Squad for Best Makeup won 25% of categories (1 out of 4)

2016 Best Predictors

Guilds:
Producers Guild – 66.67%
Annie Awards – 100% (includes 2 categories combined for 1 comparison)
Directors Guild – 100%
Art Directors Guild – 100% (includes 3 categories combined for 1 comparison)
Make Up Artists Guild – 100% (includes 5 categories combined for 1 comparison)
Visual Effects Society – 100% (includes 3 categories combined for 1 comparison)

Critics:
Southeastern Critics – 75% (9 out of 12 categories)
Georgia Critics – 73.33% (11 out of 15 categories)
Dallas-Fort Worth Critics – 66.67% (8 out of 12 categories)
Indiana Critics – 66.67% (8 out of 12 categories)
Iowa Critics – 66.67% (6 out of 9 categories)

Other Groups:
USC Scripter – 100% (1 out of 1 category)
Golden Globes – 72.73% (8 out of 11 categories; includes 6 categories combined for 3 comparisons)
Online Film & TV Assoc – 52.38% (11 out of 21 categories)

2016 Worst Predictors

Guilds:
Writers Guild – 0%
Cinematographers – 0%
Costume Designers Guild – 0% (includes 3 categories combined for 1 comparison)
Cinema Audio Society – 0% (includes 3 categories combined for 1 comparison)
Sound Editors – 0% (includes 6 categories combined for 1 comparison)

Critics:
San Diego Critics – 12.5% (2 out of 16 categories)
Los Angeles Critics – 21.43% (3 out of 14 categories)
San Francisco Critics – 26.67% (4 out of 15 categories)
Florida Critics – 33.33% (5 out of 15 categories)
London Critics – 33.33% (3 out of 9 categories; does not have 100% match in eligibility periods)
Vancouver Critics – 33.33% (3 out of 9 categories)

Other Groups:
Grammy – 0% (0 out of 3 categories, seldom matches due to 3-month gap in comparative eligibility period)
Spirit Awards – 27.27% (3 out of 11 categories)
Satellite Awards – 36.84% (7 out of 19 categories)
British Academy – 42.86% (9 out of 21 categories; does not have 100% match in eligibility periods)

Special Honors

Casey Affleck’s 86.36% success rate is among the highest recorded in the last 21 years, ranking 12th overall.

In making this comparison, I ignored any category that wasn’t given out by at least 50% (rounded down) of the groups, with all of the guilds counting together as one group. Note that the screenplay categories aren’t always broken down into original and adapted forms; if a film with an adapted script won a group’s combined screenplay prize, I list it as a win for Adapted Screenplay, and vice versa. The list below contains all Oscar winners that took at least 70% of their category’s respective prizes.
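For readers curious how that filtering and screenplay attribution might be carried out, here is a rough Python sketch under my own assumptions; the data layout and helper names are hypothetical and not the database behind the site.

```python
# Rough sketch of the rules described above, under assumed data structures:
#  - a category only counts if at least 50% (rounded down) of the groups give it out,
#    with all of the guilds counted together as a single group;
#  - a combined screenplay prize is credited to Adapted or Original Screenplay
#    depending on the kind of script the winning film had.

def eligible_categories(category_to_groups, guild_names, total_groups):
    """Keep only categories given out by at least half the groups.

    total_groups should already count all of the guilds as one group.
    """
    threshold = total_groups // 2  # "at least 50% (rounded down)"
    eligible = set()
    for category, groups in category_to_groups.items():
        non_guilds = len(groups - guild_names)          # critics and other groups
        guild_block = 1 if groups & guild_names else 0  # guilds count together as one
        if non_guilds + guild_block >= threshold:
            eligible.add(category)
    return eligible

def credit_combined_screenplay(winning_film, adapted_films):
    """Assign a group's single screenplay prize to the Adapted or Original column."""
    return "Adapted Screenplay" if winning_film in adapted_films else "Original Screenplay"
```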

Some categories are over-represented; Animated Feature accounts for a disproportionate number of these records. Some years saw no victor earn 70% or more of the precursor prizes, with 2011 and 1998 the only years that blanked. The Social Network remains the only Oscar winner with a 100% match rate.

Here are those winners in order from highest percentage of agreement to lowest. At some point, once I finalize the data collection, I’ll look back beyond 1996 as well as at the precursor winners most ignored by Oscar.

100%
The Social Network won 100% of 2010’s Adapted Screenplay prizes

90+%
The Incredibles won 94.44% of 2004’s Animated Feature prizes
L.A. Confidential won 94.44% of 1997’s Adapted Screenplay prizes
WALL-E won 93.33% of 2008’s Animated Feature prizes
Man on Wire won 93.10% of 2008’s Documentary prizes
Christoph Waltz won 92.11% of 2009’s Supporting Actor prizes
Helen Mirren won 91.18% of 2006’s Actress prizes

80+%
Eternal Sunshine of the Spotless Mind won 88.89% of 2004’s Original Screenplay prizes
Toy Story 3 won 88.24% of 2010’s Animated Feature prizes
Inside Out won 87.18% of 2015’s Animated Feature prizes
Ratatouille won 86.67% of 2007’s Animated Feature prizes
Casey Affleck won 86.36% of 2016’s Actor prizes
Sideways won 86.36% of 2004’s Adapted Screenplay prizes
Heath Ledger won 85.71% of 2008’s Supporting Actor prizes
Shrek won 84.62% of 2001’s Animated Feature prizes
Mo’Nique won 81.58% of 2009’s Supporting Actress prizes
J.K. Simmons won 81.40% of 2014’s Supporting Actor prizes
Gravity won 80.65% of 2013’s Cinematography prizes
Spirited Away won 80% of 2002’s Animated Feature prizes

70+%
Forest Whitaker won 79.41% of 2006’s Actor prizes
Philip Seymour Hoffman won 78.13% of 2005’s Actor prizes
Javier Bardem won 77.14% of 2007’s Supporting Actor prizes
No Country for Old Men won 75% of 2007’s Adapted Screenplay prizes
An Inconvenient Truth won 75% of 2006’s Documentary prizes
Juno won 74.07% of 2007’s Original Screenplay prizes
Kathryn Bigelow won 73.68% of 2009’s Director prizes
Crouching Tiger, Hidden Dragon won 73.68% of 2000’s Foreign Language Film prizes
Up won 72.73% of 2009’s Animated Feature prizes
Daniel Day-Lewis won 72.22% of 2007’s Actor prizes
Finding Nemo won 72.22% of 2003’s Animated Feature prizes
Slumdog Millionaire won 72% of 2008’s Adapted Screenplay prizes
Daniel Day-Lewis won 71.43% of 2012’s Actor prizes
No Country for Old Men won 70.27% of 2007’s Picture prizes
The English Patient won 70% of 1996’s Cinematography prizes
