2014 Oscar Preview: Precursor Winners & Losers, FINAL

It’s the end of Oscar season and the awards have all been handed out. There isn’t much left to do but analyze just how accurate the precursors were. Here are the big winners & losers. This looks only at their winner selections; comparing nominees to the Oscars would be far more challenging and time-consuming. There really isn’t much commentary for these, so I’ll just highlight the rankings.
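The accuracy figures cited throughout are simple winner-match rates: correct winner predictions divided by the number of comparable categories. A minimal sketch of that calculation (the function name is mine; the sample figures are ones quoted later in this post):

```python
def accuracy(correct, total):
    """Winner-match rate: correct predictions over comparable categories."""
    return f"{correct / total * 100:.2f}%"

# BAFTA: 13 of 21 categories matched the Oscar winners
print(accuracy(13, 21))  # 61.90%
# Broadcast Film Critics: 7 of 19
print(accuracy(7, 19))   # 36.84%
```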

Big Winners

The Guilds Once again, the Directors Guild and the Cinematographers were the most accurate, correctly predicting their sole categories. The Art Directors and Costume Designers did equally well, though they each had three categories in which to project winners. The Make-Up and Hair Stylists Guild was also quite accurate, and the Screen Actors Guild went four-for-five.
Critics The best-performing critics group came in with a lowly 58.33% accuracy rate. The Dallas-Fort Worth Film Critics Association bested New York’s online critics (54.55%) and the Las Vegas critics (52.94%).
Other Groups USC Scripter was once again the sole perfect score among other groups, but they only have one category. The Spirit Awards, with the more frequent overlaps of recent years, came in second with 63.64% accuracy while the British Academy Awards predicted 13 of 21 categories for a 61.90% success rate.

Big Mehs

The Guilds The Writers Guild of America, thanks to its ineligibility ruling on Birdman; the Motion Picture Sound Editors, with two unmatched categories; and the Annie Awards, with an even split, turned in the only mediocre results, each matching 50% of their potential comparative predictions.
Critics Voting before trends emerge, the critics don’t always back the eventual winners, and this time they mostly lined up behind Michael Keaton and Boyhood, both big losers at the Oscars. That said, apart from the winners listed above, the St. Louis (48.28%), Nevada (46.15%), Houston (46.15%), Kansas City (45.45%), New York (41.67%), National Society (40%) and Georgia (40%) critics were the only ones that fared better than the worst of their compatriots.
Other Groups This year, you either did well among the other groups or you didn’t: all of the miscellaneous groups fell into the winners or losers brackets.

Big Losers

The Guilds The Visual Effects Society, Producers Guild of America, Cinema Audio Society and American Cinema Editors were the big losers this year, correctly predicting nothing at the Oscars. The CAS had only one category with which to do so, and the editors had only one that ever shows any correlation. The Visual Effects Society could have matched two categories to Best Visual Effects, but didn’t get either, while the Producers Guild of America had three categories that match up with the Oscars and not a single one carried over for a victory.
Critics It’s best to talk about the biggest losers here rather than, as a general rule, all the critics that fall into this category. The Online Film Critics Society had the dubious distinction of predicting just one winner, Best Supporting Actress Patricia Arquette. They had even gone against the J.K. Simmons love fest and given their award to Edward Norton. That leaves them with the worst average of the year, a terrible 7.69%. After years of strong overlap between their membership’s votes and the Oscars, this seems to be an unfortunate bump in the road. The National Board of Review also matched only one winner, but did so across fewer categories for an 8.33% match; 8.33% was also where the Indiana Film Journalists Association ended up. Rounding out the bottom five were the Southeast Film Critics at 18.18% with two correct predictions, and Toronto and San Diego tied at 20% each, with two and three correct predictions respectively. Also of note is the Broadcast Film Critics Association, which gives out the televised Critics Choice Awards. They often extol their Oscar accuracy, but matched only 7 of 19 categories, a paltry 36.84%.
Other Groups The worst group this time out was the Golden Globes, with only five of fifteen predictions carrying over. That’s a little harsh, since they have separate comedy and drama categories for three of those; take those three overlaps out and they would jump into the “Meh” category. As such, we go to the next group up and find the Online Film & TV Association matching only 7 of 21 categories for a 33.33% overlap. The Satellite Awards weren’t much better at 35%.

2 Comments

  1. Didn’t SAG go 5 for 5?

    1. I tabulated in a different format this year than I have before, so I’ll have to re-do my data since it seems I copied a winner down wrong somewhere along the way.