Even before this study it was noticeable to the naked eye. I remember a time when there was only one YouTube channel certified as a critic, and the barrier to becoming a critic was pretty high (you needed a certain number of watchers/readers/listeners).
After they lowered that threshold it was suddenly easier to get a fresh score. It’s kind of why I switched to only checking Metacritic when I’m looking for an aggregate, since there’s still a level of quality being preserved. Even then, MC used to be tougher before they added nerd outlets like IGN. Inception’s score is pretty low, but if it came out today it’d get at least an 80.
Per their FAQ...
Rotten Tomatoes is owned by Fandango, which is a joint venture of majority owner NBCUniversal, which is owned by Comcast, and minority owner Warner Bros., which is owned by Warner Bros. Discovery.
Surely no conflicts of interest here. Movies are great! Buy tickets now!
I don't love the site in general, but IMDb's scores have been the most reliable for me. If it's an 8, there's probably something to it that I will enjoy. A 7 is probably at least decent. 6 and below is a gamble. I don't feel like that has changed over the years, although I've not done any analysis, of course. Other scoring systems seem to require a more nuanced reading.
While all scoring systems are flawed and reductive, and you have to actually read reviews to get something substantial, over an 8 on IMDb or a 4 on Letterboxd is usually a decent indication of a good movie. RT is mostly just noise, as so many movies easily go above 80-90% fresh. Of course, that is down to the nature of how they score: it's a binary question of whether the reviewer recommends the film or not. So freshness doesn't distinguish much between a decent movie and a masterpiece, as both end up as a thumbs up.
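To make the binary-aggregation point concrete, here's a minimal sketch in Python (the scores are made up for illustration, not real review data):

    # RT-style freshness: the share of critics whose score clears some
    # recommendation cutoff, vs. a Metacritic-style average of raw scores
    # (here on a 0-10 scale for simplicity).
    def freshness(scores, cutoff=6.0):
        return 100 * sum(s >= cutoff for s in scores) / len(scores)

    def average(scores):
        return sum(scores) / len(scores)

    decent = [6.5, 7.0, 6.0, 7.0]        # a solidly okay movie
    masterpiece = [9.5, 10.0, 9.0, 9.5]  # a near-universal rave

    print(freshness(decent), freshness(masterpiece))  # 100.0 100.0 -- identical
    print(average(decent), average(masterpiece))      # 6.625 9.5   -- very different

Both films are 100% fresh, but their averages are nowhere near each other, which is exactly why freshness alone can't separate decent from great.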
This is interesting, and I don't want to complain, but I can't help myself. Looking at the graph at the bottom of the article, isn't the difference actually much more than 13 percent? It looks to me more like 13 percentage points, which would be somewhere around a 25% relative increase, give or take.
Also, isn't comparing Rotten Tomatoes and Metacritic in this context like comparing apples and tomatoes, or something? One is the percentage of a binary recommendation / no-recommendation question; the other is a weighted average of more granular scores. The two are very different things.
Or have I completely misunderstood?
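For anyone who wants the points-vs-percent arithmetic spelled out, here it is in Python, with the base rate eyeballed (52% is a guess made so the numbers work out, not a figure from the article):

    # Percentage points vs. percent: a jump from an assumed 52% to 65% fresh
    # is 13 percentage points, but a 25% relative increase over the base.
    before, after = 52.0, 65.0
    diff_pp = after - before                # 13.0 percentage points
    rel_increase = 100 * diff_pp / before   # 25.0 percent relative increase
    print(diff_pp, rel_increase)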
isn't comparing Rotten Tomatoes and Metacritic in this context like comparing apples and tomatoes
Pretty sure it is. A 5 pp. increase in Metacritic scores could very much mean that a movie crosses the recommendation threshold for ~25% of the critics.
Non-paywall link to the source article:
https://archive.ph/ymRfe
Side note: This isn't a study, but an opinion piece supported by some research, IMO.
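One way to sanity-check the threshold-crossing point above is a quick simulation. This is only a sketch: the normal distribution, the 60-point cutoff, and the starting mean are all assumptions, not how RT or Metacritic actually compute anything:

    import random

    # Assumed model: critic scores ~ Normal(mu, sigma) on a 0-100 scale,
    # and a review counts as "fresh" when the score clears 60.
    random.seed(0)
    CUTOFF, SIGMA, N = 60, 10, 100_000

    def fresh_share(mu):
        return sum(random.gauss(mu, SIGMA) >= CUTOFF for _ in range(N)) / N

    before = fresh_share(57)  # average score of 57
    after = fresh_share(62)   # same distribution shifted up 5 points
    print(f"fresh before: {before:.0%}, after: {after:.0%}")
    # With these assumptions the fresh share jumps by roughly 20 pp:
    # a 5-point shift in the average moves a big chunk of critics across
    # the line, in the same ballpark as the ~25% claim above.

Because so many scores sit near the cutoff, a small shift in the graded average translates into a large swing in the binary fresh percentage.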
On a related note, subtract 1 star from any video rating on Amazon Prime to get an idea of whether you want to watch it or not.
Or any one of their products. Or any review online, for that matter. They're all bought.
That's why old reddit, and to an extent current reddit, worked so well: users were often independent, and "the hivemind" counteracted some surface-level astroturfing.