Saturday, August 27, 2016

Ben-Hur: Social Media Monitoring Report

Source: Official Trailer



I recently read an interesting article, “The straight-washing of ‘Ben-Hur’: Remake of the ’59 epic drops gay subtext—and beefs up religious themes,” which inspired me to explore audience feedback about this film on social media during its opening weekend.  The film's opening weekend was just a week ago (Aug 19-21, 2016) and resulted in a low box office turnout of $11.4 million domestic, quite a poor outcome for a film with an approximate $100 million production budget.


Considering Approaches to Collecting Data
In considering ways to explore how the film was being discussed on social media, I initially looked at its Facebook page (Ben-Hur Film) and its Twitter feed.  As of this posting, approximately 226,000 people have liked the Facebook page.  I considered exploring audience feedback on the posted video clips, or on the film in general, with a focus on posts during the opening weekend, but this resulted in too large a sample for the purposes of this discussion.

I also considered Twitter and found several hashtags I could explore, including #BenHur, #BenHurMovies, and #BenHur2016.  However, looking at some of these, I found a mix of audience response, celebrity encouragement to go see the film, and tweets from the film's producers, such as Roma Downey (@RealRomaDowney).  This mix of voices would have been challenging to untangle, particularly as it might require investigating each tweeter to gauge whether they were connected to the film.

Rotten Tomatoes: Site of Examination
Ultimately, I decided to use Rotten Tomatoes, a film review website with an active audience participation pool that is kept separate from its pool of “critic” reviewers.  Looking at the page for Ben-Hur (2016), we are given preliminary feedback on the film's reception.  As seen on the “Tomatometer,” the film was rated at 28%, the percentage of approved critics who gave the film a positive review.  Meanwhile, 66% of audience respondents rated the film positively, where a positive audience rating is 3.5 stars or higher out of 5.  While these initial numbers are somewhat informative about critical and audience response, they do not dig into the opening weekend specifically; they reflect feedback up to the current time.

Source: Rottentomatoes.com
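
To make those two percentages concrete, here is a rough sketch of the audience-score arithmetic in Python; the function and the sample ratings are my own illustration, not Rotten Tomatoes' actual code.

# Rotten Tomatoes counts an audience rating as "positive" at
# 3.5 stars or higher (out of 5); the audience score is the
# percentage of ratings that clear that bar.

def audience_score(star_ratings):
    """Percent of ratings at or above the 3.5-star threshold."""
    positive = sum(1 for r in star_ratings if r >= 3.5)
    return 100 * positive / len(star_ratings)

# Hypothetical example: four of six ratings clear the threshold.
print(audience_score([5, 4, 3.5, 4, 2, 1]))  # ~66.7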

To explore audience feedback during the opening weekend, I looked at the audience reviews posted on Aug 19, Aug 20, and Aug 21, 2016.  It should be noted that these reviews are fairly immediate; this approach does not capture reviews posted in the days that followed by people who may have seen the film during the opening weekend.  The reviews collected from these three days resulted in a sample of 208 reviews (n=208).  Given the variety of review lengths and possible assessments, I chose to focus centrally on a quasi-measure of reviewer legitimacy/social engagement, on who was posting a review, and on their star rating.
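
For readers curious how a small hand-collected sample like this might be organized, here is a minimal sketch; the file name and column names are my own invention, not an actual published dataset.

import csv

# Hypothetical layout of the hand-collected sample: one row per review,
# one column per variable examined in this post:
#   screen_name, has_photo (yes/no), gender (man/woman/unknown), stars (0-5)
with open("benhur_opening_weekend_reviews.csv", newline="") as f:
    reviews = list(csv.DictReader(f))

print(len(reviews))  # would print 208 for the sample described above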

Findings
First, to assess the legitimacy and social media engagement of the reviewers, I examined their “profiles.”  Profiles are readily visible on the review page in the form of an image and a screen name.  Unlike on some social media platforms, these are not, by and large, full profiles in the traditional sense.  One cannot click through to learn more about a person; we are constrained to examining identity based on image and screen name.  As a quasi-measure of likely digital media engagement, I collected data on the number of reviewers who included a photo in their profile and those who did not.  I found that nearly three-quarters of the audience reviewers included an image, which may suggest a certain familiarity and engagement with both digital technologies and profile construction/meaning.
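
As a simple illustration of that tally (the values below are a stand-in, not the actual 208 profiles):

# Share of reviewers whose profile includes a photo.
has_photo = [True, True, True, False]  # roughly 3 in 4 had photos

share = sum(has_photo) / len(has_photo)
print(f"{share:.0%} of reviewers included a profile image")  # 75%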

Second, informed by the profile information, I explored the likely gender of each reviewer.  Given the limitations of this project, I focused centrally on the categories of men, women, and unknown.  Reviewers were categorized primarily by screen name, using personal knowledge of name gendering in a Western context.  For a number of profiles it was impossible to determine gender from the name alone; in such cases, if a picture was present, it was also used in the gender determination when appropriate.  This still left a number of reviewers as unknown.  Among those posting reviews on opening weekend on this website, approximately 60% were men, 25% were women, and 15% were of unknown gender (see image for precise statistics).
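
A minimal sketch of what such a screen-name heuristic looks like in code; the name lists are tiny illustrative stand-ins, and the actual coding in this project was done by hand.

# Guess gender from a screen name, falling back to "unknown".
MASCULINE = {"john", "michael", "david"}
FEMININE = {"mary", "susan", "linda"}

def guess_gender(screen_name):
    """Return 'man', 'woman', or 'unknown' from a screen name."""
    token = screen_name.lower().split()[0]
    if token in MASCULINE:
        return "man"
    if token in FEMININE:
        return "woman"
    return "unknown"  # a profile photo, if present, may break the tie

print(guess_gender("John D"))       # man
print(guess_gender("moviebuff99"))  # unknown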




Finally, I explored the star ratings these reviewers gave the film.  Ratings ranged from zero to five stars, with five being the strongest possible review.  Below you may see the breakdown of ratings.  The average rating was 3.5 stars, with a median of 4 stars.  Based on this sample and the Rotten Tomatoes rating system, 64% of the audience members who wrote a review on this website over the opening weekend liked the film.  This is highly consistent with the current rating, a week later, of 66%.  This said, the numbers are skewed in part by at least two reviews: one reviewer gave the film zero stars but had evidently not seen it, based on the written commentary “This won’t be very good. I am not going to watch it.”  Another gave the film zero stars but had extremely positive things to say: “I thought the film was awesome!”  These two reviews demonstrate some of the problems with this sort of data: some people may post feedback without being personally informed, and others may not fully understand how to assess or evaluate something in a system such as this.
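
To illustrate how a couple of zero-star outliers like these pull the average down while leaving the median intact, consider this sketch; the ratings list is illustrative, not the actual sample.

from statistics import mean, median

# Stand-in ratings; the two zeros mimic the problem reviews above.
stars = [5, 5, 4, 4, 4, 3.5, 3, 0, 0]

print(mean(stars))    # ~3.2, dragged down by the zero-star outliers
print(median(stars))  # 4, robust to those outliers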




Looking Elsewhere
Another obvious place one might look for film feedback is the Internet Movie Database, better known as IMDb.  On the IMDb page for Ben-Hur we see a score of 5.5 out of 10 stars, based on just over 3,800 user ratings at present.  It is interesting that one can actually dig into the rater statistics specifically.  Here, we can find information on rater characteristics such as sex, age, and U.S./non-U.S. users.  For example, we can see that, among raters whose sex (male or female) is reported, approximately 80% are male and 20% are female.


Summarizing Thoughts
Looking at the audience response to this film on this social media site and others offers perspective for various consumers and producers of media, particularly film in this case.  As consumers, before paying substantial money to see a film in the theater, many of us consult reviews from experts or from online audiences in general.  Certainly, we may weight expert reviews more heavily on average, but audience word of mouth can make a real difference.  There have certainly been films that opened to poor critical reviews but became hits with audiences; you've probably seen at least some of them!

 
The reviews for this film on Rotten Tomatoes offer a more positive response than those on IMDb.  However, the current overall numbers arguably align: an IMDb rating of 5.5 out of 10 stars against a Rotten Tomatoes audience score of 66% liking the film.  Indeed, if the opening weekend average of 3.5 stars is adjusted to a 10-point scale, it works out to 7 out of 10 stars.  When examining these numbers, we also need to consider the longevity and internet traffic of the respective sites.  IMDb was launched in 1990 and has an Alexa rank of 53.  Rotten Tomatoes was launched in 1998 and has an Alexa rank of 552.  Clearly, IMDb is more trafficked and thus likely more respected by the average consumer.  This said, there may be different types of audiences using these websites; perhaps Rotten Tomatoes is the “underground” reviewer locale?
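
That adjustment is just a doubling of the 5-point scale; a trivial check (the function name is my own):

# Rescale a 0-5 star average to a 0-10 scale for comparison with IMDb.
def to_ten_point(stars_out_of_five):
    return stars_out_of_five * 2

print(to_ten_point(3.5))  # 7.0, the opening-weekend average, IMDb-style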


Regardless of source comparisons, Rotten Tomatoes shows that the opening weekend's audience reviews were largely consistent with the ratings throughout the following week.  Despite mixed word of mouth across various media, this film is clearly appealing to a significant number of reviewers.  While not specifically assessed, it appeared anecdotally that those offering particularly negative reviews were often making comparisons to the classic Ben-Hur (1959), while those with very strong reviews tended to reference the film's action as well as its strong religious messages.


Beyond informing audiences, these reviews offer insight to film producers and companies as they consider remaking films, especially critically acclaimed films whose reputations persist.  An exploration of domestic versus international reception can also be part of the equation when weighing the financial implications of a remake such as this: while many Americans may be familiar with the 1959 version, foreign audiences may not be, and may receive such a film more openly, without the baggage of prior perspective.
