“The Facebook Dilemma” Analysis

Most of the more positive implications of Facebook appear at the beginning of part one of “The Facebook Dilemma.”  These benefits are described alongside depictions of the steadfast ambition and “full steam ahead” attitude Facebook adopted in its early stages.  Zuckerberg and his employees would routinely remind each other of the goal of connecting the world, which at the time sounded unquestionably admirable.  “Move fast and break things” was the motto of the era, as growth in scale became the top priority.  That growth brought genuine benefits.  Employees were integrating and accommodating hundreds of languages, allowing Facebook to function as a platform open to cultures and people from all over the world.  Features were also added that made people feel more engaged in the daily discourse taking place on the platform.  The like button was a primary example: it let users directly relay their reaction to a post’s publisher, layering an exchange of expression on top of the exchange of information.  The global scale and community-building Facebook offered even contributed to the toppling of unfavorable regimes.  An example shown in part one of “The Facebook Dilemma” is the 2011 revolution in Egypt, in which Wael Ghonim played a central role.  Ghonim helped spark the revolution by establishing a Facebook page where people could share anecdotes and information about the abuses and failings of Egypt’s sitting government.  The protests that followed led to the resignation of President Hosni Mubarak, effectively resulting in regime change.  That a Facebook page could play such a role in events like these is evidence of the platform’s strength. 

However, other sections of part one and most of part two of “The Facebook Dilemma” complicate the idea that Facebook is an unambiguously beneficial platform.  Even two of the benefits cited above, the like button and the revolution in Egypt, came with downsides.  While the like button increased the sense of engagement on the platform, it also made it easier for harmful or inaccurate information to spread.  Facebook’s algorithm treats like count as a signal of favorability, on the assumption that heavily liked posts are posts people want to see.  That means any post can be promoted by the algorithm if enough users engage with it, which can lead to Facebook inadvertently amplifying false information.  One example was the “Pope endorses Trump” claim mentioned in part two of “The Facebook Dilemma.”  Despite being false, the story reached countless users because it was so widely engaged with.  As for Wael Ghonim and the revolution in Egypt, Ghonim appeared to retract his endorsement of Facebook after his newfound prominence led Facebook users to falsely accuse him of being an intelligence operative.  As part one of “The Facebook Dilemma” notes, Facebook prioritized user freedom in engagement and posting.  Those freedoms, however, can become double-edged swords when mishandled.   

Part two of “The Facebook Dilemma” addresses the major controversies that have plagued Facebook’s reputation: the 2016 US presidential election and the Cambridge Analytica scandal.  One issue surrounding the election was the allegation of Russian interference through Facebook.  As interviewees in part two of “The Facebook Dilemma” note, politically charged Facebook groups were being set up to encourage more passionate and extreme stances on social and political issues.  These groups existed for opposite ends of each issue and were artificially amplified in their extremity to sow division in the American voter base.  Because this all played out on Facebook, the platform was implicated as a guilty party as well.  While some of the hostility springing up around the election may have been manufactured, the fact that it could take place on the platform still serves as an indictment of Facebook’s structure.  The Cambridge Analytica scandal also damaged Facebook’s reputation because it demonstrated how data analysis firms could use Facebook data maliciously in ways the platform did not regulate.  Cambridge Analytica harvested user data from Facebook and used it to push political ads onto user feeds.  These ads were tailored to the emotional habits users demonstrated through their Facebook activity, effectively capitalizing on user behavior.  These tactics were conducted in the shadows and entirely without user consent.  This was a massive violation of user trust: an analytics firm was using patterns in user behavior to tamper with a presidential race.  There were also reports that Cambridge Analytica took liberties with the truthfulness of some of its promotional material, again demonstrating Facebook’s habit of mishandling the spread of fake news on its platform. 

While both parts of “The Facebook Dilemma” demonstrate some of the broad-scale implications of Facebook’s shortcomings, the personal impact that Facebook and platforms like it have on individual users is not their central focus.  This is where the third video, Dr. Cal Newport’s “Quit social media” speech, comes into play.  In the speech, Newport addresses common arguments made in favor of social media use and explains his disagreements with them.  One of Newport’s major points is his comparison of social media platforms to slot machines, which he uses to frame social media as a form of entertainment.  The difference he draws between the two, however, is the most important part of the point: physical slot machines are location-locked.  Players sit down, use them for a while, then stand up and leave.  Social media, by contrast, lives primarily on your phone and is therefore accessible at all times.  You can pull the metaphorical slot machine lever repeatedly all day, and the constant ability to pull out your phone and check social media also produces anxiety whenever you are not exercising it.  Another point Newport makes against social media use is that it is not the key to job prospects people claim it is.  Being active on social media, he argues, is not truly marketable precisely because it is so easy.  While it may be a way to find connections, the market is looking for demonstrable skills, and social media activity is not one of them.  Newport’s individual-oriented mindset differs from the grander implications drawn in “The Facebook Dilemma.” 

These videos are extremely valuable for considering how social media has evolved, because many of these problems persist in social media today.  I was never an active Facebook user, so I have no experience with that platform in particular.  I was, however, an active X user for a while, and I stopped using the site months ago for some of the reasons these videos mention.  Content moderation was a major problem on X while I was there, much as “The Facebook Dilemma” describes for Facebook.  The unregulated content I saw on X, however, was extremely harmful and malicious.  I would occasionally encounter overtly racist and sexist posts in my feed despite never engaging with such content, and I would get notifications and see posts about political issues that were verifiably incorrect.  These posts would have tens of thousands of likes, yet they remained on the platform with no action taken against them.  The platform carries the same energy Facebook had in its early years, the “move fast and break things” mentality, and it felt as though it lacked any consideration for the intent and nature of some of its content.  The problems outlined in “The Facebook Dilemma” and Newport’s “Quit social media” speech are still here and still poisoning these digital communities. 

