THE PERSPECTIVE PROJECT

Considering Content Personalization in a Digital Age

How do we know what we know?

In a modern world overflowing with information, how do we determine what is true and what we believe? Google lets us search for any query and returns a plethora of results in milliseconds, results that vary in accuracy and are often ordered according to our location or preferences. Facebook connects people across the globe, yet it can also limit what we see to only what we want to see. Philosophies such as hermeneutics and phenomenology assert that our experience forms not only our beliefs but also our reality and our understanding of the world. These ideas, developed in a pre-digital world, become even more significant with the advent of new technology. The internet introduces a platform of experiences that did not exist before, one in which online marketing tools can manipulate information to present the most appealing content to a user, shaping the way we perceive the world.

How might the internet change the way we see ourselves and others?

When we are born into this world, we are first communicated with through sound and body language. These tangible interactions, and the way we experience life, form our identities, which in turn shape our perception. According to ideas expressed by the French philosophers Jacques Derrida and Michel Foucault, “Different identities are formed mainly through social interactions and shared histories; that is, they are learned within certain cultural and political settings, rather than being set at birth… no one is born with a unified, inevitable identity; rather, a person’s identity is a product of, and in concert with, human culture…” (p. 51, Robertson). The way we establish our identities provides us with a set of beliefs about the world, founded in our experiences. Our beliefs provide a framework through which we process new information, and because each individual has different experiences, each may arrive at different beliefs; the same information may therefore be perceived differently from person to person.

This idea is reflected in hermeneutics, the theory and practice of interpretation. Two particularly relevant hermeneutical thinkers are Martin Heidegger and Hans-Georg Gadamer, who essentially placed human understanding into the context of experience. Heidegger asserted that “understanding isn’t an isolated act of cognition but part of human existence, emerging from the assumptions and opinions generated by our concrete experiences in the world. Understanding, then, is rooted in history and rooted in time: it is always embedded in the observer’s experience” (p. 123, D’Alleva). Gadamer, his student, expanded on these thoughts. Gadamer wrote that it is not only the individual experience that affects our understanding, but also historical context, which is “a stream in which we move and participate, in every act of understanding.” This “stream” establishes prejudice, which is “the basis of our being able to understand history at all” (Palmer).

If it is true that our identities and beliefs are informed by experience and social context, how might the digital world affect the way we perceive the world around us?

Much of our communication now happens online, on websites shaped by commercial motives. A website’s owner has a financial incentive to keep users on the site, and content personalization, the marketing strategy of tailoring content to a particular user, makes this possible. Especially in the case of social media, the more the content is personalized, the more time the user is likely to spend on the site. This exposes the user to more of the site’s advertising, which benefits the company. In 2012, Facebook conducted a study in which it “filtered users' news feeds… One test reduced users' exposure to their friends' ‘positive emotional content’, resulting in fewer positive posts of their own. Another test reduced exposure to ‘negative emotional content’ and the opposite happened” (Booth). After the study became public and was ferociously criticized for proceeding without user consent, Facebook maintained that its purpose was to improve the company’s services and make its content as engaging and relevant to the user as possible. Facebook clearly does not prioritize obtaining permission from its users, and it is likely that many users are not even aware that content personalization is applied to their accounts.
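
To make the kind of filtering described in the study more concrete, the following sketch (written in Python) shows one deliberately naive way a feed could be stripped of posts carrying a given emotional tone. Every detail here, from the word lists to the function names, is my own invention for illustration; it is not Facebook’s actual system, which is far more sophisticated and not public.

import random

# Invented word lists standing in for a real sentiment classifier.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify_sentiment(post_text):
    """Crudely label a post as 'positive', 'negative', or 'neutral'."""
    words = set(post_text.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, keep_probability=0.5):
    """Randomly drop posts whose sentiment matches `suppress`,
    reducing the user's exposure to that kind of content."""
    filtered = []
    for post in posts:
        if classify_sentiment(post) == suppress and random.random() > keep_probability:
            continue  # silently withhold this post from the feed
        filtered.append(post)
    return filtered

feed = ["I love this wonderful weather", "Feeling sad today", "Meeting at noon"]
print(filter_feed(feed, suppress="positive"))

Even a toy filter like this changes what a user sees without leaving any trace that something was removed.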

The process of personalization works roughly as follows: a Facebook user starts as a blank slate. The more information is gathered about the user, beginning with location, age, and gender, the more targeted Facebook’s content can be. It isn’t only the advertisements that are tailored by this information; so are suggested friends and the content of the user’s news feed. Information can be gathered from whom a user chooses to befriend and from which posts they click on or interact with. Regardless of how diverse a person’s friend group may be, the content they are shown will be targeted specifically to what Facebook’s algorithm has determined is most relevant to them. As the internet activist Eli Pariser writes, “Your identity shapes your media. There’s just one flaw in this logic: media also shape identity. And as a result, these services may end up creating a good fit between you and your media by changing… you. If a self-fulfilling prophecy is a false definition of the world that through one’s actions becomes true, we’re now on the verge of self-fulfilling identities, in which the internet’s distorted picture of us becomes who we really are” (Pariser). The algorithm behind content personalization may capture only a shallow version of our identity; even so, it keeps exposing us to the things we want to see and to content that reinforces our existing beliefs and personalities, further disconnecting and isolating us from perceptions unlike our own.
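
A drastically simplified sketch of this feedback loop might look like the following. The topics, weights, and scoring rule are assumptions invented for illustration; real ranking systems are vastly more complex and are not publicly documented.

from collections import Counter

class UserProfile:
    """Accumulates interest weights from a user's interactions (a 'blank slate' at first)."""

    def __init__(self, location=None, age=None, gender=None):
        self.demographics = {"location": location, "age": age, "gender": gender}
        self.interests = Counter()  # topic -> accumulated weight

    def record_interaction(self, post_topics, weight=1.0):
        """Every click, like, or new friendship nudges the profile toward those topics."""
        for topic in post_topics:
            self.interests[topic] += weight

def score_post(profile, post_topics):
    """Score a post by how strongly it overlaps the user's accumulated interests."""
    return sum(profile.interests.get(topic, 0.0) for topic in post_topics)

def personalize_feed(profile, candidate_posts):
    """Rank candidate posts so the most 'relevant' (i.e., most familiar) ones come first."""
    return sorted(candidate_posts,
                  key=lambda post: score_post(profile, post["topics"]),
                  reverse=True)

user = UserProfile(location="Chicago", age=24, gender="F")
user.record_interaction({"politics", "climate"})
user.record_interaction({"climate", "cycling"})

posts = [
    {"id": 1, "topics": {"climate", "policy"}},
    {"id": 2, "topics": {"sports"}},
    {"id": 3, "topics": {"politics"}},
]
print(personalize_feed(user, posts))  # climate and politics posts rise to the top

The essential point is the loop itself: every interaction feeds the profile, the profile determines what is shown next, and what is shown next shapes the interactions to come.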

On a superficial level, content personalization is rooted in the idea that as the amount of information online grows, it becomes difficult for people to sift through what is relevant. “Information overload is no longer just a popular buzzword, but a daily reality for most of us. This leads to a clear demand for automated methods, commonly referred to as intelligent information agents, that locate and retrieve information with respect to users' individual preferences…” (Pazzani and Billsus). It could thus be argued that content personalization benefits the user by eliminating superfluous content that would not have been of interest. Additionally, while it is easy to say that we are all open-minded and want to hear opposing viewpoints, this is not necessarily the case. Challenging the foundation of our core beliefs is unwelcome at best, and many people have trouble acknowledging that their ideas about the world may be wrong.

It is also worth noting how Facebook responded to the criticism that it creates an echo chamber, a term for the reinforcement of pre-existing beliefs through the repetition of information: the company claimed the echo chamber was largely constructed by users themselves. Zuckerberg, the platform’s creator, said that the problem is not a lack of diverse information on Facebook; rather, it is that “people don’t click on things that don’t conform to their worldview” (Shahani).

Facebook is not the only company that uses content personalization. In 2009, Google posted an article entitled “Personalized Search for Everyone,” which announced that the company would expand personalized search from signed-in Google users to everyone who uses Google search. As stated in the article, “This addition enables us to customize search results for you based upon 180 days of search activity linked to an anonymous cookie in your browser.” This means that as of 2009, an identical search query could return different results for different people, depending on the data Google had gathered about each of them.
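
What a phrase like “an anonymous cookie in your browser” might mean in practice can be sketched briefly. The example below is purely hypothetical: the identifiers, the 180-day window, and the word-overlap scoring are stand-ins invented to illustrate the idea, not Google’s implementation.

import time
import uuid

HISTORY_WINDOW_SECONDS = 180 * 24 * 3600  # roughly "180 days of search activity"
search_history = {}  # anonymous cookie id -> list of (timestamp, query)

def get_or_create_cookie(existing_cookie=None):
    """An anonymous identifier stored in the browser, standing in for a real cookie."""
    return existing_cookie or uuid.uuid4().hex

def record_query(cookie_id, query):
    """Remember what this (anonymous) browser has searched for."""
    search_history.setdefault(cookie_id, []).append((time.time(), query))

def personalize_results(cookie_id, results):
    """Boost results that share words with this cookie's recent queries,
    so two people typing the same query can see different orderings."""
    cutoff = time.time() - HISTORY_WINDOW_SECONDS
    past_terms = {
        word
        for ts, q in search_history.get(cookie_id, [])
        if ts >= cutoff
        for word in q.lower().split()
    }

    def score(result):
        return len(set(result["title"].lower().split()) & past_terms)

    return sorted(results, key=score, reverse=True)

cookie = get_or_create_cookie()
record_query(cookie, "vegan restaurants chicago")
results = [{"title": "Best steakhouses"}, {"title": "Top vegan restaurants near you"}]
print(personalize_results(cookie, results))  # the vegan result is ranked first

Under a scheme like this, two people typing identical words into the same search box are, in effect, querying two different search engines.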

"People don’t click on things that don’t conform to their worldview."

—Mark Zuckerberg

Alongside the enormous variety of information online, there is also information that is simply illegitimate. Fake news has severely impacted our political environment and may even affect the outcome of elections. People often believe and share false information online because these stories fit the picture of the world they already hold. On top of this, fake news does a fair job of masking itself as a reliable source.

Your brain filters out what it hears and selectively attends to that which confirms its original hypothesis. Resisting these instinctive behaviours takes education and deliberate practice.

–Daniel Levitin

The most concerning ramification of content personalization is the way it limits the information shown to an individual. At a time when people increasingly get their news from the internet, often through social media, it is important to consider the consequences of personalizing content. “Personalization filters serve up a kind of invisible auto-propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown” (Pariser). No individual can truly claim to know the full truth of the world, because truth is subjective, shaped by each person’s unique perception. But when content personalization reinforces our existing ideas, the differences between people’s ways of thinking deepen, and opposing viewpoints may be hidden from view altogether. The phenomenon of tribalism, which “reflects strong ethnic or cultural identities that separate members of one group from another, making them loyal to people like them and suspicious of outsiders,” also “undermines efforts to forge common cause across groups” (Kanter). In a world where human difference is already accentuated, the prospect of creating even greater divides should be deeply troubling. Democracy calls for a “reliance on shared facts; instead we’re being offered parallel but separate universes” (Pariser).

INTERVIEWS

In order to understand how beliefs differ, I decided to explore how the people around me understand and think about the world. Each interview responded to the same two-part prompt. The first part was a personal question, meant to humanize the video and ground the subject. The second part asked about a controversial issue in current events and was more political. This question was chosen because people are known to hold a wide range of views on the topic, and it was likely to spark diverse responses. I sought out respondents from different backgrounds.

1. What brings meaning to your life?

2. What are your thoughts on the United States' current stance on immigration?