60th Anniversary First Presidential Lecture Series

Image credit: Giulia Giordo
AUP hosts "Creativity in the Era of Big Data" presented by Seth Farbman

In the 2021-22 academic year, The American University of Paris launched a lecture series called Technology and the Human Future, open to the public and presented via Zoom. The series introduces the context for AUP's new Master of Science in Human Rights and Data Science degree and will consist of eight lectures addressing both the theory and the practice surrounding technology and its impact on human beings and societies.

Three of the lectures are presented by AUP professors, while the rest are given by external professors and technology experts. Seth Farbman, an executive fellow at Yale University and former CMO of Spotify, opened the series with its first lecture. Farbman was named the World's Most Innovative CMO by Business Insider and, for the eighth year in a row, was ranked one of Forbes' Most Influential CMOs.

Screenshot taken during the lecture

Image credit: Giulia Giordo

 

The lecture began with opening remarks from AUP President Celeste Schenck, who introduced the concept behind the lecture series and Seth Farbman himself. Farbman opened his presentation by addressing whether technology's effect on the human condition should be understood as good or bad. He explained that it is unreasonable to label it either, because technology is evolving much faster than humans, which makes its effect on human psychology difficult to comprehend. He noted, however, that the rise of technology has brought a greater sense of decentralization, which has allowed people to become ever more connected with one another but is also directly correlated with the growth of propaganda.

Farbman went on to describe the interconnectivity of technology and how it has been facilitated by big data. An enormous amount of data affects our decisions, both knowingly and unknowingly, with 2.5 quintillion bytes consumed per day. He argued that all this data gives us almost unlimited access to information, yet our knowledge is not increasing at the same speed and could even decline because of how effective the technology has become. Farbman highlighted three main areas of technology that create and facilitate the consumer experience: personal data, algorithms, and facial recognition.

Europe has taken the lead in protecting personal data and prioritizing it as a right of the individual, as the EU's various data protection laws show. Even so, Farbman argued, society has entered, somewhat unknowingly, into a social contract in which individuals willingly share their personal data so that companies can use it on their behalf. In exchange, individuals gain access to services that are meaningful and targeted to them, such as ads suggested on the basis of their Google search history. He then turned to the philosophical questions personal data raises, asking, "Is the loss of personal data the loss of the core right to exist?" Farbman offered no definitive answer; instead, the question only accumulates further questions. What would happen if all our personal data disappeared? Would that make the world more transparent and replace narcissistic online behavior with understanding and empathy? Is it possible that there would be less confusion? Farbman responded, "No idea, but these are the sorts of questions."

Farbman stated that the personal data each member of society generates is only usable once it is applied to an algorithm. He explained that algorithms can play either a positive or a negative role in this so-called social contract within our connected society, and he described them as tools that reduce people to their least common denominator through social media bubbles, or filter bubbles.

Farbman then highlighted another crucial and evolving technology: facial recognition. He pointed to his own frictionless experience with U.S. Customs, where the government uses biometric technology to compare a photo taken at the customs booth with the one in the traveler's passport. Although facial recognition has positive effects, he argued, its negative effects are significantly more impactful. He showed a brief clip of MIT Media Lab fellow Joy Buolamwini, part poet and part data scientist, whose work examines the quality of the data sets behind facial recognition systems from large firms such as Amazon, Microsoft, IBM, Face++, and Google. She has found that these systems identify white men most reliably: the technology rarely handles a diverse range of faces well, and Black women are frequently misgendered by the software.

 

The second part of the lecture covered the story behind Spotify and how the record industry led to its formation. Before Spotify was created, record companies were losing their intellectual property through peer-to-peer file-sharing software such as Napster. Many companies sued, and the overall consensus became that the internet and music did not mix. Daniel Ek, an engineer and musician, wanted a way to listen to music legally in Sweden without violating piracy laws. Spotify was thus created to, in effect, decriminalize downloading and listening to music, something people had already been doing with software such as Napster. Farbman explained that Spotify was able to grow as much as it did because of its consumer model, later mimicked by the majority of technology companies. He called it the premium model: start with the audience and get as many people onto the service as possible. The more people, the more diverse the audience, and thus the more diverse the musical taste. All of this, he said, is possible because of big data, where data allows for creativity. One popular Spotify feature, created by accident, is Spotify Wrapped.

A post shared by Spotify (@spotify)

Wrapped was created in 2015 and provides a dashboard of listening behaviors and habits, not only at the individual level but also as a country-level summary. Farbman described it as "a gift": data with an emotional driver behind it, allowing people to reflect on their year. He concluded the talk by stating that although technology has negative consequences, he is hopeful about humanity, and that when data is true and passionate it should be considered a privilege.

The lecture ended with questions from professors Susan Perry, Claudia Roda, and Georgi Stojanov, as well as a few students. The first question concerned the ethical issues Spotify faces in responding to lawsuits involving popular musicians. Farbman explained that Spotify's view "has always been diversity of content" and that it is not the company's place to decide what should and should not be on the platform. He did, however, point out that Spotify works with numerous legal teams and experts, such as the Southern Poverty Law Center, to understand the context, stating that "they cannot make cultural choices from their own cultural decisions." Professor Susan Perry then asked what Farbman thought of the prohibition on the use of facial recognition technology in the current draft EU legislation. Farbman replied that "people who create technologies should not be left on their own on the application of technologies," with government and educational institutions acting as the major providers of unbiased research and context. Professor Perry responded by highlighting that the technological landscape is not safe, especially for students, given the lack of safeguards in place. Farbman closed the exchange by stating that it is crucial for students to focus on technology and on guardrails for technology, because "we cannot allow technology to find a market on its own."

Further questions followed. AUP President Celeste Schenck asked why Farbman centers his thinking about data privacy on the EU, where people are becoming ever more aware of big tech and its implications. He responded by returning to the conclusion of his talk: he is pinning his hope on citizens to make the right technological choices. Professor Georgi Stojanov raised further philosophical considerations, such as why people are being reduced to the least common denominator.

Farbman concluded the presentation by stating: "I place my trust in humanity. If I look at the long tale of humanity it is not a straight line and will always be an arc of humanity that bends ever so slightly towards justice and goodness. I think we are just in a very pivotal stage but I am going to trust humanity."

If you would like to participate in upcoming events, you can register on the AUP Presidential Lecture Series page.

Written by Giulia Camilla Giordo

International and Comparative Politics major with a minor in Information and Communication Technologies. Born in Rome, raised in Washington, D.C., and Prague. Who knows where I will be next...