Conversation with Dr. Brandeis Marshall on data, deep learning, and their ramifications
Author: Ysabelle Kempe
Date: 07.28.20
The more automation, the better. At least that’s the prevailing belief in the computer sciences. In the broader world, however, this notion that technology is invariably “good” can have insidious effects, according to Dr. Brandeis Marshall, a computer science scholar and the CEO of DataedX.
“We want to have tech because it is neutral, objective, and unbiased,” Marshall told her audience at “Deepfake Technology: The (Mis-) Representation of Data,” a virtual event held on June 26 by Khoury College of Computer Sciences. “But this particular type of language and sentiment has made it easy for there to be possible encroachments on our human rights and privacy.”
Marshall’s talk explored the capabilities and consequences of deepfake technology, which uses artificial intelligence (AI) techniques to alter a person’s image, likeness, or voice. But her discussion expanded beyond deepfakes, examining how the culture within tech can lead to unintended consequences that harm and target marginalized groups, such as Black, Latinx, Indigenous, and Asian communities.
The concept of deepfake, defined
The word “deepfake,” Marshall said, synthesizes the terms “deep learning” and “fake.” The latter term is self-explanatory: media created with deepfake technology isn’t real. The former, “deep learning,” is a subset of machine learning built on neural networks, which pass data through a series of processing stages called “hidden layers.” As an example of deepfake media, Marshall showed a BuzzFeed video that looks and sounds like former President Barack Obama but was actually voiced by comedian Jordan Peele. While the video was made in jest, Marshall views deepfakes as a serious potential threat to society.
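To make the idea of hidden layers concrete, here is a minimal sketch of data flowing through a small network, written in Python with NumPy. The layer sizes, random weights, and input values are illustrative assumptions, not anything presented in Marshall’s talk.

```python
import numpy as np

# A minimal sketch of deep learning's core mechanic: an input passing
# through stacked "hidden layers." All sizes and weights here are
# arbitrary, untrained, and purely illustrative.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)  # a common nonlinearity applied between layers

# Random weights for a network shaped 4 -> 8 -> 8 -> 2.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 8)), np.zeros(8)
W3, b3 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    h1 = relu(x @ W1 + b1)   # first hidden layer
    h2 = relu(h1 @ W2 + b2)  # second hidden layer
    return h2 @ W3 + b3      # output layer (e.g., two class scores)

x = rng.normal(size=(1, 4))  # one example with four input features
print(forward(x))            # whatever these untrained layers produce
```

Training would adjust the weights to fit data; the point here is only the layered flow of information that gives “deep” learning its name.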
Deep learning, too, is problematic, she argued. “I think this whole concept of deep learning needs to be eradicated,” she explained, mentioning that there could be exceptions for artistic uses. From her perspective, “it can only cause harm since there are biases from the beginning which perpetuate and cascade throughout the process of deep learning.”
Deep learning mimics the human brain, and the human brain is fallible. If a person forms an idea based on false information, actions built on that idea will lead to false outcomes, Marshall said. The same flow of false information, she continued, can occur in deep learning if a model is fed incorrect or biased data.
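Marshall’s point about false inputs producing false outputs can be illustrated with a toy sketch: a model fit to skewed historical labels simply reproduces the skew. The Python example below uses synthetic, hypothetical data invented purely for illustration; it is not drawn from her talk.

```python
import numpy as np

# A toy illustration of "biased data in, biased outcomes out."
# All data here is synthetic and hypothetical.

rng = np.random.default_rng(1)

n = 100_000
group = rng.integers(0, 2, size=n)   # a protected attribute: group 0 or 1
qualified = rng.random(n) < 0.5      # true qualification, identical across groups

# Biased historical labels: qualified members of group 1 are frequently
# mislabeled as unqualified in the past decisions we "learn" from.
label = qualified.copy()
label[(group == 1) & qualified & (rng.random(n) < 0.6)] = False

# A "model" that learns each group's historical approval rate
# faithfully reproduces the bias it was trained on.
for g in (0, 1):
    print(f"group {g}: learned approval rate = {label[group == g].mean():.2f}")

# Prints roughly 0.50 for group 0 and 0.20 for group 1, even though
# the true qualification rates in both groups are identical.
```

A real deep learning model is far more complex, but the cascade Marshall describes is the same: whatever bias sits in the training data flows straight through to the predictions.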
In this way, Marshall said, innovation can actually stall or reverse progress. Facial recognition and predictive policing algorithms, which claim to predict whether a person will commit a crime, have track records of being discriminatory and biased against Black individuals, she explained. In Detroit, for example, police used facial recognition software that falsely identified Robert Julian-Borchak Williams, a Black man, as a shoplifting suspect. Despite his innocence, police held and interrogated Williams for 30 hours.
Critical thinking about bias and algorithms
Ashley Armand, an academic advisor in the Align program, collaborated with Marshall to hold this event at Khoury. Prior to the event, Armand stated, “In partnering with Dr. Marshall, I intend to facilitate a space where students are encouraged to think critically about reducing harm and eliminating inequity perpetuated by biased algorithms and design frameworks in tech, and to be leaders in shaping and maintaining more inclusive and introspective spaces where critical minds are championed.”
“I’d like for our students to be more cognizant of negative biases against marginalized populations, and how they permeate in algorithms that can severely impact individual lives,” Armand added.
Marshall’s work goes hand in hand with that goal, focusing on the racial, gendered, and socioeconomic impact of data in technology. She designs data science pedagogy for marginalized communities and assesses the sociotechnical implications of Black Twitter. Her talk at Khoury came at a particularly pertinent time, amid nationwide protests against anti-Black racism, a longstanding issue in the United States and around the world.
“Dr. Marshall has an incredible wealth of knowledge. From her talk, we gained essential tools that amplified our skill sets, and theoretical frameworks to identify racial biases within tech in a candid and illuminating way,” Armand said after the event. “Dr. Marshall’s talk shed light on this issue with a heartwarming playfulness that healed and inspired all of us in attendance.”
Marshall believes expanded policy is integral to combatting both algorithmic discrimination and the exploitation of deepfake technology. The number of patents associated with AI, she said, increased fivefold between 2004 and 2014. The European Union’s General Data Protection Regulation was created in 2016 to address users’ data protection and privacy, but similar oversight has not been created in the United States, Marshall said.
“The moment we are at in this country is trying to better understand technology and all of its implications and ramifications,” she explained.
She pointed to multiple active bills in the U.S. House of Representatives that address issues relating to AI, deepfakes, and facial recognition technologies. These include the Algorithmic Accountability Act of 2019, which would require companies to assess their algorithms for bias and discrimination, and the DEEPFAKES Accountability Act, which would require deepfakes to be watermarked and labeled as such. And the legislative efforts don’t stop there.
In the past six months, Congress has made multiple moves to limit government use of facial recognition technology. These include the Ethical Use of Facial Recognition Act, proposed in February 2020, and legislation announced in June that would ban government use of facial recognition and other biometric technology.
Policy, new initiatives, and education for change
In addition to policy, Marshall considers increasing diversity in the computer science industry vital to developing better tech. As a Black woman in the field, she knows what it feels like to “buck up against convention.” There are so few other Black women in computer science, she said, that she personally knows almost every one of them. In academia, too, Marshall has seen a lack of diversity: people who look like her are rarely elevated in traditional educational curricula. Khoury College’s Align program, the sponsor of Marshall’s talk, is designed to invite everyone into the study of CS.
“White people that I worked with are good people, but they don’t understand being Black,” Marshall said in an interview with Khoury News following the talk. She recounted experiences that illustrate the lack of respect and trust she has endured as a Black woman in computer science.
Marshall is determined to protect and elevate others in the field. She feels a responsibility to stand up for those who don’t have the economic stability to do so, such as students.
“Dr. Marshall looks like me,” said Sandra Kwawu (Align MSCS ’23), who attended the event because she aspires to work in data science and wanted to hear from a woman who excels in the field. “It gives me a sense of connection when listening to her share her work and knowledge.”
Bethlehem Mesfin (Align MSCS ’23) called Marshall’s candidness “a breath of fresh air.” Many times, Mesfin said, she walks away from an event feeling like she didn’t get the full picture. But this one made her feel like she was part of the conversation and provided her with resources to continue learning. For example, Marshall pointed attendees to #ShutDownSTEM, an initiative for Black lives within the STEM space.
“I aimed for students to feel community, and learn about how tech is affecting them — especially those who identify within marginalized communities,” Armand stated. “I am happy to hear that community was built, and that this event propelled the desire for continuous learning, activism, and engagement.”
While Marshall knows many students want to get involved in the fight for justice now, she advises them to do so while still playing the “long game.” Finish your education, obtain the credentials, and get in the room where the decisions are made, Marshall explained.
“That’s what we need,” she said. “The grassroots level of enough people having the knowledge base to therefore say ‘No, this is not right. And here are the receipts on why this is not right.’”