“Why do people do this?”
Good evening, Meteor readers,

As we amble our way towards November (it’s RIGHT there), I am thinking of entering a season of gratitude. With the news awash with tragedy—the mass shooting in Maine, the devastation of Hurricane Otis in Acapulco, and the continuing crises in Gaza and Israel—I am more aware than ever of how often I take for granted life’s small pleasant surprises. Let’s all choose to be better about that.

In today’s newsletter, we look at the inadequate gun laws that paved the way for the Maine shooting, share our weekend reading, and hear from one of the women who first blew the whistle on AI’s biased ways.

With love,

Shannon Melero

WHAT’S GOING ON

THE BAR WHERE PART OF WEDNESDAY’S MASS SHOOTING TOOK PLACE. (IMAGE BY SCOTT EISEN VIA GETTY IMAGES)

Lewiston, Maine: It started in a bowling alley on Wednesday night. Officials say that around 7 p.m., a gunman entered Sparetime Recreation and started firing rounds from an “assault-style weapon.” He eventually fled the scene and drove to a local bar to continue the shooting spree. So far, 18 people have been confirmed dead, with 13 injured. One of the wounded, a 10-year-old named Zoey, who was there with her youth bowling league, told a local news station, “I had never thought I’d grow up and get a bullet in my leg. And it’s just like, why? Why do people do this?”

The simplest answer to Zoey’s question is: because they can. The suspected gunman, Robert Card, is a firearms instructor and a U.S. Army Reservist. As early as this summer, military officials had “concerns” about Card and later reported him for erratic behavior while he was supporting summer training for cadets at West Point. Card was subsequently admitted to a mental health facility, where he claimed he’d been “hearing voices and threats to shoot up” West Point. Despite all of this, Card still had access to firearms.

That’s because in Maine—where gun laws are some of the most obscenely lax in the country—just about anyone can legally acquire a firearm. There are weak background check laws, no waiting periods, and no red flag laws, which would prohibit gun purchases or possession by anybody who shows signs of being a threat to themselves or others. (Hearing voices and threats to shoot up West Point would count.) Not only that, Maine has had a “permitless carry law” since 2015. This means that any person over the age of 21 “or at least 18 and active duty or honorably discharged military” who owns a gun can carry “loaded, concealed handguns in public without a permit or background checks.”

It should not take a mass shooting to enact the most basic gun safety laws. It’s too late for the 18 victims and their families, but it isn’t too late for Zoey, or the rest of us. You can learn more about safety initiatives in Maine from The Maine Gun Safety Coalition and check the laws in your state at Everytown for Gun Safety.

AND:
“The Past Dwells in Our Datasets”

Can AI ever be unbiased? MIT scholar Dr. Joy Buolamwini has some answers

BY REBECCA CARROLL

(IMAGE BY PARAS GRIFFIN VIA GETTY IMAGES)

In the celebrated Steven Spielberg movie A.I., set in the 22nd century, scientists create an android boy capable of experiencing human emotions. The film (and its all-white cast) tries hard to convince us that at its best, artificial intelligence can learn to love. That was 2001. More than 20 years later, we’re discovering that AI—which now mostly takes the form of computer programs rather than robots—often actually reflects real people’s worst biases. And those embedded prejudices are doing a lot of harm, argues computer scientist, digital activist, and “poet of code” Dr. Joy Buolamwini in her new book, Unmasking AI: My Mission to Protect What Is Human in a World of Machines.

Born in Edmonton, Canada, Dr. Buolamwini spent her early years in Ghana before moving with her artist mother and scientist father to Mississippi when she was four. Just five years later, she saw an MIT-made robot called Kismet on a PBS science program—and decided on the spot that she was going to attend MIT to study robotics. But once at MIT, she quickly realized her calling was much bigger. As she learned about the ways racial, gender, and ableist biases had crept into facial recognition technology, she launched the Algorithmic Justice League, an organization that uses art and policy to advocate for equitable AI systems. I talked to Dr. Buolamwini about not just the harm but the potential of AI.

Rebecca Carroll: You write that AI developers promise that AI will “overcome human limitations”—what does that mean exactly?

Dr. Joy Buolamwini: AI is presented as enabling humans to be more efficient, more productive, and to overcome human limitations. For example, some AI tools for hiring have been presented as an alternative to human decision-makers, who we know can be biased. The problem is AI tools are often created with datasets that reflect past decisions. So AI tools trained on past hiring decisions will reflect the preference and prejudice of the past and make them into current technologies. Amazon found this out when it attempted to make a hiring tool that was shown to discriminate against women. They ended up getting rid of the tool because even after they tried to address the bias, it still favored men. I think it is easy to assume technology will be more neutral than humans because technical systems are mathematically based, but it is important to remember that the past dwells in our datasets.

You talk in the book about the “arbiters of ground truth”—the idea that data or statistics offered through empirical evidence are the only real truth. How does AI complicate that view?

The more I do work on AI, the more I see the importance of storytelling. [One’s] lived experience matters. As a graduate student, I was reading about so many AI advances, yet I found myself coding in a white mask to have a computer see my face. I recorded that experience as something I call a counter-demo. Like a counter-narrative, a counter-demo captures an experience that challenges master narratives of who or what is considered normal or worthy, and in this case, master narratives about technological advances. My spoken word poem, “AI, Ain’t I a Woman?”, which is also a test of various AI systems, contains many counter-demos of tech companies failing [to identify] the faces of iconic women of color.
It challenges the narrative of tech superiority when we see AI failing [to recognize] the faces of Oprah Winfrey, Michelle Obama, Serena Williams, and more.

I smiled when I read about your first meeting with Timnit Gebru, where you noted that you both wore your hair natural—itself the subject of centuries-long racial profiling and discrimination. Are there specific policies you’re seeking to put into place to help fight against “the coded gaze”?

I dropped a few hair references throughout the book, and I am so glad you noticed! I think governments around the world can learn from the EU AI Act, which puts a specific ban on the live use of facial recognition technologies in public places. We can also stop police from using facial recognition for investigative leads, as this dangerous use of AI has already led to false arrests, including the arrest of Porcha Woodruff. AI-powered facial recognition misidentification led to her false arrest for carjacking by the Detroit police department. She was eight months pregnant, sitting in a holding cell and having contractions. Three years before Porcha’s arrest, Robert Williams was falsely arrested for theft in front of his two young daughters by the same Detroit police department. Despite ample evidence of racial bias in facial recognition technologies, we still live in a world where preventable AI discrimination is allowed. The excoded (those harmed by AI) include even more people falsely arrested due to AI, like Michael Oliver, Nijeer Parks, Randal Reid, and others whose names may never make headlines but whose lives matter all the same.

You call yourself “the poet of code”—is that a nod to the fact that you are, as you share in the book, the daughter of art and science?

Absolutely. Growing up, I saw art and science as companions, just like my parents. I also see the role of a poet as showing us perspectives that may be marginalized, ignored, or largely unseen. In my poetic work with evocative audits, I use the spoken and written word to humanize AI harms. Poetry takes me to places research papers cannot go.

When you were at MIT, you took a class with the Harvard professor Karen Brennan, who posed the question to you and the other students: “What will you do with your privilege?” How would you answer that same question today?

I answer it by using my platform to give voice to feelings and perspectives that might otherwise be suppressed. Here is a recent poem I wrote, called “Angels Awake,” on recent man-made and natural disasters:

Heart aching, tears breaking, cycles of pain continue. Spine snapped; hope broke… The earth trembling too. Geography and circumstance shaping the fate of precious souls. Today, I walked through beautiful gardens and sat in chairs of opportunity. My life blossoming as the petals of others fade. How can I not be compelled to use every privilege given to offer healing to deepening wounds? Yet, where do we start when the tsunami of history overwhelms? Seismic shifts are centuries in the making. As the night carries on, I remain awake and agitated, grateful to at least be alive to take another breath… A gift lost by many others far too soon. The aftershocks are coming. Where be our better angels now?

This interview has been slightly edited and condensed for clarity.

Rebecca Carroll is a writer, cultural critic, and podcast creator/host. Her writing has been published widely, and she is the author of several books, including her recent memoir, Surviving the White Gaze. Rebecca is Editor at Large for The Meteor.
WEEKEND READS 📚

On the river and the sea: A slogan of Palestinian freedom is once again under fire for potentially being antisemitic, despite many, including this scholar in 2018, having explained its origins. (Forward)

On romance: Who would have thought the savior of Bachelor Nation would be this guy? (Vulture)

On TV: Not in the mood to read? It happens. The HBO documentary No Accident, which follows a lawsuit brought by those injured while counter-protesting the 2017 “Unite the Right” rally in Charlottesville, is both riveting and informative.

FOLLOW THE METEOR

Thank you for reading The Meteor! Got this from a friend? Subscribe using their unique share code or sign up for your own copy, sent Tuesdays and Thursdays.