On Friday, Twitter closed offices in London and other locations and started laying off parts of its workforce. Among the first to go was the company’s entire ethics staff, including its celebrated Ethical AI team and recent TechPoint keynote speaker James Loduca, now the former VP of diversity, equity, inclusion and accessibility. 

This move would seem to signal that ethics is no longer a priority at Twitter, at least not in the near term under new management and not in the visible way it has been since 2020. Just two years after moving out of Twitter’s research silo to scale its work alongside product teams as a top company priority, Twitter’s Machine Learning, Ethics, Transparency and Accountability (META) team has been disbanded and its employees fired.

“Elon Musk firing the ethical AI team, policy team, etc is not surprising since his reason for buying Twitter was to put an end to their content moderation policies,” said Dr. Kirsten Martin, the director of the Notre Dame Technology Ethics Center (ND TEC), which recently launched a 15-credit undergraduate minor in tech ethics. “The problem is that the majority of users and advertisers like the content moderation policies. So, this may make him feel better in the short-term, but he risks making Twitter a second-rate social network like Gab or Parler.”

At a time when the ethical questions and challenges we face as a global society are rapidly evolving and becoming more complex, the world’s real-time micro-blogging giant, which enjoys near public utility status, has hobbled itself and erased its institutional memory. It raises the question: what obligations do Indiana tech companies, and the tech pros who work at them, have to responsibly build and use technology for the public good?

Hidden values encoded in the system 

“Headlines about the ethics of AI-enabled decisions sweep across both business and popular press,” said Martin. “We see various types of organizations and entities use algorithms for everything from hiring decisions and online content moderation to loan approvals and the allocation of vaccines, yet the understanding of how these systems make decisions and the values that may be encoded in them is often lacking.” 

A nationally recognized expert in privacy, technology, and business ethics, Martin is the William P. and Hazel B. White Center Professor of Technology Ethics and a professor of IT, analytics, and operations in Notre Dame’s Mendoza College of Business. Before entering academia, she spent eight years working in consulting and the telecommunications industry. 

“Students are taught the ethics of leadership and marketing, the perils of banking and the financial crisis, product quality and the Ford Pinto case, unfair labor practices and the implications of outsourcing,” said Martin, who also holds a Ph.D. from the Darden School and hosts the ND TEC podcast TEC Talks. “However, discussions of the most morally fraught business topic of our time—the ethics of how we use AI and other technologies—are regularly missing from undergraduate education, leaving our future leaders ill-equipped to make the decisions they will inevitably encounter.” 

Dr. Eugene Spafford, a professor of Computer Sciences at Purdue University, where he has served on the faculty since 1987, cited the code of ethics and professional conduct of the Association for Computing Machinery (ACM) as a good starting point for people who want guidance on tech ethics. ACM is the world’s largest educational and scientific computing society. 

“The first precept says to contribute to society and the human well-being, acknowledging that all people are stakeholders in computing, and the second ethical principle is to avoid harm. The fact that those are the first two principles says something about their importance,” said Spafford, who helped develop ACM’s code of ethics and is considered one of the foremost experts in academia in computer science and cybersecurity.  

Spafford, or Spaf (as he is known to his friends, colleagues, and students), is the founder and executive director emeritus of the Purdue Center for Education and Research in Information Assurance and Security (CERIAS). He is editor-in-chief of the Elsevier journal Computers & Security, the oldest journal in the field of information security.

Tech’s impact on fairness, power, privacy and autonomy 

“As a society we are undergoing some change. We’re going to have to develop some new thoughts about ethics and technology. It’s all part of the human condition and the evolution of what we do with communications,” Spaf said. “With machine learning and artificial intelligence, our biases, whether conscious or unconscious, often get built into these systems. For example, some photo and facial recognition systems were originally trained on white males, so they perform badly on women and especially on people of color. A bias was encoded into them.”

It may seem trivial to some, but it’s easy to see how tools and platforms that are supposed to bring us closer together or solve repetitive problems can quickly become exclusionary to certain populations when the teams that build the technology are overwhelmingly homogenous. 

“Technological advances have enormous potential to improve individual lives, increase the general welfare, improve quality of life, and reverse environmental degradation,” added Warren von Eschenbach, ND TEC’s associate director for academic affairs, who worked with Martin to develop the tech ethics minor at Notre Dame. “But achieving these goods isn’t a given. Rather, it requires that the development and application of new technologies be subject to ethical analysis and integration.” 

Spaf took it one step further and declared: “It’s never okay to say the computer did it.” 

“Well, almost never,” he clarified. “It’s a matter of who programmed the computer. We can’t remove the personal responsibility for that decision making. This gets even more critical when we talk about autonomous vehicles, whether they’re automobiles, aircraft or military systems. The safety of people can be badly impacted by biased programming decisions being encoded into the system.” 

Ethics checklists are fine, but inadequate 

“Obviously, each iteration of technological innovation is jolting,” Martin said. “But we’ve analyzed the ethics of bicycles, trains, automobiles, and the highway system, to name just a few, each of which was seen at the time to be equally groundbreaking, and we have all these great tools for doing so from business ethics and other ethical traditions. ND TEC’s undergraduate minor in tech ethics will help students apply time-tested theories concerning issues like fairness, ethics, power, privacy, and autonomy in the rapidly evolving technology space.” 

One speed bump many organizations run into with ethics is failing to consider the impact of a particular technology until the very end. But ethics is not something that can be adequately applied after the fact, according to Spaf. 

“Quality, security, privacy, ethics—all of those are things that need to be built in rather than added on with a checklist. And that involves training, that involves attention, and that involves having it as an essential precept as you go about building whatever you’re building,” Spaf said. “You can’t build something quickly to get it to market and then add on quality. It doesn’t work that way. It must be an objective from the very beginning if that is in fact a value that you want to hold.” 

As the social media platform that has long served as the real-time pulse of the world and what’s happening within it, Twitter itself valued immediacy and a frictionless path for everyone to create and share ideas. The META and Ethical AI teams, DE&I staff and others working on the ethical conundrums of that mission were viewed internally as a big part of how Twitter improved, rather than detracted from, a free and global conversation. It will be interesting, to say the least, to see what the company values under new management and whether it holds to, or strays from, instantaneous and egalitarian access to communicate with the world.