To speak out or remain silent? Tech ethics still roils industry


“That feels unethical somehow, but I’m having difficulty explaining how. It just makes me feel like a bad person,” a Facebook software developer told their boss, according to court records.

The developer was being asked to make it harder for rivals to connect to Facebook by putting “restrictions on firms that might pose competitive threats” in 2013, the Federal Trade Commission says in an ongoing antitrust lawsuit. “I’m just dumbfounded,” a colleague replied in an email. Another said, “It is sort of unethical.” A third said, “I agree it is bad.”

The scenario is not unusual, ethics experts and tech workers say. Employees regularly encounter complex ethical questions around big data, computer privacy, artificial intelligence and other areas where the rules often haven’t been established, says Brian Green of Santa Clara University’s Markkula Center for Applied Ethics.

The stakes are high; the guidance is scarce. Twitter’s moderation of Donald Trump was a key factor in two presidential elections. Apple’s dispute with the FBI over accessing data from a shooting suspect’s iPhone set off a major privacy debate. Controversy derailed Google’s Ethical AI team for months.

When faced with unethical business practices, the question for many employees becomes: Do I speak out or remain silent, and what are the costs of either choice?

Green, the center’s director of technology ethics, says tech presents employees with new ethical predicaments they are neither prepared nor trained to handle. For example, social media companies have struggled with political disinformation, an age-old problem now supercharged by the speed of messages, the size of audiences and the revenue that can be gleaned from targeted, data-driven advertising.

“Propaganda has been a problem forever,” Green says. “But when you talk about scaling it up to millions of people, then it becomes a really huge problem.”

Computer science students now get a healthy dose of ethics at many top tech universities. The mission of Stanford’s Ethics, Society and Technology Hub is to “generate a fundamental shift” in how all faculty and staff “think about our role as enablers and shapers of scientific discovery and technological change in society.” Cal’s Center for Technology, Society & Policy has similarly lofty goals.

But perspectives can change once tech paychecks start rolling in. The estimated total pay for a Google software engineer fresh out of college is $198,873 per year, according to the careers data website Glassdoor.

The Facebook Papers

The “Facebook Papers,” internal documents released by whistleblower Frances Haugen and published by Gizmodo, offer an example of tech workers wrestling with an ethical issue of major consequence. The documents show employees struggling with political ethics questions in August 2020, three months before the presidential election.

“We’ve recently seen reports from internal employees expressing discomfort” because Facebook’s algorithm was recommending posts to users from “pages that post highly partisan content,” a company memo said. One post being debated by employees was not political, but it was from Ben Shapiro, a conservative pundit who had accused Facebook of censoring right-wing opinions.

Should the company allow its algorithm to recommend Shapiro’s posts, or shut down the recommendations because they led to highly partisan activity?

Despite the employees’ discomfort and the urging of some teams to take action, Facebook chose not to shut down the recommendations of right-wing pundits to users before the election, according to the documents published by Gizmodo.

Meta did not comment when asked questions for this article. The company does, however, assign human resources employees to help process personnel issues, including ethical complaints. Employees can also use a “speakup” hotline that allows anonymous reporting, and Meta tech projects often incorporate employee feedback throughout development.

Whistleblowing or silence?

But what happens when there is no adequate forum for employees to voice their discomfort at contributing to work they find unethical?

Employees stop resisting and glaze over, shutting down ethical instincts that previously defined their character and sense of identity, says Green. “People start ignoring things,” he says. “They start to normalize it, and large sections of the organization start going astray.”

Whistleblowers also risk being fired and ostracized, despite a new state law limiting how much companies can use nondisclosure agreements, the legal documents that have gagged tech workers for years, to keep employees from speaking up.

About 85% of whistleblowers suffer from severe to very severe anxiety, depression and other mental health issues, at levels comparable to what is experienced by cancer patients, a Dutch study found.

Apple whistleblower Cher Scarlett says she “slipped into a very dark place,” which led to an accidental fentanyl overdose. When Scarlett spoke up at Apple, saying the company was trying to stop employees from gathering salary data to investigate pay equity, a group of employees stood with her, though not for long.

“One by one, they all backed out,” Scarlett says. “And one of the last people said to me, ‘If it wasn’t just you and me, I would stay, but I can’t. I just can’t put myself in that position. And I admire you for taking this for all of us.’”

Scarlett doesn’t blame them. “I empathize. I get it. What happened to me, I wouldn’t wish on anyone else. But at the same time, if whistleblowers weren’t doing it alone, maybe the consequences couldn’t be so severe.”

Not everyone can live with holding their tongue. Before Susan Fowler blew the whistle on sexual harassment at Uber and elsewhere, she chose not to speak up about the harassment and sexism she experienced in academia, a silence she came to regret.

“Could I have helped shed light on the awful ways that some universities and colleges silence and retaliate against their students?” Fowler wrote in her book, “Whistleblower.” “I’ll never know. All I know is this: I didn’t speak up, because I was afraid. I didn’t do what I knew was right, because I was afraid. And I vowed that I would never make the same mistake again.”

How has speaking up about ethical issues become so frightening that whistleblowers have the same level of anxiety as cancer patients? How can tech make it easier for employees to come forward with concerns?

Emerging ethics forums

Employees on the forefront of new technologies should not face ethical issues alone, Green says. Companies must build into the development of new technologies the capacity to identify, surface and work through ethical issues. “If a company doesn’t have those things yet, then there’s a question of how do you bring it up?” Green says.

Also, fast-growing startups should assume they will encounter ethical issues for which they are unprepared. “They might actually have their own kind of lack of maturity or lack of understanding of how to deal with ethical problems just because they’re young and they lack experience,” Green says. “At some point, they need to become responsible adults.”

Green has worked with tech companies to “operationalize ethics,” building the capacity to handle ethical questions into company processes and creating a time and place where employees are encouraged to raise issues that, in the past, could be gut-wrenching to bring up.

Through Responsible Use of Technology, a program at the global think tank the World Economic Forum, Green has worked with IBM and Microsoft to create new processes for addressing ethical issues.

Microsoft set out to “shift a culture” around its artificial intelligence products, which had run amok in 2016 when a bot named Tay began spewing racist tweets after Twitter users hijacked its programming. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay,” the company wrote in a blog post.

The embarrassing incident raised a larger issue: Was Microsoft’s AI program rooted in safe and ethical practices — or was the company releasing AI that could unleash real danger on the world someday?

The company created “tools for responsible innovation” that must be applied to all new AI products, assessed AI employees twice a year on their grasp of bias and fairness, and mandated basic ethical AI training for all employees. The tools are available to the public.

The program was deemed a success by outside auditors, though not a perfect one. Some employees thought it was vague; others felt it didn’t apply to them. The company persevered.

UC Berkeley researchers found that Microsoft’s program “stands out because it provides a clear signal to employees, users, clients and partners that Microsoft intends to hold its technology to a higher standard.” The Berkeley scholars based that finding on interviews and conversations with Microsoft executives and employees, and a review of public documentation and media.

The research includes an insightful anecdote that shows the program’s promise and its limitations. In 2019, a group of 200 employees signed a petition protesting Microsoft’s sale of AI products to the U.S. military. “We did not sign up to develop weapons, and we demand a say in how our work is used,” the employees wrote.

Microsoft CEO Satya Nadella rejected their plea, saying, “We’re not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy.”

But employees were allowed to switch teams if they disagreed with a project for ethical reasons. Also, a working group created in Microsoft’s ethics program did reject other AI contracts the group felt could be misused, and the group continues to monitor the military’s use of Microsoft AI products.

Most importantly, Green says, Microsoft has built “a formal place to bring these concerns up.”
