UC Merced Magazine | Volume XX, Issue VI

Not a lot of people are going to care if their weather report is written by a computer, but I would expect that a lot of people would not be OK with wedding vows being written by AI. — Professor Hanna Kiri Gunn

The new AI frontier means companies have weak obligations to disclose information to the government — and no obligation to disclose to the public. Furthermore, workers at AI companies are often bound by strong confidentiality agreements and are only protected as whistleblowers when they report illegal conduct. Employees who raise red flags over ethical concerns have no protections, Gunn said, largely because there hasn't been enough time to build legal guardrails around generative AI.

For instance, many workers have said they worry their products manipulate users with authoritative-sounding responses, but these interactions are not defined as human research or experimentation. A psychology study that aims to change your mind in the same ways, however, would be subject to ethical oversight, Gunn noted. AI products are immune from these research safeguards because they fall under "product development."

The question remains: Who or what is responsible when the consequences of AI aren't as intended?

"It's one thing when a computer system messes up. It's another when a person does," Gunn said. "You can hold a person accountable."

In 2021, the University of California became the first in the nation to adopt recommendations for ethical development and implementation of AI. Today, UC's AI Council is translating those recommendations into practice with training and tools to help staff, faculty and students judge the technology's opportunities and risks. These resources will be available later this year.

The Hidden Environmental Cost of AI

An often-overlooked consequence of AI is its environmental footprint. Gunn said water — a precious resource — is critical for cooling servers in the huge data centers that provide the massive computing power AI requires. Gunn points to a study out of UC Riverside that found that in 2022, the AI sectors of Meta, Microsoft and Google used the equivalent of one year of Denmark's population's water consumption — about 2 billion cubic meters. Researchers estimate AI's need for water could reach up to 12 billion cubic meters by 2027. It is estimated that for every 10 to 15 ChatGPT responses, a computer system "drinks" a 500-ml bottle of water.

A Need for Transparency?

Gunn said that when considering ethics, honesty and transparency aren't the "end-all, be-all." However, when it comes to AI and plagiarism, "it matters."

"Not a lot of people are going to care if their weather report is written by a computer, but I would expect that a lot of people would not be OK with wedding vows being written by AI," Gunn said.

"If you ask somebody, 'Why do you love me?' and they ask a computer to reply, that person is not telling you what they value about you," she said. "What that model will give is some assortment of what it's been trained to offer as responses to questions about why someone deserves to be loved."

Generative AI products learn by scraping the massive amounts of information available on the internet — everything from newspaper articles to "Pride and Prejudice" to Yelp reviews. Whether AI companies have a right to that information is a huge area of contention. Journalists, authors and musicians work in professions that pay for authentic writing and composition. The New York Times, Getty Images, Universal Music and a group of authors that includes John Grisham and George R.R. Martin have filed lawsuits claiming copyright infringement by AI companies.

Gunn said she asks her philosophy students not to use AI for class assignments.

"I want students to write essays so they can work through the difficulty of developing ideas, developing arguments and working on their ability to explain," Gunn said. "Using something like ChatGPT misses the entire learning opportunity."


UC MERCED MAGAZINE // ucmerced.edu
