The Connection

The award-winning news site of Cosumnes River College

Annual symposium discusses ethics of artificial intelligence

Seth Henderson
Michael Pelczar, of the National University of Singapore, presents his remarks on the ethics of artificial intelligence at the 18th Annual CRC-CPPE Fall Ethics Symposium on Oct. 9 at Sac State. Day two of the event took place in the Winn Center at Cosumnes River College on Oct. 10, featuring a panel discussion on the use of AI in education and economics.

The Center for Practical and Professional Ethics at California State University, Sacramento and the Cosumnes River College Honors Program collaborated to host a two-day symposium on Monday and Tuesday that debated the ethics of artificial intelligence.

The first day of the 18th annual collaborative symposium took place on the Sac State campus with roughly 200 people in attendance. CRC Philosophy Professor Richard Schubert partnered with Sac State Philosophy Professor and Director of the Center for Practical and Professional Ethics Kyle Swan to organize this installment of the symposium series.

A panel discussed the possibility of creating technology with consciousness indistinguishable from that of human beings, as well as the potential harms and benefits of AI. The first concept was presented by Michael Pelczar, an associate professor of philosophy at the National University of Singapore.

“You hear a lot of conversation about the dangers that AI potentially poses to human beings,” Pelczar said. “You have the risk of AI harming humans. You also hear a good deal of talk about the risks of humans using AI to harm one another. What you don’t hear very much talked about is the risk of humans harming AI.”

Sac State Professor of Philosophy Matt McCormick discussed the reality of living with highly advanced technology that could be considered conscious or sentient. He said AI could soon have legal rights and be regarded as a citizen, which could bring about a moral singularity, collapsing the infrastructure of our civilization’s economy and culture.

“The moral singularity is a crisis where the evolved human equilibria of moral rights and obligations is inundated by the machines,” McCormick said.

Twenty-two-year-old Sac State history major Drew Harris said that AI was an interesting topic to talk about at the symposium. He said he was interested in the effect AI has on our morality and the ramifications or questions to consider as the technology develops.

Psychology and political science major Gabriela McMorris, 19, who is also in the CRC Honors Program, said she has participated in a few symposiums and came to understand AI as best she could.

“It’s always a good experience to hear from academic professionals, it’s a great opportunity. You get to see it live,” McMorris said.

Dr. Edwin Fagin, a CRC economics professor, said industries of science and technology will decode AI just like they did for DNA and it will become common knowledge.

“We’re just beginning to explore it, chart it and track it,” Fagin said.

The second day of the symposium took place in the Winn Center at CRC on Tuesday with roughly 90 people in attendance, featuring Rosolino Candela, Program Director of Academic and Student Programs at George Mason University.

Candela said his main topic of discussion was AI’s place in the market for allocating resources efficiently. He said economic markets are imperfect and that a process called economic calculation is necessary for maintaining them, as those calculations cannot be replaced by artificial intelligence.

Candela said it is difficult to determine genuine choice in AI because people know subjectively what they want. He said preferences would be given to an AI instead of demonstrated and interpreted by people.

The presentation ended with a panel discussion on the ethics of AI within higher education among the three keynote speakers, Honors Student Personnel Assistant Sopuruchukwu Nwachukwu, Christina M. Bellon, Sac State Associate Dean for Budget and Assessment in the College of Arts & Letters, and Sac State Lecturer Kevin Vandergiff.

“I think there was a general freak out about two years ago with the release of ChatGPT,” Bellon said. “Those of you who follow AI were aware that something was being developed and then the good developers of ChatGPT said, ‘Let’s just throw it out there for everyone to use. What could possibly go wrong?’ and everyone had visions of ‘Terminator’ and how it could take control, but the primary concern seemed to be around student cheating.”

Bellon said it is not only student cheating that could be a problem, but also faculty members cheating or taking shortcuts.

Bellon said AI is currently being used in education systems as a tool that improves assistive technology.

“It has the promise to improve currently utilized assistive technologies for disabled students, disabled faculty and disabled staff to better perform their jobs and promote their own success,” Bellon said.

Bellon said that if the inequities already existing in the academic world are not attended to, they will be exacerbated as AI becomes a tool in education.

“AI is not transparent. ChatGPT doesn’t cite its sources. So, I can’t cite the sources that ChatGPT used,” Bellon said.

Bettina Le, 18, a biology major, said she learned a lot more about AI but is still skeptical.

“You still have to use it in a smart way. It’s definitely not something you should play around with. It could be really beneficial if used correctly,” Le said.

The panel discussion on ethics in education ended the presentation and symposium with a Q&A session.

“I think with all AI and similar tools, it depends how we use it and what purpose we’re using it for,” Bellon said.

About the Contributor
Seth Henderson, Editor in Chief
Seth Henderson is the Editor-in-Chief for The Connection. He decided to join The Connection because he wants to become a reporter and broadcaster, hoping to work in Las Vegas for the Raiders and the NFL. He grew up in the Bay Area and is passionate about journalism, music and sports.
