
Tech expert gives AI wake-up call: 'Wolf' is here

With AI tools proving unreliable and not particularly truthful, is it time to pause the AI revolution? Tech experts and academics Pengcheng Shi and Gary Marcus discuss.

Over the next five years, hundreds of millions of white-collar workers around the world will see their jobs replaced by the tech sector's leading innovation: AI. But one technology expert has warned the threat is already here.

"Watch the society, the way we live, the way we work could be quite different," Rochester Institute of Technology Associate Dean for Research Pengcheng Shi told FOX Business’ Lydia Hu Thursday in a "Big Money Show" segment. "It's not crying wolf, per se, it isn't really a wolf at the door, because it is here."

Around 300 million full-time jobs could be affected by artificial intelligence and its ability to replicate basic workplace tasks, according to a new report by Goldman Sachs economists. They also estimated that AI could replace roughly 7% of the current U.S. workforce.

The industries facing the biggest risk of replacement include office and administrative support, legal, architecture and engineering, business and financial services, and sales, according to the research.


"They say anything that includes scraping for information, that's easily automated," Hu mentioned in her Thursday report. "So the input of data, reading from websites, think about intakes from clients, that's a huge part of what can be automated and streamlined."

But one New York University data scientist and professor has called for the world to pause the advancement of AI, recently signing a letter alongside 1,000 other experts and CEOs who want to see regulation.

"Short-term risk, there's a huge risk of misinformation at scale just completely disrupting our political process. There is a huge risk of cyber crime. Europol just put out a report about many of the different ways in which these systems might be used," Gary Marcus explained on "Cavuto: Coast to Coast" Thursday. "There's long-term risk in terms of the fact that we don't really know how to control these systems."

Marcus compared the rush to deploy AI to a classic cinematic thriller, "Jurassic Park": just because we could, he argues, doesn't mean we should.

"We have a huge number of unknowns here, of unknown unknowns," the professor said. "So we have this perfect storm of corporate irresponsibility, huge rapid deployment like we've never seen before, hundreds of millions of people using these things and basically no regulation. It's nice that Congress is talking about it, but there really isn't any regulation yet."

AI and associated tools like OpenAI’s ChatGPT have been described as "unreliable" and "not particularly truthful," Marcus noted. A Fox News Digital investigation recently showed the chatbot refused to write a political poem admiring Donald Trump, but happily obliged when asked to write one admiring Joe Biden.

Top entrepreneurs like Mark Cuban and Elon Musk have also raised concerns, warning that AI could "start taking on a life of its own."


"I would say that there's a big dream now of having chat engines do search instead of getting back into web queries. You get back a paragraph, the problem is it doesn't work right now. Nothing in the ban we're calling for prevents people from actually improving that process," Marcus said.

"The stuff doesn't really work yet. It's like driverless cars," he added, "in 2012, everybody's like, they're going to be here next week. Eleven years later, they're not. I think we're going to see the same thing with chat-style search. You can build a demo now, but whether you can make it so you can trust it, it's a whole nother thing."

