Executives Acknowledge Generative AI Remains Largely Experimental


Executives at the Reuters NEXT conference in New York acknowledged that, despite the global sensation caused by ChatGPT, generative AI remains largely experimental, with limited practical applications.

Lessons Learned:

Anthony Aguirre, founder of the Future of Life Institute, highlighted the gap between generative AI’s raw capabilities and its suitability for specific purposes. While ChatGPT has impressed with its diverse outputs, its tendency to “hallucinate” erroneous information has limited its transformative impact across industries.

Challenges in Full Deployment:

Examples like self-driving cars illustrate the difficulty of moving experimental technologies to full deployment. The struggle lies in achieving reliability comparable to human performance, a challenge that has proved more formidable than anticipated.

Current Deployments:

Generative AI is finding utility in various applications, particularly in writing computer code. Microsoft’s Copilot on GitHub assists programmers by suggesting lines of code, enhancing productivity. AI-generated summaries of meeting transcripts also showcase the technology’s practicality.

Financial Sector Deployment:

Financial institutions are deploying AI models for coding, documentation generation, and capital deployment. While acknowledging the benefits, they are moving cautiously given the regulated nature of financial services.

Revolutionizing Coding:

Gary Marcus, a professor at New York University, noted generative AI’s revolutionary impact on coding. While the technology remains error-prone, the tech sector still benefits because programmers can troubleshoot faulty suggestions efficiently. In other business sectors, however, hallucinations pose serious challenges.

Caution in Integration:

Executives emphasized a slow and deliberate approach to integrating generative AI into accuracy-sensitive areas. Cisco’s Vijoy Pandey highlighted the need for technology, guidelines, and frameworks to guard against errors in critical business use cases, such as legal and security work.
