OpenAI launches GPT-4o, improving ChatGPT’s capabilities

SAN FRANCISCO, May 16, (AP): OpenAI’s latest update to its artificial intelligence model can mimic human cadences in its verbal responses and can even try to detect people’s moods. The effect conjures up images of the 2013 Spike Jonze movie “Her,” in which the (human) main character falls in love with an artificially intelligent operating system, leading to some complications. While few will find the new model seductive, OpenAI says it works faster than previous versions and can reason across text, audio and video in real time. GPT-4o, short for “omni,” will power OpenAI’s popular ChatGPT chatbot, and will be available to users, including those who use the free version, in the coming weeks, the company announced during a short live-streamed update.

CEO Sam Altman, who was not one of the presenters at the event, simply posted the word “her” on the social media site X. During a demonstration with Chief Technology Officer Mira Murati and other executives, the AI bot chatted in real time, adding emotion – specifically “more drama” – to its voice as requested. It also walked through the steps needed to solve a simple math equation without first spitting out the answer, and assisted with a more complex software coding problem on a computer screen. It also took a stab at inferring a person’s emotional state by looking at a selfie video of his face (deciding he was happy because he was smiling) and translated between English and Italian to show how it could help people who speak different languages hold a conversation.
