
OpenAI's ChatGPT unveils new human-like voice mode, now available to select Plus subscribers for testing.

How it works: Advanced Voice Mode is built on a sophisticated AI model; the user's voice input is converted into text using speech recognition technology before being processed as a prompt.

Demonstration: At the event, OpenAI staff showed how the new voice mode can hold human-like conversations, adapt to group settings, and adjust to different conversation styles.

Voice Mode: The older Voice Mode relied on three separate models: one to convert voice to text, another to convert text back to voice, and GPT-4 to process the prompt in between. The new Advanced Voice Mode replaces that chain with GPT-4o, a single multimodal model that handles these steps natively.
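As a rough illustration of how such a three-stage pipeline can be chained together, the sketch below uses OpenAI's public speech-to-text, chat, and text-to-speech endpoints. The model names, voice, and file paths here are illustrative placeholders, not OpenAI's internal implementation of Voice Mode.

```python
# Hypothetical sketch of the older three-model pipeline: speech -> text -> reply -> speech.
# Model names, voice, and file paths are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Speech recognition: convert the user's voice recording into text.
with open("user_question.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# 2. Language model: process the transcribed prompt.
reply = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": transcript.text}],
)
answer_text = reply.choices[0].message.content

# 3. Text-to-speech: convert the reply back into audio.
speech = client.audio.speech.create(
    model="tts-1",
    voice="alloy",
    input=answer_text,
)
speech.write_to_file("assistant_reply.mp3")
```

Each hand-off between stages adds latency and discards non-verbal cues such as tone and pacing, which is part of why a single multimodal model can feel more natural in conversation.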

How to use Voice Mode: Tap the voice icon next to the mic icon. On the next screen, mute/unmute with the mic icon or end the conversation using the red icon at the bottom right.

Tone of Voice Mode: The human-like tone of Advanced Voice Mode will remain a talking point throughout this alpha test, though the novelty and hype have been dulled by a nearly three-month wait.

User Experience: Those testing the service will judge Advanced Voice Mode based on its practical merits and whether it provides an authentic experience.

Availability: For now, the Her-like voice mode is available only to a limited group of ChatGPT Plus subscribers. OpenAI plans to expand availability based on user feedback.

Data Security: OpenAI prioritizes user privacy and data security in the Her-like voice mode, handling interactions under strict data-protection standards.