From the course: OpenAI API: Building Front-End Voice Apps with the Realtime API and WebRTC


Adding visualizations with the Web Audio API

- [Instructor] When you build an audio-based interface, it's best practice to provide some sort of visual indicator that audio is playing. That way the user knows that something is happening; and if their audio isn't working properly, they'll be able to troubleshoot. In this basic reference implementation, there's no such indicator, so the user has no idea if audio is playing or if their mic is working. That's why I built this second example, an audio visualizer. Here we have two visualizers, one for the AI and one for the microphone, to show the user exactly what is happening. If I create this connection, you'll see it in action.
- [Computer] Hello, how can I assist you today?
- [Instructor] Can you write a haiku about a duck?
- [Computer] Duck glides on pond, feathers-
- [Instructor] Okay, so what you saw here looked really advanced, but what's actually happening is I'm hooking into the Web Audio API. This is an API that exists in all modern browsers that allows a webpage to grab any…
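The transcript doesn't show the instructor's actual code, but a minimal sketch of hooking the Web Audio API into an audio stream might look like the following. The browser-only wiring (`AudioContext`, `AnalyserNode`, the render loop) appears in comments since it can't run outside a page; `frameLevel` is a hypothetical helper name, not from the course.

```javascript
// Browser wiring (illustrative — requires a page with an audio stream):
//
//   const audioCtx = new AudioContext();
//   const analyser = audioCtx.createAnalyser();
//   analyser.fftSize = 256;
//   // micStream would come from navigator.mediaDevices.getUserMedia({ audio: true });
//   const source = audioCtx.createMediaStreamSource(micStream);
//   source.connect(analyser);

// Pure helper: reduce one frame of byte frequency data (each bin is 0–255)
// to a single 0–1 loudness level that can drive a simple level-meter visual.
function frameLevel(bins) {
  if (bins.length === 0) return 0;
  let sum = 0;
  for (const v of bins) sum += v;
  return sum / (bins.length * 255);
}

// In the render loop (browser only):
//
//   const bins = new Uint8Array(analyser.frequencyBinCount);
//   function draw() {
//     analyser.getByteFrequencyData(bins);
//     const level = frameLevel(bins); // e.g. scale a bar's height or a circle's radius
//     requestAnimationFrame(draw);
//   }
//   draw();
```

The same pattern works for both visualizers in the demo: one `AnalyserNode` attached to the microphone stream and another attached to the AI's audio output, each sampled once per animation frame.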
