Broadcom and CAMB.AI Launch Real-Time Audio Translation Chipset
Full Transcript
Broadcom has announced a partnership with CAMB.AI to introduce a new chipset capable of real-time audio translation directly on devices. This system on a chip, or SoC, is designed to handle translation, dubbing, and audio description locally, eliminating the need for cloud processing.
This innovation is set to enhance accessibility for consumers, promising ultra-low latency and improved privacy since all data processing occurs on the user's device. Additionally, the technology could significantly reduce wireless bandwidth usage.
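To make the on-device claim concrete, here is a minimal, purely illustrative sketch in Python of what such a pipeline might look like. The function names (transcribe_locally, translate_locally, synthesize_locally) are placeholders and not Broadcom or CAMB.AI APIs; the only point is that every stage runs on the device itself, which is where the latency, privacy, and bandwidth benefits come from.

```python
# Hypothetical sketch of an on-device speech-translation pipeline.
# All function names are placeholders standing in for models that
# would run locally on the SoC; nothing here is vendor API.

def transcribe_locally(audio_chunk: bytes) -> str:
    """Placeholder: on-device speech recognition."""
    return "recognized source-language text"

def translate_locally(text: str, target_lang: str) -> str:
    """Placeholder: on-device machine translation."""
    return f"[{target_lang}] {text}"

def synthesize_locally(text: str) -> bytes:
    """Placeholder: on-device speech synthesis (dubbing / audio description)."""
    return text.encode("utf-8")

def translate_audio(audio_chunk: bytes, target_lang: str) -> bytes:
    # Every step executes on the device, so no audio leaves it:
    # no cloud round trip (low latency), no upload (privacy, bandwidth).
    text = transcribe_locally(audio_chunk)
    translated = translate_locally(text, target_lang)
    return synthesize_locally(translated)

if __name__ == "__main__":
    dubbed = translate_audio(b"\x00\x01", target_lang="fr")
    print(dubbed)
```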
In a demonstration using a clip from the film Ratatouille, the AI provided audio descriptions in multiple languages, narrating the scene while displaying written translations on-screen, a capability that could benefit people with vision impairments.
It's important to note, however, that the demonstration was controlled, so the technology's real-world performance and accuracy remain uncertain. That said, the underlying voice model has already been adopted by organizations such as NASCAR, Comcast, and Eurovision, which lends it some credibility.
The chipset is claimed to support over 150 languages for on-device translation, though details about when it will reach consumer electronics such as TVs are still pending. The technology is currently in testing, so it may be some time before it becomes widely available.
Broadcom has also recently collaborated with OpenAI to assist in manufacturing AI chips, highlighting the company's focus on advancing AI technology.