A recent report by the Associated Press has raised red flags about OpenAI's Whisper, a popular AI transcription tool used in various industries, including healthcare. Researchers have discovered that Whisper has a tendency to "hallucinate," introducing inaccurate information, including racial commentary and fictional medical treatments, into its transcripts.
The findings are alarming, especially considering Whisper's adoption in hospitals and medical contexts, where accuracy is paramount. A University of Michigan researcher found hallucinations in 80% of audio transcriptions, while a machine learning engineer detected inaccuracies in over half of the 100 hours of Whisper transcriptions they analyzed. Another developer reported finding hallucinations in nearly all 26,000 transcriptions created with Whisper.
OpenAI has responded, stating that it is "continually working to improve the accuracy of our models, including reducing hallucinations." The company's usage policies prohibit using Whisper in certain high-stakes decision-making contexts. Even so, the concerns raised by researchers highlight the need for caution and rigorous testing when relying on AI tools, particularly in critical industries like healthcare.
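The call for rigorous testing can be made concrete: before trusting automated transcripts in high-stakes settings, teams can spot-check samples against human reference transcripts using a word error rate (WER) metric. A minimal sketch in Python (the transcript strings below are hypothetical examples for illustration, not actual Whisper output):

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance (substitutions + insertions + deletions)
    divided by the number of reference words."""
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# Hypothetical spot-check: a hallucinated word ("antibiotics") was inserted
# into an otherwise correct transcript of a clinical note.
reference = "the patient was prescribed ten milligrams daily"
hypothesis = "the patient was prescribed antibiotics ten milligrams daily"
print(f"WER: {word_error_rate(reference, hypothesis):.2f}")  # one insertion over seven words
```

A nonzero WER alone does not distinguish a harmless mis-hearing from a fabricated medical treatment, so automated checks like this are best paired with human review of the flagged segments.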
The implications of these findings are significant, and the tech community is watching closely to see how OpenAI addresses these concerns and ensures the accuracy of its transcription tool.
Copyright © 2023 Starfolk. All rights reserved.