A viral app called Neon offers to record users' phone calls and pay them for the audio so it can be sold to AI companies, and since its launch last week it has been rapidly climbing the ranks of free iPhone apps.
According to app intelligence provider Appfigures, the app already has thousands of users and was downloaded 75,000 times yesterday alone. Neon pitches itself as a way for users to make money by providing call recordings that AI companies can use to train, improve, and test their models.
But for now, Neon is offline, after a security flaw allowed anyone to access other users' phone numbers, call recordings, and transcripts, TechCrunch can report.
TechCrunch discovered the security flaw during a brief test of the app on Thursday. We alerted the app's founder, Alex Kiam (who had previously not responded to requests for comment about the app), to the flaw immediately after discovering it.
Kiam told TechCrunch later on Thursday that he had taken down the app's servers and begun notifying users that the app was being paused, but he stopped short of telling users about the security lapse.
The Neon app stopped working shortly after we contacted Kiam.
Call recordings and transcripts
The flaw was that Neon's backend servers did not prevent a logged-in user from accessing data belonging to other users.
TechCrunch created a new user account on a dedicated iPhone and verified a phone number as part of the sign-up process. We used a network traffic analysis tool called Burp Suite to inspect the data flowing in and out of the Neon app, which let us understand how the app works at a technical level, including how it communicates with its backend servers.
After we made some test calls, the app showed us a list of our most recent calls and the amount each call earned. Our network analysis tool, however, revealed details that are invisible to ordinary users of the Neon app. These details included the text-based transcripts of our calls and the web addresses of the audio files.
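As a rough illustration of how this kind of testing works (the domain, endpoint, and field names below are hypothetical, not Neon's real API), a proxy like Burp Suite lets a tester replay the app's own requests and read the full JSON response, including fields the app never shows on screen:

```python
import requests

# Hypothetical sketch only: the host, path, and field names are invented
# to show how a replayed API request can reveal more than the app's UI does.
API = "https://api.example-call-app.com"        # not Neon's real domain
TOKEN = "session-token-captured-by-the-proxy"   # copied from the intercepted request

# Replay the same "list my recent calls" request the app itself makes.
resp = requests.get(
    f"{API}/v1/calls/recent",
    headers={"Authorization": f"Bearer {TOKEN}"},
)

for call in resp.json().get("calls", []):
    # The app might display only the payout, but the raw response can
    # carry far more, such as a transcript and a link to the audio file.
    print(call.get("earned_usd"))
    print(call.get("transcript"))
    print(call.get("audio_url"))
```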
For example, we could see the transcript of a test call between two TechCrunch reporters, which confirmed that the recording had worked as intended.

But the backend server could also be made to return other people's call recordings and transcripts.
In one case, TechCrunch found that the Neon server could produce data about the most recent calls made by the app's users, including public web links to the raw audio files and the text transcript of what was said on each call. (The audio files contained only the recordings of the people who installed Neon, not the people they called.)
Similarly, the Neon server could be made to reveal the most recent call records, also known as metadata, of any of its users. This metadata included the user's phone number and the number of the person they called, as well as when each call was made and how much money it earned.
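A server that returns records keyed by an identifier without checking that the requester actually owns them is a textbook case of broken object-level authorization (sometimes called an insecure direct object reference). A minimal sketch of the ownership check that prevents it, assuming a generic Flask-style backend and invented data rather than anything about Neon's actual code, might look like this:

```python
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory stand-ins for a session store and a database;
# nothing here reflects Neon's real backend.
SESSIONS = {"token-alice": "user-alice"}
CALLS = {"call-1": {"owner_id": "user-alice", "transcript": "...", "audio_url": "..."}}

def current_user_id():
    """Resolve the caller's identity from their bearer token."""
    token = (request.headers.get("Authorization") or "").removeprefix("Bearer ")
    return SESSIONS.get(token)

@app.get("/v1/calls/<call_id>")
def get_call(call_id):
    record = CALLS.get(call_id)
    if record is None:
        abort(404)
    # The crucial step: refuse to serve a call record unless the logged-in
    # user owns it. Omitting this check lets any authenticated user pull
    # any other user's recordings, transcripts, and metadata.
    if record["owner_id"] != current_user_id():
        abort(403)
    return jsonify(record)
```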
A review of some transcripts and audio files suggests that some users may be using the app to make long calls that secretly record their real-world conversations with others in order to generate money through the app.
The app is shut down for now
Shortly after we alerted Neon to the flaw on Thursday, founder Kiam sent an email to customers telling them the app was being taken down.
“Your data privacy is our number one priority and we want to make sure it’s completely safe even during this rapid growth period. For this reason, we’re temporarily removing the app and adding a layer of security.”
Notably, the email did not mention the security lapse, or that the flaw had exposed users' phone numbers, call recordings, and call transcripts to any other user who knew where to look.
It's unclear when Neon will come back online, or whether the security lapse will attract the attention of the app stores.
Apple and Google have not yet responded to TechCrunch's outreach asking whether Neon complies with their respective developer guidelines.
This is not the first time an app has hit the market with serious security issues, however. Tea, a popular dating safety companion app, recently experienced a data breach that exposed users' personal information and government-issued identity documents. Popular apps like Bumble and Hinge exposed their users' locations in 2024. Both app stores also have to periodically purge malicious apps that slip past their app review processes.
When asked, Kiam did not immediately say whether the app underwent a security review prior to launch or, if so, who performed it. Kiam also did not say, when asked, whether the company has technical measures, such as logs, to determine if anyone else found the flaw before we did or if any user data had been stolen.
Additionally, TechCrunch reached out to Upfront Ventures and Xfund, which Kiam claims in a LinkedIn post have invested in his app. Neither firm responded to a request for comment by the time of publication.