Summary created by Smart Answers AI
In summary:
- Macworld reports that nearly 200 AI apps on the App Store expose sensitive user data through security vulnerabilities identified by CovertLabs’ Firehound project.
- The Chat & Ask AI app by Codeway exposed over 406,000 files containing user chats and personal information, highlighting significant privacy risks.
- Users should verify app security using Firehound before downloading and exercise caution when sharing personal data with AI applications.
AI apps are everywhere, and they sure seem like they can be incredibly useful, don’t they? However, users need to be mindful of AI slop, inaccuracies, and hallucinations, and it turns out a lot of AI apps are a security risk as well.
A new project by AI security firm CovertLabs examines AI apps in the App Store and indexes those that expose user data. The index, called Firehound, is available to view online and provides a tally of the files exposed by each app. Nearly 200 apps are listed in Firehound, and a large number of them are still available in the App Store.
There are tons of image generators, chatbots, and photo animators, exactly the kinds of apps people would be searching for. The app with the most files exposed on Firehound’s registry is Chat & Ask AI by Codeway, a chatbot that lists Deep Flow Software Services-FZCO as the seller. The app has exposed over 406,000 files, including user chats and personal information.
A January 20 X post by Harrris0n (whose bio includes a direct link to CovertLabs) states that the app’s “problem has been addressed, and the vulnerability no longer exists.” According to the App Store, however, Chat & Ask AI is still at version 3.3.8, which was released on January 7, and Firehound’s registry entry for the app is dated January 15, 2026, so it does not appear that a fixed version has been made available to the public.
CovertLabs
The purpose of Firehound is to let developers know that vulnerabilities have been found in their apps so they can be fixed. When visiting Firehound, a “Responsible Disclosure” pop-up appears (see above) that gives developers a way to contact CovertLabs, learn how to fix their app, and have it removed from the registry. Registration is required to access CovertLabs’ research and results.
Users can make good use of Firehound as well, checking it as a source on the security of an AI app they may be considering in the App Store. How these apps got onto the App Store with such security holes in the first place is unknown.
Firehound is a good reminder that all AI apps rely on personal information, and that users need to be aware of the data being provided and how much of it they are willing to expose. With AI being the new frontier, companies are quick to develop tools to stake a claim, but those tools may lack proper security safeguards.