Apple has introduced a new AI-powered tagging system to improve app discoverability on the App Store, now available in the developer beta of iOS 26.
While these AI-generated tags are not yet visible on the public App Store or influencing live search results, they signal a significant shift in how apps may be discovered in the near future.
According to Apple, the system uses artificial intelligence, not traditional OCR, to extract context from screenshots, app descriptions, categories, and other metadata. This move was first announced at WWDC 2025, where Apple emphasized that developers no longer need to manually embed keywords into screenshots or descriptions to improve search visibility.
App intelligence firm Appfigures recently suggested that screenshot captions were influencing app search rankings, on the assumption that OCR was involved. Its conclusion that screenshots play a role was correct, but Apple has clarified that the tagging process relies on AI techniques rather than OCR.
The AI-generated tags are designed to categorize apps more accurately, potentially improving user discovery over time. Developers will be able to view and manage these tags before they go live, and human moderators will review them before final approval. While the system is still in beta, Apple is expected to roll it out publicly after further testing.
As the feature matures, understanding how these tags work — and which ones improve visibility — will become vital for developers aiming to boost app downloads.
Bijay Pokharel