We’ve already got ChatGPT on our phones, but what about an actual ChatGPT phone with AI agents rather than apps? My mind immediately jumped to one issue: the monthly bill for those AI agents.
OpenAI’s hardware aspirations are no secret, what with its public alliance with ex-Apple design guru Jony Ive and previous talk of a ChatGPT smart speaker, earbuds, or perhaps even a context-aware “pebble” that you’d carry in your pocket. So it’s hardly surprising to hear OpenAI might be working on a ChatGPT phone; indeed, the real surprise would be if OpenAI wasn’t working on one.
The latest ChatGPT phone chatter is coming from tech analyst Ming-Chi Kuo, who believes OpenAI is collaborating with chipmakers MediaTek and Qualcomm on a smartphone chip, while component manufacturer Luxshare would focus on the hardware.
This ChatGPT smartphone wouldn’t just be a phone with baked-in ChatGPT, Kuo says. Instead, the phone might do away with apps altogether, relying instead on teams of AI agents that continually do your bidding. Gone would be the grid of apps, Kuo predicts; in its place, a stream of updates from your mobile agents, detailing everything from your most important emails and updates on the stock market to the status of an agent-deployed project or suggestions for dinner plans.
Of course, we already have the powerful ChatGPT mobile app, which is capable of performing in-app agentic tasks, deep research, and even location-aware searches. But a made-by-OpenAI phone would allow ChatGPT and its agents to escape the sandbox, giving them system-wide access to the handheld and enabling even greater access to our every tap and movement, Kuo predicts.
A hypothetical ChatGPT phone with agents would doubtless be optimized for smaller AI models capable of running on a mobile chip, which is precisely what Kuo’s talking about in terms of an OpenAI partnership with MediaTek and Qualcomm. With a custom mobile processor optimized for local AI, on-device agents could handle more mundane duties like triaging your email or suggesting the best nearby lunch spots, perfect for keeping agentic costs down while (ideally) maintaining your privacy.
But as Kuo notes, a ChatGPT phone would surely be able to tap into the cloud for more complex tasks, such as building presentation decks, creating realistic images, constructing intricate mobile workflows, or running some other forthcoming killer application. And just like what happens with OpenAI’s Codex or Claude Code, you’d then be dealing with AI usage limits, which would now extend to your phone.
That’s a whole new revenue stream for OpenAI or any other AI provider who wants to make their own AI-powered phone. (Google has already taken baby steps in this direction with its Gemini-enhanced Pixel 10.)
Now, I don’t mean to suggest we’ll literally be seeing AI-agent line items on our Verizon bills (although now that I think about it, I can’t rule it out). Instead, I envision new and more expensive ChatGPT plans with larger AI usage limits to accommodate your ChatGPT phone, or perhaps a discrete AI usage bucket for mobile.
However it ends up working, AI charges for a ChatGPT phone would swell our already bloated monthly mobile expenses, which include data and cloud storage costs. And much like mobile data service is essential for an iPhone or Android phone, agentic AI service would be essential for a ChatGPT, Claude, or Gemini phone.
In other words, get ready to pay more. Maybe a lot more.