ChatGPT cites companies 70.4% of the time. Claude: 63.9%. Gemini: 57.6%. Same queries, same companies, different answers. Across 2,014 companies, I measured how each platform recommends brands. The divergence is bigger than I expected.
Want to see how each platform treats your brand? Run a free report at loamly.ai/check.
Platform Citation Rates: The Gap
Each Loamly brand report runs 48 queries across ChatGPT, Claude, Gemini, and Perplexity. For each query, I check whether the platform cites the company. Here are the averages across 2,014 companies:
| Platform | Avg Citation Rate | Std Deviation |
|---|---|---|
| ChatGPT | 0.704 | 0.124 |
| Claude | 0.639 | 0.202 |
| Gemini | 0.576 | 0.308 |
ChatGPT is the most generous recommender and the most consistent (lowest std deviation). Gemini is the stingiest and the most unpredictable (highest std deviation at 0.308). Claude sits in the middle.
The practical impact: a company could score well on ChatGPT and poorly on Gemini for the same queries. Optimizing for one platform does not guarantee visibility on the others.
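The numbers in the table above can be reproduced from per-company citation rates in a few lines. The `reports` structure here is hypothetical sample data standing in for the real 2,014-company dataset; only the computation matters:

```python
from statistics import mean, pstdev

# Hypothetical per-company citation rates (fraction of the 48 queries
# where the platform cited the company). Stands in for real report data.
reports = {
    "ChatGPT": [0.75, 0.60, 0.81, 0.70],
    "Claude":  [0.55, 0.80, 0.48, 0.72],
    "Gemini":  [0.10, 0.90, 0.60, 0.45],
}

def summarize(rates):
    """Return (average citation rate, population std deviation), rounded."""
    return round(mean(rates), 3), round(pstdev(rates), 3)

for platform, rates in reports.items():
    avg, sd = summarize(rates)
    print(f"{platform}: avg={avg}, std={sd}")
```

A high standard deviation, as with Gemini, means individual companies sit far from the platform average, which is exactly the unpredictability described below.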
Why They Disagree
Three factors drive the divergence:
Training data differs. ChatGPT's underlying models are trained on a broad web corpus, supplemented at query time by Bing search integration. Claude's training appears to weight long-form content and Wikipedia heavily. Gemini draws on Google's search index and Knowledge Graph. Each model "learned" about brands from different sources, so each carries different knowledge.

Recency windows differ. ChatGPT with web browsing pulls current search results. Claude's knowledge has a fixed training cutoff. Gemini integrates live Google Search. A brand that launched in 2024 might appear in ChatGPT and Gemini but not in Claude.
Recommendation style differs. ChatGPT tends to be generous with recommendations. Ask "best CRM for startups" and it might name 5-7 options. Claude is more conservative, often naming 3-4. Gemini is the most selective and the most variable.
Wikipedia Matters More to Claude
I checked citation rates with and without Wikipedia presence:
| Platform | With Wikipedia (Citation Rate) | Without Wikipedia | Relative Boost |
|---|---|---|---|
| ChatGPT | 0.761 | 0.692 | +10.0% |
| Claude | 0.728 | 0.622 | +17.0% |
| Gemini | 0.683 | 0.555 | +23.1% |
Claude shows a strong Wikipedia effect: companies with Wikipedia pages get cited 17% more often by Claude, well above ChatGPT's +10%. That fits the training-data picture, since Claude's corpus appears to weight Wikipedia and long-form reference content heavily.

Gemini actually shows the largest boost of all, both relative (+23.1%) and absolute (+12.8 points), but from the lowest base: without a Wikipedia page, Gemini cites companies only 55.5% of the time.
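The "Relative Boost" column follows directly from the two rate columns. A quick sketch, using the figures from the table:

```python
# Citation rates from the table above: (with Wikipedia, without Wikipedia).
rates = {
    "ChatGPT": (0.761, 0.692),
    "Claude":  (0.728, 0.622),
    "Gemini":  (0.683, 0.555),
}

def relative_boost(with_wiki: float, without_wiki: float) -> float:
    """Relative boost = (with - without) / without, as a percentage."""
    return round(100 * (with_wiki - without_wiki) / without_wiki, 1)

for platform, (w, wo) in rates.items():
    print(f"{platform}: +{relative_boost(w, wo)}%")
```

Running this reproduces the +10.0% / +17.0% / +23.1% figures in the table.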
The Consistency Problem
The standard deviations tell an important story:
- ChatGPT (0.124): Relatively consistent. If ChatGPT recommends you for one query, it probably recommends you for similar queries.
- Claude (0.202): Moderately variable. Recommendations shift more between query types.
- Gemini (0.308): Highly variable. The same company might be cited in one query and completely absent in a similar one.
For businesses, this means Gemini visibility is harder to predict and harder to optimize for. You might see great results one month and different results the next, even if nothing changed on your end.
Platform-Specific Traffic Detection
The platforms also differ in how they send traffic, which matters for detection:
| Platform | Total Visits (Our Data) | Unique Visitors | Sends RFC 9421 Signatures |
|---|---|---|---|
| ChatGPT | 8,874 | 3,578 | Yes (100 verified) |
| Perplexity | 522 | 247 | No |
| Claude | 453 | 140 | No |
| Gemini | 2 | 2 | No |
ChatGPT dominates actual traffic delivery. It sends 17x more visits than Perplexity and nearly 20x more than Claude. Gemini sends almost nothing.
ChatGPT is also the only platform that cryptographically signs some requests, using RFC 9421 Ed25519 signatures. This lets Loamly verify that a signed visit genuinely came from ChatGPT. No other platform does this yet.
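To give a feel for what verification involves: RFC 9421 defines a "signature base," a canonical byte string built from the covered request components, which is what the Ed25519 signature actually signs. The sketch below builds that base for a hypothetical request; header names, values, and the key id are illustrative, not ChatGPT's actual signing setup:

```python
# Minimal sketch of an RFC 9421 signature base -- the canonical string
# an Ed25519 signature covers. Component names/values are illustrative.

def signature_base(components: dict, params: str) -> str:
    """Serialize covered components plus the @signature-params line."""
    lines = [f'"{name}": {value}' for name, value in components.items()]
    lines.append(f'"@signature-params": {params}')
    return "\n".join(lines)

covered = {
    "@method": "GET",
    "@authority": "example.com",   # hypothetical target site
    "@path": "/pricing",
}
params = '("@method" "@authority" "@path");created=1700000000;keyid="oai-key"'
base = signature_base(covered, params)
print(base)
```

A real verifier would then check the Ed25519 signature from the `Signature` header against this base using the platform's published public key; that step needs a crypto library such as `cryptography` or `PyNaCl` and is omitted here.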
But 80%+ of ChatGPT traffic still arrives without signatures. The mobile app, web UI clicks, and API integrations strip referrer headers. This traffic becomes "dark AI traffic," invisible to GA4. See 80% of Your AI Traffic Is Invisible for the full analysis.
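In practice this means detection is a tiered decision: signature first, referrer second, everything else unattributed. The classifier below is a simplified sketch, not Loamly's actual detection logic; the referrer patterns are illustrative:

```python
# Hedged sketch: classify a visit as verified, referred, or unattributed
# AI traffic. Domain list and tier logic are illustrative assumptions.

AI_REFERRERS = ("chatgpt.com", "chat.openai.com", "perplexity.ai",
                "claude.ai", "gemini.google.com")

def classify_visit(referrer: str, has_valid_signature: bool) -> str:
    if has_valid_signature:
        return "verified-ai"    # RFC 9421 signature checked out
    if any(domain in referrer for domain in AI_REFERRERS):
        return "referred-ai"    # attributable via the referrer header
    return "unattributed"       # dark AI traffic lands here, with direct

print(classify_visit("https://chatgpt.com/", False))  # referred-ai
print(classify_visit("", False))                      # unattributed
```

The "unattributed" bucket is the problem: once the referrer is stripped, AI-driven visits are indistinguishable from ordinary direct traffic without some other signal.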
What This Means for Your Strategy
You cannot optimize for just one platform. A company that scores well on ChatGPT (70% citation rate) might score poorly on Gemini (40%). If your customers use multiple AI assistants, you need visibility across all of them.
The data suggests a multi-platform approach:
- Build brand authority (helps across all platforms). See Brand Authority Is 26x More Important.
- Get a Wikipedia page if you meet notability guidelines (biggest boost for Claude and Gemini).
- Build Reddit presence (strong signal for ChatGPT's training data).
- Track per-platform visibility with Loamly's check tool to see where you are strong and where you are weak.
Methodology
Data source: 2,014 Loamly brand reports (completed as of Feb 15, 2026).
Citation measurement: Each report runs 48 queries across ChatGPT, Claude, Gemini, and Perplexity. Citation rate = (queries where company was cited) / (total queries for that platform).
Traffic data: 446,405 visits tracked across all Loamly customer websites.
Limitation: Citation rates measure how often a platform mentions a company in response to standard category queries. They do not measure the quality or sentiment of the mention. A company could be cited as a negative example and still count as a citation.
Next Steps
- See your per-platform visibility: loamly.ai/check. The report breaks down ChatGPT, Claude, Gemini, and Perplexity separately.
- Read the benchmark: 2,014 Companies, 85.7% Invisible
- Track your AI traffic by platform: How to Track AI Traffic in 2026
No marketing spin. Just real data about your AI visibility.
Last updated: February 10, 2026
Related Articles
80% of Your AI Traffic Is Invisible. Here's What GA4 Misses.
14,413 dark AI visits vs 6,015 visible. Dark AI converts at 10.21%. GA4 dumps it into 'Direct.' Real detection data.
How to Find Hidden AI Traffic in Google Analytics (4 Methods)
GA4 misses 2.4x more AI traffic than it shows. Four practical methods to estimate dark AI hiding in your Direct bucket.