When the privacy page and the partnership page disagree
Two public pages. Two opposite claims.
Amira Learning’s privacy page makes a clear statement: “Amira Learning never shares student data with public AI platforms like ChatGPT, Claude, Gemini, or any other third-party foundation models.”
Amira's distribution partner, Pearson Canada, goes further: its FAQ states that the AI models used by Amira are "entirely internally developed, private models."
Now go to claude.com/customers/amira.
What Anthropic’s own website says
That’s Anthropic’s own customer story page. It describes, in detail, how Amira uses Claude (Anthropic’s large language model) to generate comprehension dialogues, questions, hints, response pathways, word definitions, and custom rubrics. Amira’s Chief AI Scientist, Ran Liu, is quoted explaining how they “conducted extensive benchmarking across a variety of commercial and open-source models and found that Claude performed exceptionally well.”
These claims cannot both be true.
The architectural nuance
To be fair, there’s a real architectural distinction here. Amira appears to use Claude offline to pre-generate content, which is then reviewed by humans and served to students at runtime through a reinforcement learning system. Students don’t interact with a live LLM during sessions. That’s meaningfully different from ChatGPT-style real-time generation, and it’s a legitimate design choice.
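To make that distinction concrete, here is a minimal sketch of what an offline-generation pipeline like this could look like. Every name in it is hypothetical; Amira has not published its implementation, and this is only an illustration of the pattern: an LLM is called once, offline, its output is human-reviewed, and the runtime serves only the approved content with no live model call.

```python
# Hypothetical sketch of an offline-generation / runtime-serving split.
# None of these names reflect Amira's actual code.

from dataclasses import dataclass


@dataclass
class ContentItem:
    prompt: str
    text: str
    approved: bool = False


def offline_generate(prompts, llm_call):
    """Offline step: an external LLM (e.g. an API call to a model
    like Claude) drafts content once, before any student session."""
    return [ContentItem(prompt=p, text=llm_call(p)) for p in prompts]


def human_review(items, approve):
    """Review step: humans approve or reject each draft; only
    approved items ever reach the runtime content bank."""
    for item in items:
        item.approved = approve(item)
    return [i for i in items if i.approved]


class RuntimeServer:
    """Runtime step: students see only pre-approved content.
    Serving is a dictionary lookup -- no live LLM call."""

    def __init__(self, approved_items):
        self._bank = {i.prompt: i.text for i in approved_items}

    def serve(self, prompt):
        return self._bank.get(prompt, "fallback: no approved content")


# Usage: a stub stands in for the real model call.
drafts = offline_generate(["define 'canopy'"], llm_call=lambda p: f"draft for {p}")
approved = human_review(drafts, approve=lambda item: True)
server = RuntimeServer(approved)
print(server.serve("define 'canopy'"))  # prints "draft for define 'canopy'"
```

Under this design a vendor can truthfully say students never interact with a third-party model at runtime, while a third-party model is still essential to producing the content. The two statements are compatible; "entirely internally developed" is not.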
But the claim that Amira's models are "entirely internally developed" is directly contradicted by Anthropic's own published customer story. And the privacy page's assurance that no student data is shared with "Claude" is confusing at best from a company with a documented partnership with Claude's maker.
Why this matters for procurement
This matters because school districts and state education departments are making procurement decisions based on these representations. When a district asks “does this tool use generative AI?” and the vendor says no, that answer shapes everything from data privacy reviews to parental consent processes. If the real answer is “yes, but offline and reviewed,” then that’s the answer districts deserve to hear.
I spent 20 years in international schools. I’ve sat in procurement meetings. I know how these decisions get made. Administrators don’t read customer story pages on AI company websites. They read the vendor’s own materials, take them at face value, and check the box.
The box should not be this easy to check.