Sakana AI launches free Sakana Chat, but only Japan gets to try it
Sakana AI just launched a free Japanese chatbot with built-in web search, a playful Osaka dialect mode, and models specifically trained to remove Western biases. You probably can’t access it.
Tokyo-based Sakana AI launched Sakana Chat today, a free public chatbot with built-in web search and fast response times. It is geo-restricted to Japan; overseas users are blocked.
- The modes: Standard, polite, or “Osaka” — a Kansai dialect mode that early testers called the highlight, describing it as cheeky and surprisingly natural.
The chatbot runs on a new family of models called Namazu α, built by post-training existing open-weight models including DeepSeek V3.1, Llama 3.1 405B, and GPT-OSS 120B.
- The point of Namazu: Global models carry biases baked in from training data and developer moderation policies. Sakana’s post-training strips those out and replaces them with more neutral, Japan-appropriate values.
- The result in practice: One base model refused 72% of questions about Japan-related sensitive topics like politics and history. The Namazu versions drop that refusal rate to near zero, giving balanced answers instead.
Sakana AI was founded in 2023 by former Google Brain research director David Ha alongside Llion Jones, one of the original Transformer paper authors. The company has raised around $335M in total, most recently at a $2.65B valuation, with investors including NVIDIA, Khosla Ventures, and Alphabet.
- Still early: Alpha release, Japan-only, no app, no voice, not open-sourced yet.
- The irony: A chatbot built specifically for Japanese users is the most interesting thing to come out of Japan's AI scene in a while, and nobody outside Japan can try it.
The Bottom Line: While everyone else races to build the AI that does everything for everyone, Sakana built one specifically for Japan, stripped out the cultural noise other models can’t be bothered to address, and Japanese users are responding to it. Sometimes focus is the product.
If you need on-demand GPUs for training, fine-tuning, inference, or running open-source models, give RunPod a try.
- Available hardware: H100, H200, A100, L40S, RTX 4090, RTX 5090, and 30+ more
- Cost: significantly cheaper than AWS or GCP, billed per second, no contracts
- Setup: spins up in under a minute, 30+ regions worldwide

Get the core business tech news delivered straight to your inbox. We track AI, automation, SaaS, and cybersecurity so you don't have to.
Just read what you want, and be done with it.
