A new open-source LLM making big waves in coding and development

https://x.com/MiniMax__AI/status/1982674798649160175

Available for free on OpenRouter: MiniMax M2 (free).
I use it with Kilo on VS Code.
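If you want to poke at it outside the editor, the OpenRouter endpoint is OpenAI-compatible, so something like the sketch below works. The model slug `minimax/minimax-m2:free` and the env var name are my assumptions here; check the OpenRouter model page for the exact identifier.

```python
# Minimal sketch of calling MiniMax M2 through OpenRouter's OpenAI-compatible API.
# Assumptions: OPENROUTER_API_KEY is set, and "minimax/minimax-m2:free" is the slug
# shown on the model page (double-check it there).
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="minimax/minimax-m2:free",  # assumed slug for the free tier
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
)
print(response.choices[0].message.content)
```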

Did you try it? I'm curious to know if it's really good.
I've already used many LLMs (Claude Opus, Codex, Gemini), all of them via Cursor, the terminal, and VS Code. For me the best is still the standard VS Code Copilot (using the GPT-5 model).

MiniMax on OpenRouter is really fast. It's maybe slower than others to the first token, but once the first token arrives it looks like the fastest I've used so far (rough way to time it below).
For day-to-day work I use DeepSeek R1 Chimera for both code and reasoning/planning/architecture, because it seems generally better with Kilo and more straight to the point.
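To put numbers on the first-token vs. throughput difference, this is roughly how I'd time it with streaming. Same caveat as the earlier snippet: the model slug and env var are assumptions, not the official names.

```python
# Rough sketch: compare time-to-first-token vs. streaming throughput on OpenRouter.
# Assumptions: OPENROUTER_API_KEY is set and "minimax/minimax-m2:free" is the slug
# listed on the model page (check there for the exact identifier).
import os
import time

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

start = time.time()
first_token_at = None
n_chunks = 0

stream = client.chat.completions.create(
    model="minimax/minimax-m2:free",  # assumed slug for the free tier
    messages=[{"role": "user", "content": "Explain quicksort in three sentences."}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries a small delta of the generated text.
    if chunk.choices and chunk.choices[0].delta.content:
        if first_token_at is None:
            first_token_at = time.time()
        n_chunks += 1

total = time.time() - start
if first_token_at is not None:
    ttft = first_token_at - start
    print(f"time to first token: {ttft:.2f}s")
    print(f"{n_chunks} chunks in {total - ttft:.2f}s after the first token")
```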

For a free (for now) service I think it is great. I still prototype a lot and consume millions of tokens daily with these massive prompts in Kilo. I like free LLMs :).