Marul V7 launches Turkish-first LLM
REDDIT // 4h ago · MODEL RELEASE


Marul V7 is a 258M-parameter Turkish language model trained from scratch, with a custom tokenizer, a custom architecture, Turkish pretraining data, and instruction tuning. It is available through Marul AI’s web app and Android app, but the creator frames it as an experimental local-language model rather than a GPT-class assistant.

// ANALYSIS

Marul V7 is small, rough, and probably not production-ready, but that is also why it is interesting: it is a from-scratch Turkish LLM effort instead of another wrapper around a global model.

  • The custom tokenizer and architecture claim matters more than the parameter count, because Turkish morphology often exposes weaknesses in generic multilingual models.
  • At 258M parameters, expectations should stay modest: basic chat, summarization, markdown, and simple coding are plausible; strong reasoning and deep knowledge are not.
  • The launch is more useful as a community feedback loop than a benchmark event, especially since public evals, weights, datasets, and training details are not yet provided.
  • The web app also lists Llama 3 70B alongside Marul V7, so users should be careful about which model they are actually testing.
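Why Turkish morphology stresses generic tokenizers: Turkish is agglutinative, so a single word can stack several suffixes, and a vocabulary built mostly from English-like text tends to shred such words into many low-information fragments. A toy sketch (this is not Marul's actual tokenizer; both vocabularies below are invented for illustration) shows the effect with a greedy longest-match segmenter:

```python
def tokenize(word, vocab):
    """Greedy longest-match segmentation; characters not covered by the
    vocabulary fall back to single-character tokens."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab or j == i + 1:
                tokens.append(piece)
                i = j
                break
    return tokens

# Hypothetical generic vocabulary with little Turkish coverage.
generic_vocab = {"ev", "le", "ri", "niz", "den"}
# Hypothetical Turkish-aware vocabulary containing common morphemes.
turkish_vocab = {"ev", "ler", "iniz", "den", "evler"}

# "evlerinizden" = "from your houses" (morphemes: ev+ler+iniz+den)
word = "evlerinizden"
print(tokenize(word, generic_vocab))  # → ['ev', 'le', 'ri', 'niz', 'den']
print(tokenize(word, turkish_vocab))  # → ['evler', 'iniz', 'den']
```

Fewer, morpheme-aligned tokens mean more usable context per word and less work for a small model, which is why a custom tokenizer can matter more than parameter count at this scale.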
// TAGS
marul-v7 · marul-ai · llm · chatbot · fine-tuning · inference

DISCOVERED

4h ago

2026-04-22

PUBLISHED

6h ago

2026-04-22

RELEVANCE

7/10

AUTHOR

Marul_AI