Shield 82M drops for multilingual PII filtering
OPEN_SOURCE
REDDIT · 4h ago · MODEL RELEASE


LH-Tech-AI has released Shield 82M, an open-source, 82-million-parameter fine-tune of distilroberta-base designed to strip PII from text across multiple languages at 96% accuracy. The model handles names, emails, phone numbers, and addresses out of the box.

// ANALYSIS

A fast, dedicated 82M-parameter model for PII redaction is highly practical for preprocessing sensitive data before it reaches larger LLMs.

  • Small parameter count makes it incredibly fast and cheap to run locally or at the edge
  • Multilingual support solves a major headache for global applications dealing with mixed-language user input
  • Fine-tuning distilroberta-base provides a solid foundation for token-classification tasks like NER-based PII extraction
  • Open-source availability enables teams to self-host and ensure data never leaves their infrastructure
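The self-hosted redaction flow described above can be sketched. Assuming Shield 82M is served through the standard Hugging Face token-classification pipeline (an assumption — the release's exact interface isn't shown here), redaction reduces to replacing each predicted entity span with a placeholder:

```python
def redact(text, entities):
    """Replace detected PII spans with [LABEL] placeholders.

    `entities` follows the Hugging Face token-classification pipeline
    output format: dicts with `start`, `end`, and `entity_group` keys.
    Spans are replaced right-to-left so earlier offsets stay valid.
    """
    for ent in sorted(entities, key=lambda e: e["start"], reverse=True):
        text = text[:ent["start"]] + f"[{ent['entity_group']}]" + text[ent["end"]:]
    return text

# In practice the entities would come from the model, e.g.:
#   from transformers import pipeline
#   ner = pipeline("token-classification", model="...",  # model ID not given here
#                  aggregation_strategy="simple")
#   entities = ner(text)
# The spans below are mocked to keep the sketch self-contained.
text = "Contact Jane Doe at jane@example.com"
entities = [
    {"start": 8, "end": 16, "entity_group": "NAME"},
    {"start": 20, "end": 36, "entity_group": "EMAIL"},
]
print(redact(text, entities))  # Contact [NAME] at [EMAIL]
```

At 82M parameters this step is cheap enough to run inline on every request before text leaves the local infrastructure.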
// TAGS
shield-82m · llm · fine-tuning · open-weights · open-source · data-tools

DISCOVERED

4h ago

2026-04-25

PUBLISHED

6h ago

2026-04-25

RELEVANCE

8/10

AUTHOR

LH-Tech_AI