Whisper benchmarks AVX-512 on i5-1135G7
OPEN_SOURCE
REDDIT // BENCHMARK RESULT

A Reddit user asks whether Whisper gets a meaningful CPU-speed boost on Intel’s i5-1135G7, which officially supports AVX-512. The real question is how much of Whisper’s transcription time comes from SIMD support versus model choice, quantization, and laptop thermals.
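Before benchmarking, it is worth confirming that the kernel actually exposes AVX-512 on the machine in question, since firmware or microcode configurations can disable it. A minimal Linux-only sketch that parses a `/proc/cpuinfo` dump; the helper name is mine, not from the thread:

```python
def avx512_flags(cpuinfo_text: str) -> list[str]:
    """Return the avx512* feature flags listed in a /proc/cpuinfo dump."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line is space-separated; collect the AVX-512 subset.
            return sorted({f for f in line.split() if f.startswith("avx512")})
    return []

# On a live Linux machine:
#   flags = avx512_flags(open("/proc/cpuinfo").read())
sample = "flags\t\t: fpu sse2 avx avx2 avx512f avx512vl avx512bw"
print(avx512_flags(sample))  # ['avx512bw', 'avx512f', 'avx512vl']
```

An empty list on the i5-1135G7 would mean the kernel or firmware has masked AVX-512, and no userspace tuning will bring it back.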

// ANALYSIS

Hot take: AVX-512 is worth testing here, but it probably won’t be a magic switch. For Whisper workloads, runtime tuning and model settings usually move the needle first, with vector width helping after that. Intel lists AVX-512 support for the i5-1135G7, so this is a real optimization target, not a theoretical one.

faster-whisper runs Whisper through CTranslate2, while whisper.cpp reports CPU feature flags and leans on quantized models, so both can benefit from optimized CPU paths. The poster’s baseline of 9 minutes for 60 minutes of audio on an i7-2600 suggests CPU-only Whisper is already practical; the newer chip should matter most if it improves both latency and efficiency.

On a thin-and-light laptop, sustained transcription may be limited by thermals and power limits as much as by AVX-512 width.
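One way to make any comparison concrete is real-time factor (RTF): processing time divided by audio duration. A small sketch using only the figure quoted in the thread (9 minutes of processing for 60 minutes of audio); the function name is mine:

```python
def real_time_factor(processing_seconds: float, audio_seconds: float) -> float:
    """RTF = processing time / audio duration; below 1.0 is faster than real time."""
    return processing_seconds / audio_seconds

# Poster's baseline: 60 minutes of audio transcribed in 9 minutes on an i7-2600.
baseline = real_time_factor(9 * 60, 60 * 60)
print(f"baseline RTF = {baseline:.2f}")  # 540 / 3600 = 0.15
```

An AVX-512 run on the i5-1135G7 can then be scored on the same audio file: a lower RTF at similar power draw, sustained past the laptop's boost window, is the win the poster is actually after.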

// TAGS
whisper · faster-whisper · whisper-cpp · speech · benchmark · open-source

DISCOVERED

2026-03-18

PUBLISHED

2026-03-18

RELEVANCE

7/10

AUTHOR

bidutree