foundational · Machine Learning

Transfer Learning

Taking a model trained on one task and adapting it for another, leveraging learned representations to reduce training cost and data needs.

Transfer learning starts from a model already trained on a large source task — ImageNet for vision, web-scale text for language, millions of audio clips for speech — and adapts it to your specific downstream task with a fraction of the data and compute a from-scratch approach would demand. The pretrained weights carry **general features** (edges and textures for vision, syntactic and semantic patterns for language) that the downstream task can reuse rather than relearn.
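The reuse of pretrained features can be shown in a minimal sketch. Here a fixed random projection stands in for a frozen backbone (in a real system it would be a ResNet, BERT, etc.), and only a small linear head is trained on a synthetic task; all names and sizes are illustrative, not from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained backbone: a fixed projection we never
# update. In a real system this would be a ResNet, BERT, etc.
BACKBONE = rng.normal(size=(16, 8))     # frozen weights: 16-dim input -> 8-dim features

def features(x):
    return np.tanh(x @ BACKBONE)        # frozen forward pass; never trained here

# Synthetic downstream task whose signal lives in the pretrained feature space.
X = rng.normal(size=(200, 16))
F = features(X)
true_w = rng.normal(size=8)
y = (F @ true_w > 0).astype(float)

# Linear probe: train only a new head (w, b) with logistic-loss gradient descent.
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))
    w -= 0.5 * F.T @ (p - y) / len(y)
    b -= 0.5 * (p - y).mean()

acc = (((F @ w + b) > 0) == (y == 1.0)).mean()
print(f"linear-probe accuracy: {acc:.2f}")
```

Because the backbone stays frozen, only 9 parameters (8 weights plus a bias) are trained, which is the whole appeal: the representation was paid for once, upstream.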

The practical menu runs from lightest to heaviest touch: **linear probing** (freeze the backbone, train a new classifier head — cheapest, works when the source and target are similar), **fine-tuning** (update all weights with a small learning rate — more expensive but more flexible), **parameter-efficient fine-tuning** (LoRA, QLoRA, adapters — the modern default for LLMs; train a tiny fraction of parameters for quality close to full fine-tuning), and **pretraining from scratch on your domain** (rarely justified; the economics almost never work below hyperscaler budgets).
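The parameter savings behind LoRA follow from a low-rank factorization: instead of updating a frozen weight matrix W, train two small factors A and B and add their scaled product to the frozen path. A minimal numpy sketch with illustrative sizes (real implementations wire this inside the model's layers):

```python
import numpy as np

d, r, alpha = 1024, 8, 16.0                  # layer width and LoRA rank (illustrative)
rng = np.random.default_rng(0)

W = rng.normal(size=(d, d))                  # frozen pretrained weight, never updated
A = rng.normal(scale=0.01, size=(r, d))      # trainable down-projection
B = np.zeros((d, r))                         # trainable up-projection; zero at init,
                                             # so the adapted layer starts identical to W

def lora_forward(x):
    # Frozen path plus scaled low-rank update; only A and B would get gradients.
    return W @ x + (alpha / r) * (B @ (A @ x))

full_params = W.size
lora_params = A.size + B.size
print(f"full fine-tune: {full_params:,} params")
print(f"LoRA (r={r}):   {lora_params:,} params "
      f"({100 * lora_params / full_params:.2f}% of full)")
```

At this size the trainable fraction is about 1.6% of the full matrix, and the ratio only improves as layers get wider — which is why LoRA is the default for adapting billion-parameter models on mid-market budgets.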

For APAC mid-market teams, transfer learning is usually the correct answer to "should we train a model?" For 95% of real-world tasks, the question is not whether to pretrain from scratch — the answer is no — but which pretrained model to start from and how to adapt it. The defaults: a multilingual sentence-transformer for semantic search, a multilingual LLM (Llama 3, Qwen, Gemma) plus LoRA for domain-specific generation, a CLIP variant for image-text retrieval, a Whisper variant for speech.

The unsung cost of transfer learning is **evaluation drift**. A model that transferred well to your task last quarter can underperform after the foundation-model vendor ships a new base; re-running the adaptation is cheap, but noticing you need to is not. Wire an eval harness the day you ship.
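The "wire an eval harness" advice reduces to something very small: pin a fixed eval set at ship time, store the baseline score, and re-score every new model or base against it. A toy sketch — `score_model`, the lambdas, and the 0.02 tolerance are all hypothetical stand-ins for your task's metric and thresholds:

```python
# Minimal regression-check sketch; scores and models are illustrative.
def score_model(model, eval_set):
    """Fraction of eval examples the model answers correctly."""
    return sum(model(x) == y for x, y in eval_set) / len(eval_set)

def check_for_drift(model, eval_set, baseline, tolerance=0.02):
    """Re-run the fixed eval set and flag a drop beyond tolerance."""
    current = score_model(model, eval_set)
    regressed = current < baseline - tolerance
    return current, regressed

# Toy eval set and two "model versions" (hypothetical stand-ins).
eval_set = [(x, x % 3 == 0) for x in range(100)]
old_model = lambda x: x % 3 == 0                        # matches the labels
new_model = lambda x: x % 3 == 0 if x < 80 else True    # degrades on one slice

baseline = score_model(old_model, eval_set)   # stored the day you ship
score, regressed = check_for_drift(new_model, eval_set, baseline)
print(f"baseline={baseline:.2f} current={score:.2f} regressed={regressed}")
```

The point is the slice-level failure: the new model still looks plausible in casual use, and only the pinned eval set makes the regression visible before users do.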

Where AIMenta applies this

Service lines where this concept becomes a deliverable for clients.

Beyond this term

Where this concept ships in practice.

Encyclopedia entries name the moving parts. The links below show where AIMenta turns these concepts into engagements — across service pillars, industry verticals, and Asian markets.
