BSNL Powers Up with AI, Developing Indigenous Small Language Models

State-run telecom giant builds in-house AI models to enhance 5G services and customer experience

BSNL’s Chairman and Managing Director, Robert J. Ravi, has confirmed that the state-run telecom operator is developing in-house small language models (SLMs) to strengthen its AI capabilities. The move is part of BSNL’s broader strategy to leverage artificial intelligence for network management, customer experience, and churn reduction.

Key Highlights

  • AI-driven transformation: BSNL is banking on AI to optimize telecom solutions, improve service delivery, and enhance customer satisfaction.
  • In-house SLMs: Instead of relying solely on external large language models, BSNL is building its own smaller, domain-specific models tailored for telecom operations.
  • Strategic focus areas:
    • Network management – predictive maintenance, traffic optimization.
    • Customer experience – personalized support, faster resolution.
    • Churn reduction – analyzing usage patterns and proactively addressing customer concerns.
  • Leadership vision: Ravi emphasized that AI workloads will dominate the telecom sector in the 5G era, making indigenous AI development critical for BSNL’s competitiveness.
This signals BSNL’s intent to position itself not just as a telecom provider but as a tech-driven enterprise, aligning with India’s push for indigenous AI stacks and digital sovereignty.

Here’s a detailed breakdown of how BSNL’s Small Language Models (SLMs) differ from traditional Large Language Models (LLMs), and why this distinction matters in telecom:

Key Differences Between SLMs and LLMs

| Aspect | Large Language Models (LLMs) | Small Language Models (SLMs) | Why BSNL Prefers SLMs |
|---|---|---|---|
| Scale | Billions of parameters, massive datasets | Fewer parameters, domain-specific training | Easier to train and deploy for telecom-specific tasks |
| Resource needs | Require huge compute power, GPUs, cloud infrastructure | Lightweight, can run on local servers or edge devices | Cost-effective for a state-run telco with limited budgets |
| Speed & efficiency | Slower inference, high latency | Faster response times, optimized for real-time tasks | Critical for customer support and network management |
| Generalization | Broad knowledge across domains | Narrow focus on telecom operations | Better accuracy for BSNL's internal use cases |
| Deployment | Cloud-heavy, centralized | Edge-friendly, can be embedded in telecom infrastructure | Supports 5G rollout and localized AI workloads |
| Cost | Expensive to train and maintain | Lower training and operational costs | Aligns with BSNL's need for affordable innovation |
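To give a rough sense of the resource gap in the table, the weight memory of a model can be estimated from its parameter count and precision. The parameter counts below are generic illustrative assumptions, not figures BSNL has published:

```python
# Rough memory-footprint comparison of a generic large LLM versus a
# small, domain-specific SLM. Parameter counts are illustrative
# assumptions, not BSNL figures.

BYTES_PER_PARAM_FP16 = 2  # 16-bit (half-precision) weights


def model_memory_gib(num_params: int) -> float:
    """Approximate weight memory in GiB at fp16 precision."""
    return num_params * BYTES_PER_PARAM_FP16 / (1024 ** 3)


llm_params = 70_000_000_000  # e.g. a 70B-parameter general-purpose LLM
slm_params = 1_000_000_000   # e.g. a 1B-parameter telecom-specific SLM

print(f"LLM weights: ~{model_memory_gib(llm_params):.0f} GiB")  # ~130 GiB
print(f"SLM weights: ~{model_memory_gib(slm_params):.1f} GiB")  # ~1.9 GiB
```

A model needing ~130 GiB of weights demands multi-GPU cloud hardware, while a ~2 GiB model can run on a single commodity server or edge device, which is the practical basis of the "Resource needs" and "Deployment" rows above.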

Why This Matters for BSNL

  • Telecom-specific optimization: SLMs can be fine-tuned for tasks like predictive maintenance, call routing, and churn analysis.
  • Digital sovereignty: Developing in-house models reduces dependence on foreign AI providers.
  • Scalability in 5G era: Lightweight models can be deployed across BSNL’s vast network infrastructure without overwhelming resources.
  • Customer experience: Faster, domain-specific AI responses improve support and reduce churn.
In short, BSNL’s SLMs are leaner, cheaper, and more focused, making them a strategic fit for telecom operations, whereas LLMs are powerful but resource-heavy and overly broad for BSNL’s needs.
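To make the churn-analysis use case concrete, here is a minimal, hypothetical sketch of flagging at-risk subscribers from a usage decline. All field names and thresholds are invented for illustration; BSNL has not published how its models score churn:

```python
# Hypothetical churn-risk scoring from monthly usage patterns.
# The signal here (recent usage drop vs. prior average) is one simple
# proxy a telecom model might use; thresholds are illustrative only.

def churn_risk(monthly_data_gb: list[float]) -> float:
    """Return a 0..1 score: fraction by which the latest month's
    usage fell relative to the average of earlier months."""
    if len(monthly_data_gb) < 2:
        return 0.0
    baseline = sum(monthly_data_gb[:-1]) / (len(monthly_data_gb) - 1)
    if baseline == 0:
        return 0.0
    drop = (baseline - monthly_data_gb[-1]) / baseline
    return max(0.0, min(1.0, drop))


# A subscriber whose usage collapsed from ~10 GB/month to 2 GB
print(churn_risk([10.0, 9.0, 11.0, 2.0]))  # → 0.8 (high risk)
```

In practice an SLM would combine many such signals (call drops, recharge delays, complaint text) rather than a single rule, but the goal is the same: surface at-risk subscribers early enough to intervene.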

IndianWeb2.com © all rights reserved