
      Domain Adaptation of Large Language Models for the Telco Industry

Researcher, Orange
Discover a unique dataset designed for Large Language Model fine-tuning in the Telco domain, built from a wide variety of data sources. Learn an efficient way to adapt a 7B large language model (LLM) to a technical domain using at most three NVIDIA A100 40GB GPUs, through a performance comparison of different Domain Adaptation Pre-Training and Instruct Tuning methods for domain adaptation. LLMs must be adapted to specific domains, such as telco, to capture the intricate lexical, semantic, and concept-specific nuances. This adaptation enables them to proficiently handle technical documents, network modeling, and other critical use cases demanded by telco operators like Orange. Yet, as LLMs grow in complexity, their adaptation becomes increasingly costly. This presentation will examine the efficacy of various parameter-efficient fine-tuning methods for telco domain adaptation. We'll evaluate the best domain-adaptive pre-training techniques for tailoring a foundational model to a particular domain, using both intrinsic metrics like perplexity and domain-specific task evaluations.
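The intrinsic metric mentioned above, perplexity, is the exponential of the mean negative log-likelihood the model assigns to each token of a held-out domain corpus; lower values mean the adapted model finds domain text less "surprising". A minimal sketch of the computation (the per-token log-probabilities here are hypothetical values, not from any actual model):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the mean negative log-likelihood per token."""
    nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(nll)

# Hypothetical per-token log-probabilities a language model might assign
# to a three-token sequence (probabilities 0.5, 0.25, 0.125).
logprobs = [math.log(0.5), math.log(0.25), math.log(0.125)]
print(perplexity(logprobs))  # → 4.0 (geometric mean of 1/p over the tokens)
```

In practice these log-probabilities would come from the fine-tuned model's output distribution over a domain-specific evaluation set, and perplexity before vs. after domain-adaptive pre-training gives a quick measure of how much domain knowledge was absorbed.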
Event: GTC 24
Date: March 2024
NVIDIA technology: Cloud / Data Center GPU, RTX GPU
Level: Intermediate Technical
Topic: Large Language Models (LLMs)
Industry: Telecommunications
Language: English
Location: