OpenAI’s first open source language model since GPT-2

  • CyberSeeker@discuss.tchncs.de · 25 days ago

    Yes, but 20 billion parameters is too large for most GPUs, regardless of quantization. You would need at least 14 GB of VRAM, and even that is unlikely to be enough without offloading major parts of the model to the CPU and system RAM (which kills the token rate).
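
    For a rough sense of where a figure like that comes from, here's a back-of-the-envelope sketch (my own assumptions, not from the post: 4-bit quantized weights plus roughly 20% overhead for KV cache, activations, and runtime buffers; the real number depends on the quantization format and context length):

    ```python
    # Rough VRAM estimate for holding a model's weights, in gigabytes.
    # Assumptions: bits_per_weight covers the quantized weights only;
    # `overhead` approximates KV cache, activations, and runtime buffers.
    def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                         overhead: float = 0.20) -> float:
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * (1 + overhead) / 1e9

    for bits in (4, 8, 16):
        print(f"{bits}-bit: ~{estimate_vram_gb(20, bits):.1f} GB")
    # 4-bit: ~12.0 GB, 8-bit: ~24.0 GB, 16-bit: ~48.0 GB
    ```

    So even at 4-bit, a 20B model sits above what a typical 8-12 GB consumer GPU can hold without spilling into system RAM.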