VentureBeat 2023/8/4 01:29:51(JST)
Silicon Valley is abuzz with speculation about which companies are gaining access to Nvidia’s highly sought-after and expensive H100 GPU for large language model (LLM) training. Demand for H100s is expected to persist until at least the end of 2024, with estimates suggesting that OpenAI may want 50,000 units, while other companies such as Inflection, Meta, and the big cloud providers may each require tens of thousands. The GPU shortage is being compared to the battle for power in “Game of Thrones.” Efforts to optimize machine learning models and increase chip supply may prove more effective for AI inference than for LLM training.
(※This article was automatically summarized by AI. For accurate information, please refer to the source article.)
(※The image was automatically generated by AI and is unrelated to the source article.)
Source article: Nvidia GPU shortage is ‘top gossip’ of Silicon Valley.