1430271454931787781

August 24, 2021
"We have 1,000 times larger models requiring more than 1,000 times more compute, and that has happened in 2 years"

For comparison: GPT-3 has 175 billion parameters.

@CerebrasSystems CS-2 brain-scale chip can power #AI models with 120 trillion parameters https://t.co/DYKRuMBJv5 https://t.co/quKZhxPvm3
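A quick back-of-the-envelope check of the scale claim, using only the figures quoted above (the snippet is an illustrative calculation, not from the tweet or the linked article):

# Rough scale comparison: Cerebras CS-2 claimed capacity vs. GPT-3 size.
gpt3_params = 175e9        # GPT-3: 175 billion parameters
cs2_capacity = 120e12      # CS-2 claim: models with 120 trillion parameters

print(f"CS-2 capacity / GPT-3 size ≈ {cs2_capacity / gpt3_params:.0f}x")
# Prints roughly 686x, i.e. on the order of the "1,000 times larger" figure quoted above.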