qupada , 5 months ago The estimated training time for GPT-4 is 90 days though. Assuming you could scale that linearly with the amount of hardware, you'd get it down to about 3.5 days. From four times a year to twice a week. If you're scrambling to get ahead of the competition, being able to iterate that quickly could very much be worth the money.
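The arithmetic behind that estimate can be sketched in a few lines. This is just a back-of-the-envelope check, assuming perfectly linear scaling with hardware; the function name is made up for illustration:

```python
# Assumes training time scales perfectly linearly with the amount of hardware
# (a simplification -- real distributed training has communication overheads).
def scaled_training_days(baseline_days: float, hardware_multiplier: float) -> float:
    """Training time after multiplying the hardware by hardware_multiplier."""
    return baseline_days / hardware_multiplier

# Going from 90 days down to ~3.5 days implies roughly 90 / 3.5 ~ 26x the hardware.
implied_multiplier = 90 / 3.5
print(round(implied_multiplier, 1))                      # ~25.7x
print(scaled_training_days(90, implied_multiplier))      # 3.5 days
```

In other words, the "90 days to 3.5 days" claim implicitly assumes about 26 times the compute, with no scaling losses.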