OpenLLaMA 7B Training Completed to 1T Tokens

fancyfredbot on June 7, 2023:
This is great. Based on the throughput of 2,200 tokens/sec and the 1,000,000,000,000 tokens used for training, this took at least $183k worth of compute (based on the three-year committed use rate). And now we can have it for free!
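A quick back-of-envelope sketch of that arithmetic. Two assumptions are mine, not stated above: that the 2,200 tokens/sec is per-chip throughput, and that the committed-use rate is about $1.45 per chip-hour; together they happen to reproduce the $183k figure.

    # Back-of-envelope check of the compute-cost estimate.
    # Assumptions (not stated in the comment): 2,200 tokens/sec is
    # per-chip throughput, and the three-year committed-use rate is
    # roughly $1.45 per chip-hour.
    tokens = 1_000_000_000_000        # 1T training tokens
    tokens_per_sec = 2200             # reported throughput
    rate_per_chip_hour = 1.45         # assumed committed-use price, USD

    chip_hours = tokens / tokens_per_sec / 3600
    cost = chip_hours * rate_per_chip_hour
    print(f"{chip_hours:,.0f} chip-hours -> ${cost:,.0f}")
    # -> 126,263 chip-hours -> $183,081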
thawab on June 7, 2023:
The price MosaicML quotes for training their 7B[0] and the cost of Falcon 7B are roughly the same.
[0] https://twitter.com/MosaicML/status/1660738892306485248