- cross-posted to:
- hackernews@lemmy.smeargle.fans
They’re fine-tuning existing models. Still useful, but the headline implies training foundation models from scratch, which remains the domain of big iron.
And yet, I still won’t
“Can”
The fuck does 70B mean? 70 bytes?
70 billion parameters. It’s standard notation in the context of LLMs.
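As a rough back-of-the-envelope sketch (my numbers, not from the thread): the parameter count translates directly into a memory footprint, which is why 70B models are out of reach for most consumer hardware.

```python
params = 70_000_000_000  # "70B" = 70 billion parameters
bytes_per_param = 2      # fp16/bf16, a common inference precision
gb = params * bytes_per_param / 1e9
print(f"~{gb:.0f} GB")   # memory needed just for the weights, before activations/KV cache
```

At fp16 that is roughly 140 GB for the weights alone; quantizing to 4 bits brings it near 35 GB, which is how people squeeze these models onto smaller GPUs.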