Seems legit — posted by The Picard Maneuver@piefed.world to Lemmy Shitpost@lemmy.world · 12 days ago
Uriel238 [all pronouns]@lemmy.blahaj.zone · 12 days ago
Offline LLMs exist, but they tend to need a few terabytes of base data just to get started (i.e. before LoRAs).

nomorebillboards@lemmy.world · 12 days ago
I thought it was more like 10–20 GB to start out with a usable (but somewhat stupid) model. Are you confusing the size of the training dataset with the size of the model?
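For context on the size question: a model's on-disk footprint is roughly its parameter count times the bytes stored per weight, which is why quantized 7B–13B models land in the single-digit to low-tens-of-GB range even though their training datasets span terabytes. Here is a minimal sketch of that arithmetic — the parameter counts and quantization levels are illustrative assumptions, not specs of any particular release:

```python
# Rough on-disk size estimate: parameters * bytes per weight.
# Parameter counts and quantization levels below are illustrative
# assumptions, not official figures for any specific model release.

QUANT_BYTES = {
    "fp16": 2.0,  # 16-bit floats
    "q8": 1.0,    # 8-bit quantization
    "q4": 0.5,    # 4-bit quantization
}

def model_size_gb(params_billions: float, quant: str) -> float:
    """Approximate model file size in decimal gigabytes."""
    total_bytes = params_billions * 1e9 * QUANT_BYTES[quant]
    return total_bytes / 1e9

for params in (7, 13, 70):
    for quant in ("fp16", "q8", "q4"):
        print(f"{params}B @ {quant}: ~{model_size_gb(params, quant):.1f} GB")
```

By this estimate a 7B model at 4-bit quantization is only a few GB, and a 13B model at 8-bit sits around 13 GB, consistent with the 10–20 GB figure for a usable local model; the multi-terabyte number describes training data, not the downloadable weights.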