SocialistVibes01@lemmy.ml to Linux@lemmy.ml · English · edited 2 days ago

Which specs are as low as reasonably possible for local LLM models? Do you recommend some distro in particular?
wraekscadu@vargar.org · 2 days ago

Depends on how big your local LLM model is. You can easily run 2–3B param models on a potato. But the model will be shit.
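A rough way to sanity-check whether a model fits on low-spec hardware is to estimate its weight footprint from parameter count and quantization level. This is a back-of-envelope sketch, not a benchmark; the ~20% runtime overhead factor is an illustrative assumption, and real usage (KV cache, context length) varies:

```python
def model_mem_gb(params_billions: float, bits_per_weight: float,
                 overhead: float = 1.2) -> float:
    """Estimate RAM needed to load model weights.

    overhead=1.2 is an assumed ~20% allowance for runtime
    structures (KV cache, activations); actual usage varies.
    """
    weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 2**30

# A 3B-param model quantized to 4 bits needs roughly 1.7 GB,
# while the same model at fp16 needs roughly 6.7 GB.
print(f"3B @ 4-bit:  {model_mem_gb(3, 4):.1f} GB")
print(f"3B @ fp16:  {model_mem_gb(3, 16):.1f} GB")
```

This is why a 2–3B model at 4-bit quantization can run CPU-only on a machine with 4 GB of RAM, whereas unquantized weights quickly outgrow a "potato".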