A Child Could Run A Private Local LLM


The process is less complicated than using Zoom:

  1. Install LM Studio (https://lmstudio.ai) or a similar app
  2. Select a model and download it (once it is loaded, you can chat in the app or query it from code, as sketched below)
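
LM Studio can also expose the loaded model over a local, OpenAI-style HTTP server. Here is a minimal sketch of querying it from Python; it assumes the server is running on the default http://localhost:1234 and that a model is already loaded in the app, so adjust the URL and settings if yours differ.

```python
# Minimal sketch: ask the locally loaded model a question through
# LM Studio's local server (OpenAI-compatible HTTP API).
# Assumption: server running at the default http://localhost:1234.
import json
import urllib.request

payload = {
    "model": "local-model",  # LM Studio answers with whichever model is loaded
    "messages": [
        {"role": "user", "content": "Explain what a local LLM is in one sentence."}
    ],
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

print(reply["choices"][0]["message"]["content"])
```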

The real bottleneck is hardware. I could only run models up to 13B parameters, and the results were slower and worse than GPT-3.5.
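
The 13B ceiling comes down to memory. A rough back-of-the-envelope estimate (the exact overhead varies with the quantization format and context length, so treat the 20% figure below as a guess):

```python
# Rough memory estimate for a quantized local model:
# bytes ~= parameters x (bits per weight / 8), plus overhead for the
# KV cache and runtime buffers (a flat ~20% guess here).
def estimate_gb(params_billions, bits_per_weight, overhead=0.20):
    weights_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb * (1 + overhead)

for params in (7, 13, 70):
    print(f"{params}B @ 4-bit ~ {estimate_gb(params, 4):.1f} GB")
# A 13B model at 4-bit lands around 8 GB, which is about the limit of a
# typical consumer GPU or a modest amount of system RAM.
```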

The future of private AI depends on your ability to buy expensive hardware from Nvidia and other Big Tech hardware manufacturers. Without that, individuals will either have to give up their privacy to closed-source, censored models in the cloud or fall behind in productivity.

Even fine-tuning has become far easier than it used to be. I would not say a child could pick up the process, but if you have some experience working in the terminal and basic familiarity with AI, there are solutions already set up by experts.
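
One common route is LoRA fine-tuning with Hugging Face transformers and peft. The sketch below is only an illustration under assumptions I am adding here: the libraries are installed (transformers, peft, datasets), the base model name and the `my_notes.txt` training file are placeholders, and you have enough VRAM to load the base model.

```python
# Minimal LoRA fine-tuning sketch using Hugging Face transformers + peft.
# Placeholders: base model id and my_notes.txt are just examples.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example small base model
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA: train small adapter matrices instead of all the base weights.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM"))

data = load_dataset("text", data_files={"train": "my_notes.txt"})["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512),
                remove_columns=["text"])

Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-out", num_train_epochs=1,
                           per_device_train_batch_size=1, logging_steps=10),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
model.save_pretrained("lora-out")  # saves just the small adapter weights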
