The process is less complicated than using Zoom:
- Install LM Studio (https://lmstudio.ai) or a similar app
- Select a model and download it
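If you want to go a step beyond the built-in chat window, LM Studio can also serve the downloaded model over an OpenAI-compatible local API. Here is a minimal sketch; the port, API key string, and model name are assumptions, so check your own local server settings:

```python
# Minimal sketch: querying a model served locally by LM Studio through its
# OpenAI-compatible endpoint (typically http://localhost:1234/v1).
# The model name and key below are placeholders, not fixed values.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # use whatever model you have loaded in LM Studio
    messages=[{"role": "user", "content": "Why does local AI matter for privacy?"}],
)
print(response.choices[0].message.content)
```

Everything runs on your own machine, so nothing you type leaves your computer.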
The real bottleneck is hardware. I could only run models up to 13B parameters, and the results were worse than GPT-3.5 and noticeably slower.
The future of private AI depends on your ability to buy expensive hardware from Nvidia and other Big Tech hardware manufacturers. Without that, individuals will have to give up their privacy to closed-source, censored models in the cloud, or fall behind in productivity.
Even the process of fine-tuning AI has become far easier than it used to be. I would not say a child could pick it up, but if you have some experience working in the terminal and basic familiarity with AI, there are solutions already set up by experts.
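Much of that expert setup now comes down to a few lines with libraries like Hugging Face's peft. Here is a minimal LoRA sketch; the base model and target modules are illustrative assumptions, not a fixed recipe:

```python
# Minimal LoRA fine-tuning setup sketch using Hugging Face transformers + peft.
# The base model name and target_modules are illustrative assumptions;
# adjust them for whatever model you are actually tuning.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # small model chosen only as an example
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA trains a small set of adapter weights instead of the full model,
# which is what makes fine-tuning feasible on modest consumer hardware.
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # shows how few parameters actually get trained
```

From there, the usual training loop or a ready-made trainer takes over; the point is that the hard parts are already packaged for you.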