It’s me, Smee

  • 0 Posts
  • 114 Comments
Joined 4 months ago
Cake day: March 24th, 2025

  • I’ve successfully run small-scale LLMs on my phone; slow, but very doable. I run my main AI system on an older, midrange gaming PC with no problems at all.

    Dicio is a pre-programmed assistant, which one can talk to if one has speech recognition software installed. It has a preset of tasks it can perform; in my experience it’s not really comparable to how LLMs work.


  • It very much depends on your phone hardware: RAM limits how big a model you can load, and the CPU determines how fast you’ll get replies. I’ve successfully run 4B models on my 8GB RAM phone. But since it’s the usual server-and-client setup, which needs full internet access due to the lack of granular permissions on Android (even AIO setups need open ports to connect to themselves), I prefer a proper home server. Which, with a cheap GFX card, is indescribably faster and more capable.
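
    To illustrate the server-and-client setup described above, here is a minimal sketch of how a phone client could query a self-hosted LLM on the home network. It assumes an Ollama-style `/api/generate` HTTP endpoint; the server address, port, and model name are placeholders, so adjust them to your own setup.

    ```python
    import json
    from urllib import request

    # Assumed home-server address and port (Ollama's default port is 11434);
    # replace with your own LAN address.
    SERVER = "http://192.168.1.50:11434"

    def build_request(model: str, prompt: str) -> request.Request:
        """Build a POST request for an Ollama-style local generation endpoint."""
        payload = json.dumps({
            "model": model,    # e.g. a 4B model that fits in 8 GB of RAM
            "prompt": prompt,
            "stream": False,   # ask for a single JSON reply, not a token stream
        }).encode("utf-8")
        return request.Request(
            f"{SERVER}/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )

    req = build_request("gemma:4b", "Hello from my phone!")
    # With a reachable server, the client would then do:
    #   with request.urlopen(req) as resp:
    #       print(json.loads(resp.read())["response"])
    ```

    The same request works from any client on the LAN, which is exactly why the app needs the full network permission on Android rather than a more granular one.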