• velox_vulnus@lemmy.ml · 9 months ago

    You can’t run LLMs on your phone. You can, however, host your own LLM, such as Persimmon or Llama, and use an app that connects to the server side.
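    As a sketch of that server-side approach (assuming a self-hosted llama.cpp-style server that exposes an OpenAI-compatible /v1/chat/completions endpoint; the URL and model name below are placeholders, not anything from this thread):

```python
import json
import urllib.request

# Hypothetical address of a self-hosted server (e.g. llama.cpp's
# `llama-server`), which exposes an OpenAI-compatible REST API.
SERVER_URL = "http://192.168.1.50:8080/v1/chat/completions"

def build_request(prompt: str, model: str = "llama") -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt: str) -> str:
    """POST the prompt to the self-hosted server and return the reply text."""
    body = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        answer = json.load(resp)
    return answer["choices"][0]["message"]["content"]
```

    A phone app then only needs to talk to this endpoint on your own hardware instead of a third-party cloud API.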

      • velox_vulnus@lemmy.ml · 9 months ago

        I am not an expert here, so I cannot help you extensively; I just like to play around with projects. There are already tutorials out there for running a private LLM, so you may want to refer to those.

        I’ve tried running Alpaca 7B on my laptop using Dalai, and let’s just say it toasted my device (i5 8265U, 8GB RAM, 25W MX250 2GB VRAM). I wouldn’t recommend Dalai as the front-end: the Node script is a mess, and it also requires privilege escalation to run commands. There are probably better front-ends out there.

        Persimmon 8B is a slightly better model. Then there’s also Mistral 7B, but I don’t know much about that.

        You might want to run these models using Google Cloud’s Vertex AI.

        • cheese_greater@lemmy.worldOP · edited · 9 months ago

          Can you check out quickly and comment on SideMind (iOS)?

          Edit: hope this doesn’t come across as too “do shit for me Spidey” :/

          • velox_vulnus@lemmy.ml · edited · 9 months ago

            I’ve checked the page you were talking about. It shows a bunch of different models, but that’s an illusion of choice. I think what they’re basically doing is still using ChatGPT’s API while introducing some bias into the way each model talks, using specialized prompts that probably execute before you start chatting (I think this is possible, but since I’ve never used their API, I can’t confirm it). So you’re chatting with the same bot, just with its mannerisms tweaked.

            Putting that aside, if they’re like the rest of the “AI” companies out there, they’re still using OpenAI’s service. The issue with this is that now a middleman also has access to your personal chats, in addition to OpenAI. If you want a private LLM, you must run it on a personal instance yourself.

            A better option would be to look around on GitHub for someone’s project, make sure their front-end has web-app support, and pin the web app to your iPhone’s home screen. I would urge you to check a few such projects on GitHub, for example privateGPT, GPT4All (recommended), and Dalai (not recommended), which are completely open-source.

            Edit: Yup, they’re most probably using OpenAI’s API, read their privacy policy here.

    • Asudox@lemmy.world · 9 months ago

      I’m not sure if using ChatGPT’s website is what OP is asking for, since OP said he wanted a privacy-friendly way to use LLMs such as ChatGPT.

    • pandarisu@lemmy.world · 9 months ago

      I get personal choice, but if you are looking for privacy focused, wouldn’t a browser based option give the developer less information than installing an app?

      • cheese_greater@lemmy.worldOP · edited · 9 months ago

        Aight, u talked me down lol. I suppose I can use Friendly for it too, as a springboard for things where I want the web app to feel like an app and also be a tad more optimized/slick than simple PWAs.