
Thread: Dllama - Local LLM Inference

  1. #1
    Thanks for the link. I'm afraid I won't be able to try this anyway, despite the tempting foreign-language capabilities. But I have a couple of general questions.
    Does this run on Embarcadero Delphi?
    How much disk space is needed/recommended?

  2. #2
    Quote Originally Posted by Jonax
    Thanks for the link. I'm afraid I won't be able to try this anyway, despite the tempting foreign-language capabilities. But I have a couple of general questions.
    Does this run on Embarcadero Delphi?
    How much disk space is needed/recommended?
    - Yes: Pascal (Delphi/FreePascal) and C/C++ (C++ Builder, Visual Studio 2020, Pelles C).
    - ~5 MB for the distro, and then you will need a model to use. The smallest is Phi-3, which is ~2.5 GB in size; most models that I can run in VRAM are 4-8 GB in size (see the rough disk-space sketch below).
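
    A minimal, self-contained Pascal sketch of that disk-space arithmetic (the sizes are the figures quoted above; DiskFree and Format come from SysUtils; the thresholds are illustrative assumptions and are not part of the Dllama API):

    program CheckDllamaDiskSpace;

    {$APPTYPE CONSOLE}

    uses
      SysUtils;

    var
      MB, GB: Int64;
      DistroBytes, SmallModelBytes, LargeModelBytes, FreeBytes: Int64;
    begin
      MB := 1024 * 1024;
      GB := 1024 * MB;

      DistroBytes     := 5 * MB;     // ~5 MB for the Dllama distro itself
      SmallModelBytes := 2560 * MB;  // ~2.5 GB, the Phi-3 size quoted above
      LargeModelBytes := 8 * GB;     // top of the 4-8 GB range for typical models

      // DiskFree(0) reports free bytes on the current drive (SysUtils).
      FreeBytes := DiskFree(0);
      WriteLn(Format('Free space on current drive: %.1f GB', [FreeBytes / GB]));

      if FreeBytes >= DistroBytes + LargeModelBytes then
        WriteLn('Enough room for the distro plus a large (~8 GB) model.')
      else if FreeBytes >= DistroBytes + SmallModelBytes then
        WriteLn('Enough room for the distro plus the smallest model (Phi-3, ~2.5 GB).')
      else
        WriteLn('Not enough free space for Dllama plus a model.');
    end.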
