B4J Library: Ollama4j - Pnd_Ollama4j - Your local offline LLM like ChatGPT

What is Ollama?
Ollama is a free and open-source project that lets you run various open-source LLMs locally.
For more info, check this link.

Ollama for Windows and system requirements

Download Ollama server

Top LLMs for Coding All Developers Should Know About

Short video (3 minutes) about Ollama on YouTube

When you finish the Ollama server installation, you can run the example app to download a model of your choice.
For now, you can chat from the application and download and delete models.
When you choose a model and tag, check the DOWNLOAD SIZE (bottom right in the image below); some models are over 200GB.
You can start with MODEL: mistral and TAG: 7b; the download size is 4.1GB.
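
If you prefer to pull a model from code instead of through the example app, the wrapper's PullModel function (documented in the reference below) can do it. A minimal, untested sketch of a non-UI B4J app; it assumes the Host parameter accepts the default Ollama address http://localhost:11434, that "mistral:7b" is a valid model name for the server, and that the call blocks until the download finishes:

B4X:
Sub Process_Globals
    Private Ollama As Pnd_Ollama4j
End Sub

Sub AppStart (Args() As String)
    Ollama.Initialize("Ollama", "http://localhost:11434") 'default Ollama address (assumption)
    Ollama.RequestTimeoutSeconds = 600 'pulling a 4.1GB model can take a while
    If Ollama.Ping Then
        Ollama.PullModel("mistral:7b")
        Log("Model pulled.")
    Else
        Log("Ollama server is not reachable.")
    End If
End Sub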

Hardware requirements:
RAM: 8GB for running 3B models, 16GB for running 7B models, 32GB for running 13B models.
Disk Space: 5GB for installing Ollama, plus additional space for storing model data, depending on the models you use.
CPU: any modern CPU with at least 4 cores is recommended; for running 13B models, a CPU with at least 8 cores is recommended.
GPU (optional): a GPU is not required for running Ollama, but it can improve performance, especially with larger models. If you have a GPU, you can also use it to accelerate training of custom models.


[Image: model and tag selection, with DOWNLOAD SIZE shown at the bottom right]


Chat example:
[Image: chat example]



Pnd_Ollama4j

Author: Ollama4j - B4J wrapper: Pendrush
Version: 0.10
  • Pnd_LibraryModel
    • Properties:
      • Description As String [read only]
      • LastUpdated As String [read only]
      • Name As String [read only]
      • Object As io.github.ollama4j.models.response.LibraryModel [read only]
      • PopularTags As List [read only]
      • PullCount As String [read only]
      • ToString As String [read only]
      • TotalTags As Int [read only]
  • Pnd_LibraryModelDetail
    • Properties:
      • Tags As List [read only]
  • Pnd_LibraryModelTag
    • Properties:
      • LastUpdated As String [read only]
      • Name As String [read only]
      • Size As String [read only]
      • Tag As String [read only]
      • ToString As String [read only]
  • Pnd_Model
    • Properties:
      • Model As String [read only]
      • ModelName As String [read only]
      • ModelVersion As String [read only]
      • Name As String [read only]
      • Size As Long [read only]
  • Pnd_Ollama4j
    • Events:
      • SyncStreamed (Text As String)
    • Functions:
      • DeleteModel (ModelName As String)
        Delete a model from Ollama server.
        ModelName – the name of the model to be deleted
      • GenerateAsync (Model As String, Prompt As String, Raw As Boolean) As Pnd_OllamaAsyncResult
        Model – The name or identifier of the AI model to use for generating the response.
        Prompt – The input text or prompt to provide to the AI model.
        Raw – In some cases, you may wish to bypass the templating system and provide a full prompt. In this case, you can use the raw parameter to disable templating. Also note that raw mode will not return a context.
      • GenerateSync (Model As String, Prompt As String, Raw As Boolean) As Pnd_OllamaResult
        Model – The name or identifier of the AI model to use for generating the response.
        Prompt – The input text or prompt to provide to the AI model.
        Raw – In some cases, you may wish to bypass the templating system and provide a full prompt. In this case, you can use the raw parameter to disable templating. Also note that raw mode will not return a context.
      • GenerateSyncStreamed (Model As String, Prompt As String, Raw As Boolean) As Pnd_OllamaResult
        Model – The name or identifier of the AI model to use for generating the response.
        Prompt – The input text or prompt to provide to the AI model.
        Raw – In some cases, you may wish to bypass the templating system and provide a full prompt. In this case, you can use the raw parameter to disable templating. Also note that raw mode will not return a context.
        Returns text through the event _SyncStreamed (Text As String); see the streamed-output sketch after this reference list.
      • Initialize (EventName As String, Host As String)
        Initializes the library with an event name prefix and the Ollama server host (see the usage sketches after this reference list).
      • IsInitialized As Boolean
      • LibraryModelDetails (LibraryModel As Pnd_LibraryModel) As Pnd_LibraryModelDetail
        Fetches the tags associated with a specific model from the Ollama library. The available tags are read directly from the Ollama library model page and returned as a list of LibraryModelTag objects, including the tag name, size, and the time when the tag was last updated.
        LibraryModel – the LibraryModel object containing the name of the library model whose tags should be fetched.
        Returns: a List of LibraryModelTag objects with the extracted tags and their associated metadata.
      • ListModels As List
        Lists available models from the Ollama server.
        Returns: a List of models available on the server
      • ListModelsFromLibrary As List
        Retrieves a list of models from the Ollama library.
        This method fetches the available models directly from the Ollama library page, including details such as the name, pull count, popular tags, tag count, and the time when the model was last updated.
        Returns: a List of LibraryModel
      • Ping As Boolean
        API to check the reachability of the Ollama server.
        Returns: True if the server is reachable, False otherwise.
      • PullModel (ModelName As String)
        Pull a model on the Ollama server from the list of available models.
        ModelName – the name of the model.
    • Properties:
      • RequestTimeoutSeconds As Long [write only]
      • Verbose As Boolean [write only]
  • Pnd_OllamaAsyncResult
    • Properties:
      • CompleteResponse As String [read only]
      • HttpStatusCode As Int [read only]
      • IsAlive As Boolean [read only]
      • IsDaemon As Boolean [read only]
      • IsInterrupted As Boolean [read only]
      • IsSucceeded As Boolean [read only]
      • StreamPool As String [read only]
      • ToString As String [read only]
  • Pnd_OllamaModelType
    • Fields:
      • ALFRED As String
      • ALL_MINILM As String
      • BAKLLAVA As String
      • CODEBOOGA As String
      • CODELLAMA As String
      • CODESTRAL As String
      • CODEUP As String
      • DEEPSEEK_CODER As String
      • DEEPSEEK_LLM As String
      • DOLPHIN_MISTRAL As String
      • DOLPHIN_MIXTRAL As String
      • DOLPHIN_PHI As String
      • DUCKDB_NSQL As String
      • EVERYTHINGLM As String
      • FALCON As String
      • GEMMA As String
      • GEMMA2 As String
      • GOLIATH As String
      • LLAMA2 As String
      • LLAMA2_CHINESE As String
      • LLAMA2_UNCENSORED As String
      • LLAMA3 As String
      • LLAMA3_1 As String
      • LLAMA_PRO As String
      • LLAVA As String
      • LLAVA_PHI3 As String
      • MAGICODER As String
      • MEDITRON As String
      • MEDLLAMA2 As String
      • MEGADOLPHIN As String
      • MISTRAL As String
      • MISTRAL_OPENORCA As String
      • MISTRALLITE As String
      • MIXTRAL As String
      • NEURAL_CHAT As String
      • NEXUSRAVEN As String
      • NOMIC_EMBED_TEXT As String
      • NOTUS As String
      • NOTUX As String
      • NOUS_HERMES As String
      • NOUS_HERMES2 As String
      • NOUS_HERMES2_MIXTRAL As String
      • OPEN_ORCA_PLATYPUS2 As String
      • OPENCHAT As String
      • OPENHERMES As String
      • ORCA2 As String
      • ORCA_MINI As String
      • PHI As String
      • PHI3 As String
      • PHIND_CODELLAMA As String
      • QWEN As String
      • QWEN2 As String
      • SAMANTHA_MISTRAL As String
      • SOLAR As String
      • SQLCODER As String
      • STABLE_BELUGA As String
      • STABLE_CODE As String
      • STABLELM2 As String
      • STABLELM_ZEPHYR As String
      • STARCODER As String
      • STARLING_LM As String
      • TINYDOLPHIN As String
      • TINYLLAMA As String
      • VICUNA As String
      • WIZARD_MATH As String
      • WIZARD_VICUNA As String
      • WIZARD_VICUNA_UNCENSORED As String
      • WIZARDCODER As String
      • WIZARDLM As String
      • WIZARDLM_UNCENSORED As String
      • XWINLM As String
      • YARN_LLAMA2 As String
      • YARN_MISTRAL As String
      • YI As String
      • ZEPHYR As String
  • Pnd_OllamaResult
    • Properties:
      • HttpStatusCode As Int [read only]
      • Response As String [read only]
      • ResponseTime As Long [read only]
      • ToString As String [read only]
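
To make the reference above easier to follow, here is a hedged usage sketch for the blocking calls. It assumes the server runs at the default http://localhost:11434, that ListModels returns Pnd_Model objects, and that GenerateSync returns only after the full answer is available; treat it as an illustration, not as code taken from the attached example project.

B4X:
Sub Process_Globals
    Private Ollama As Pnd_Ollama4j
End Sub

Sub AppStart (Args() As String)
    Ollama.Initialize("Ollama", "http://localhost:11434")
    Ollama.Verbose = False
    If Ollama.Ping = False Then
        Log("Ollama server is not reachable.")
        Return
    End If
    'List the models already present on the server (assumed to be Pnd_Model objects).
    For Each m As Pnd_Model In Ollama.ListModels
        Log(m.Name)
    Next
    'Blocking completion; the result object exposes Response, HttpStatusCode and ResponseTime.
    Dim Result As Pnd_OllamaResult = Ollama.GenerateSync("mistral:7b", "Why is the sky blue?", False)
    Log("HTTP status: " & Result.HttpStatusCode)
    Log(Result.Response)
End Sub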
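
And a sketch of the streamed variant. Following the usual B4X convention, the event sub name is built from the EventName passed to Initialize plus _SyncStreamed; whether the Text parameter carries incremental chunks or the accumulated response so far is not verified here.

B4X:
Sub Process_Globals
    Private Ollama As Pnd_Ollama4j
End Sub

Sub AppStart (Args() As String)
    Ollama.Initialize("Ollama", "http://localhost:11434")
    'Streams the answer through the _SyncStreamed event while the call runs,
    'then returns the usual result object.
    Dim Result As Pnd_OllamaResult = Ollama.GenerateSyncStreamed("mistral:7b", "Write a haiku about B4J.", False)
    Log("Done, HTTP status: " & Result.HttpStatusCode)
End Sub

'Raised while GenerateSyncStreamed is running ("Ollama" is the EventName set in Initialize).
Sub Ollama_SyncStreamed (Text As String)
    Log(Text)
End Sub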

The wrapper is based on Ollama4j v1.0.90, available HERE.
This wrapper is in a very early stage of development, but you can try it.
Download the library from: https://www.dropbox.com/scl/fi/ejbr...ey=77w3vc48q4re1bmw80pkw5m0p&st=hornosqn&dl=0
 

Attachments

  • Ollama4jExample.zip (8.9 KB)