Share My Creation: B4X AI Assistant

If you're not familiar with Ollama4j, first read this post to set up the Ollama server and install one of the models:

Currently, there are five options: Raw Text, Find Bugs, Refactor, Optimize, and Explain. You can add more if you want; the source code is attached to this post.
I tested several models, and for B4X code the best results came from qwen2.5-coder:14b, which are fairly satisfactory.
Since there are many different models, I haven't tested all of them, so it’s quite possible that some other model gives a better result.
A 14B model is the largest I can run on my PC (an Nvidia GTX 1080 Ti with 11 GB of VRAM and a Ryzen 3950X with 32 GB of RAM) while maintaining decent speed; larger models will probably give significantly better results.
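Each of the five options boils down to wrapping the user's code in a task-specific prompt and sending it to the local Ollama server. Here is a minimal Python sketch of that flow (the attached source is B4X; the template wording and function names below are my own assumptions, while the HTTP call uses Ollama's standard `/api/generate` endpoint on the default port 11434):

```python
import json
import urllib.request

# Hypothetical prompt prefixes for the five options; the exact wording
# in the attached B4X source may differ.
TEMPLATES = {
    "Raw Text": "{code}",
    "Find Bugs": "Find bugs in the following B4X code:\n{code}",
    "Refactor": "Refactor the following B4X code:\n{code}",
    "Optimize": "Optimize the following B4X code:\n{code}",
    "Explain": "Explain the following B4X code:\n{code}",
}

def build_prompt(option: str, code: str) -> str:
    """Fill the selected option's template with the user's code."""
    return TEMPLATES[option].format(code=code)

def ask_ollama(option: str, code: str, model: str = "qwen2.5-coder:14b") -> str:
    """Send a non-streaming generate request to a local Ollama server
    and return the model's reply text."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(option, code),
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Keeping `build_prompt` separate from the HTTP call makes it easy to add more options, as the post suggests: adding an entry to `TEMPLATES` is enough.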

Attachments

  • B4xAIAssistant.zip
    5.1 KB