The Agents vs Zombies event is going great, but our devs are running out of credits. No worries, here's my guide to running your own local AI and integrating it with OutSystems without credit limits!!!
To do this we need to achieve three things: run a local AI model, expose it as an endpoint, and connect it to OutSystems as a custom LLM.
Run a local AI model
To do this we need a model runner; Ollama is the lightest and easiest to set up.
Download and run the installer of your choice from the URL below:
https://ollama.com/download/windows
Once installed, open Ollama and select a model of your choice. I would start with gemma3:4b, as it will fit on most laptops with 8 to 16 GB of RAM.
Once selected, enter a sample query such as "just respond Yes".
If this is the first time you are using Ollama or that model, it will download and load the model for you.
Once you get a response, your model is ready for use.
Now open the sidebar and click on Settings.
Ensure that "Expose Ollama to the network" is enabled.
Now Ollama is set up and ready to be exposed as an endpoint.
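With that setting enabled, Ollama listens on port 11434 and speaks an OpenAI-compatible API. As a minimal sketch (the model ID and prompt are just the examples from this guide, and nothing is actually sent here), this is the shape of the request body you would POST to http://localhost:11434/v1/chat/completions:

```python
import json

# Minimal sketch of an OpenAI-compatible chat request for a local Ollama.
# Target endpoint (when Ollama is running): http://localhost:11434/v1/chat/completions
payload = {
    "model": "gemma3:4b",  # the model ID shown in Ollama
    "messages": [
        # the sample query from the step above
        {"role": "user", "content": "just respond Yes"}
    ],
}

body = json.dumps(payload)  # this JSON string is what goes on the wire
print(body)
```

Any HTTP client (curl, PowerShell's Invoke-RestMethod, etc.) can send this body to confirm the model answers locally before you expose it.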
Exposing an Endpoint
To do this we can use ngrok, which gives you one free endpoint for development.
Go to the URL below, sign up, and download the installer of your choice.
https://ngrok.com/download/windows
Once setup is done, go to the URL below and get your authorization token:
https://dashboard.ngrok.com/get-started/your-authtoken
Go to https://dashboard.ngrok.com/domains
Click on + New Domain to generate a free domain, then copy and save the domain in a notepad.
Now open PowerShell on Windows
and enter the command below, replacing $YOUR_AUTHTOKEN with your actual token; this registers the instance:
ngrok config add-authtoken $YOUR_AUTHTOKEN
Now, to start the tunnel, run the command below (the --host-header flag rewrites the Host header so Ollama accepts the forwarded requests):
ngrok http 11434 --host-header="localhost:11434"
If all went well, on running the command you should see a screen like the one below.
Now your LLM endpoint is ready!!
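OutSystems will need the OpenAI-compatible base path of that tunnel, which is simply your ngrok domain with /v1 appended. A tiny sketch of that rule (the domain below is a made-up placeholder, not a real endpoint):

```python
def odc_endpoint_url(ngrok_domain: str) -> str:
    """Build the OpenAI-compatible base URL from an ngrok domain."""
    return f"https://{ngrok_domain}/v1"

# Hypothetical free domain, just for illustration:
print(odc_endpoint_url("my-llm.ngrok-free.app"))
# https://my-llm.ngrok-free.app/v1
```

Use the domain you saved from the ngrok dashboard in place of the placeholder; the /v1 suffix is what makes it match Ollama's OpenAI-compatible routes.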
Connecting a Custom LLM
Go to your personal ODC environment and click on AI models
Click on Add an AI model.
Select Custom AI model.
Add your connection name and description.
Then click on Add endpoint.
Enter your endpoint details as below:
Name: any name you want.
Model ID: the model ID in Ollama; here I am using "gemma3:4b".
URL: "https://your-domain-name/v1" (replace "your-domain-name" with the ngrok domain you saved earlier).
Click on Test endpoint; you should get a success message. Now click on Save,
and then save the connection.
Congratulations, you have now connected your local LLM to OutSystems!
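If you want to sanity-check the endpoint outside ODC, responses follow the standard OpenAI chat-completions shape. A sketch using a canned response (this JSON is invented for illustration; the text your model actually returns will differ):

```python
import json

# Canned example of an OpenAI-compatible chat completion, NOT real model output.
raw = json.dumps({
    "model": "gemma3:4b",
    "choices": [{"message": {"role": "assistant", "content": "Yes"}}],
})

# Pull the assistant's reply out of the first choice:
response = json.loads(raw)
answer = response["choices"][0]["message"]["content"]
print(answer)
# Yes
```

This is the same structure OutSystems parses behind the scenes when you run the Test endpoint check.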
After this, the rest of the process is well explained by Bruno Martinh in the link below:
https://www.linkedin.com/feed/update/urn:li:activity:7389263711133315072/
Happy Hunting!!!!
Here's the PDF version.