A Guide to setting up your own Local AI and Integrating with Outsystems.

The Agents vs Zombies event is going great, but our devs are running out of credits. No worries, here's my guide to spinning up your own local AI and integrating it with OutSystems, with no credit limits!!!

To do this, we need to achieve three things:

  • Run a local AI model
  • Expose an LLM endpoint
  • Connect OutSystems to that endpoint

Run a local AI model

To do this we need a model runner. Ollama is the lightest and easiest to set up.

Download and run the installer of your choice from the URL below:

https://ollama.com/download/windows


Once installed, open Ollama and select a model of your choice. I would start with gemma3:4b, as it fits on most laptops with 8 to 16 GB of RAM.


Once selected, enter a sample query such as “Just respond Yes”.


If this is the first time you are using Ollama or that particular model, Ollama will download and load the model for you.

Once you get a response, your model is ready for use.
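
You can also verify the model from code. Here is a minimal sketch (stdlib only) that calls Ollama's local REST API on its default port 11434; the model name is the same `gemma3:4b` used above:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local API

def build_request(model: str, prompt: str) -> urllib.request.Request:
    # stream=False asks Ollama to return one complete JSON object
    # instead of a stream of partial responses
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt: str, model: str = "gemma3:4b") -> str:
    # Send the prompt to the local model and return its text response
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

With Ollama running, calling `ask("Just respond Yes")` should return the model's reply, confirming the local endpoint works before we expose it.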

Now open the sidebar and click on Settings.


Ensure that “Expose Ollama to the network” is enabled.

Now your Ollama is set up and ready to be exposed as an endpoint.


 


Exposing an Endpoint

To do this we can use ngrok, which gives you one free endpoint for development.

Go to the URL below, sign up, and download the installer of your choice.

https://ngrok.com/download/windows

Once setup is done, go to the URL below and get your authorisation token:

https://dashboard.ngrok.com/get-started/your-authtoken

Go to https://dashboard.ngrok.com/domains

Click on “+ New Domain” to generate a free domain, then copy and save your domain in a notepad.


Now open PowerShell on Windows and enter the command below, replacing $YOUR_AUTHTOKEN with your actual token. This registers the instance:

ngrok config add-authtoken $YOUR_AUTHTOKEN

Now start the tunnel with the command below. Ollama listens on port 11434, and the host-header flag rewrites the Host header so Ollama accepts the forwarded requests:

ngrok http 11434 --host-header="localhost:11434"

If you want the tunnel to use the static domain you reserved earlier instead of a random URL, add --url=your-domain-name to the command (supported in current ngrok v3 releases).


If all went well, ngrok will display its session status, including the forwarding URL for your tunnel.


Now your LLM endpoint is ready!!
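
Before moving to ODC, it is worth checking that the public URL really reaches Ollama. Ollama also serves an OpenAI-compatible API under /v1, and /v1/models lists the locally available models. A minimal sketch (the domain below is a placeholder for the one you reserved):

```python
import json
import urllib.request

# Placeholder: replace with the free static domain you reserved on ngrok
NGROK_DOMAIN = "your-domain-name.ngrok-free.app"

def models_url(domain: str) -> str:
    # Ollama's OpenAI-compatible API lives under /v1
    return f"https://{domain}/v1/models"

def list_models(domain: str = NGROK_DOMAIN) -> list[str]:
    # Fetch the model list through the public tunnel and
    # return the model IDs (e.g. "gemma3:4b")
    with urllib.request.urlopen(models_url(domain)) as resp:
        data = json.loads(resp.read())
    return [m["id"] for m in data.get("data", [])]
```

If `list_models()` returns a list containing your model ID, the tunnel and Ollama are both working end to end.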



Connecting a Custom LLM

Go to your personal ODC environment and click on AI models

 

Click on “Add an AI model”.

 


Select a custom AI model

Add your connection name and description.


Then click on “Add endpoint”.


Enter your endpoint details as below:

Name: any name you want.

Model ID: the model ID in Ollama; here I am using “gemma3:4b”.

URL: “https://your-domain-name/v1”, replacing "your-domain-name" with the domain you reserved earlier.




 


Click on “Test endpoint”; you should get a success message. Then click on Save.

 


Now save the connection.



Congratulations, you have now connected your local LLM to OutSystems!


From here, the rest of the process is well explained by Bruno Martinh in the link below:

https://www.linkedin.com/feed/update/urn:li:activity:7389263711133315072/

Happy Hunting!!!!
