AI Optimizer & Toolkit: set up a local sandbox

The Oracle AI Optimizer and Toolkit is a free and open-source tool designed to make it easier for developers and data engineers to build, benchmark, and optimize AI workflows running on Oracle Database.

It provides a modular framework for experimenting with model selection, prompt engineering, agents (tool calling), retrieval-augmented generation (RAG), and hybrid query optimization.

In this article, we’ll set up a local, containerized test environment for the AI Optimizer and Toolkit – enabling you to develop and test AI with Oracle Database without writing a single line of SQL.

Prerequisites:

  • A Docker-compatible container runtime
  • The docker-compose utility, for managing the Oracle Database Free and AI Optimizer containers
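Before continuing, you can confirm both prerequisites are on your PATH. The check_tool helper below is purely illustrative:

```shell
#!/bin/sh
# Report whether a command is available on PATH (illustrative helper).
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: missing"
  fi
}

check_tool docker
check_tool docker-compose
```

Note that recent Docker installations ship Compose as a plugin, in which case `docker compose` (no hyphen) replaces the standalone `docker-compose` binary.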

Download and build the AI Optimizer container

First, clone or download the source code from GitHub. Then, build the optimizer image from the root of the cloned repository:

git clone https://github.com/oracle/ai-optimizer.git
cd ai-optimizer
docker build -f src/Dockerfile -t ai-optimizer-aio .

Configure a minimal docker-compose.yml file for Oracle Database Free and the optimizer image:

services:
  ai-optimizer:
    image: ai-optimizer-aio:${TAG:-latest}
    container_name: ai-optimizer
    ports:
      - 8501:8501

  oraclefree:
    image: gvenzl/oracle-free:23.26.0-slim-faststart
    container_name: oraclefree
    ports:
      - 1521:1521
    environment:
      - ORACLE_PASSWORD=Welcome12345
    volumes:
      - ./oracle:/container-entrypoint-initdb.d
    healthcheck:
      test: ["CMD-SHELL", "lsnrctl status | grep READY"]
      interval: 15s
      timeout: 10s
      retries: 5
      start_period: 30s
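Optionally, you can make the optimizer container wait until the database passes its health check before starting, by adding a depends_on condition to the ai-optimizer service. This is a sketch of the long-form depends_on syntax, not part of the stock compose file:

```yaml
services:
  ai-optimizer:
    image: ai-optimizer-aio:${TAG:-latest}
    container_name: ai-optimizer
    ports:
      - 8501:8501
    depends_on:
      oraclefree:
        condition: service_healthy
```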

Create an oracle directory and place the following grant_permissions.sql script inside it:

-- Set as appropriate for your database.
alter session set container = freepdb1;

create user testuser identified by testpwd;
grant create session to testuser;
grant unlimited tablespace to testuser;
grant connect, resource to testuser;
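Once the containers are up, you can confirm the grants took effect by connecting as testuser to freepdb1 (for example, with sqlplus inside the oraclefree container) and listing the session privileges:

```sql
-- Connected as testuser against freepdb1:
select * from session_privs;
-- Expect CREATE SESSION plus the privileges bundled in CONNECT and RESOURCE.
```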

Then, start the containers with docker-compose:

docker-compose up -d
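The database container takes a little while to pass its health check on first start. A small curl-based polling helper (hypothetical, not part of the toolkit) can tell you when the Streamlit UI is reachable:

```shell
#!/bin/sh
# Poll a URL until curl succeeds, up to a given number of attempts.
wait_for_url() {
  url=$1
  attempts=${2:-30}
  i=1
  while [ "$i" -le "$attempts" ]; do
    if curl -fsS -o /dev/null "$url" 2>/dev/null; then
      echo "up: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 2
  done
  echo "timeout: $url"
  return 1
}

# After `docker-compose up -d`:
# wait_for_url http://localhost:8501 && echo "UI is ready"
```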

Once the containers start, navigate to the AI Optimizer and Toolkit Streamlit UI in your browser at http://localhost:8501/config:

[Image: AI Optimizer and Toolkit configuration settings interface with options for client settings, database settings, and download settings.]

Configure the AI Optimizer with Oracle Database Free

We’ll now configure the optimizer container so it’s connected to the Oracle Database Free container.

[Image: AI Optimizer and Toolkit UI showing the configuration settings for the database connection.]

From the optimizer config window, select the “Databases” tab and enter the following information:

  • Database User: testuser
  • Database Password: testpwd
  • Database Connect String: oraclefree:1521/freepdb1

Then, click “Save Database”. You should see the message “Current Status: Connected” displayed below the database configuration.

[Image: Database configuration interface showing current database settings, user credentials, and connection status.]

The database container is now ready to use for AI development! Check out the next article, Connect LLMs and use your data, for an example of LLM configuration and RAG with private data.

Responses

  1. jai

    Thanks for sharing this. I am able to add the database, but I get an error while enabling the llama3.1 model, although it's running locally on port 11434. The error message is: "Failed to edit model: Model: Unable to update. API URL is inaccessible." But when I check it using curl, it says the model is running.

    1. Anders Swanson

      Hi Jai,

      Are you running Llama3.1 outside the docker network where the AI Optimizer is running? If so, you could try connecting via host.docker.internal:11434. Alternatively, you could:
      A) run the AI Optimizer stand-alone (not in a container), OR
      B) run Llama3.1 as a container from the provided docker-compose.yml script.

      Hope this helps!

      As a side note, I am planning to create a follow-up post that demonstrates how to connect to various LLMs from the sandbox environment described here.

      1. Jai

        Yeah, llama3.1 is running on my Mac and the optimiser in the Docker network. I'll try your recommendation and see. Thanks again.
