
Workflows

Let flows do the work.

Workflows Setup

You have several options for how you want to implement the Agents Docker containers:

  • WSL + Docker Desktop on Windows - deploy containers in Windows

  • WSL + Docker - deploy containers in a Linux OS running as a subsystem on Windows

  • Linux + Docker - deploy containers on Linux

  • MacOS + Docker - deploy containers on MacOS

To prevent conflicts, the AI stack containers will be deployed on WSL + Docker Desktop on Windows.
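
Before deploying anything, it is worth a quick sanity check that the Docker CLI inside WSL is talking to the Docker Desktop engine; a minimal sketch (exact output wording varies by Docker Desktop version):

# Confirm your distribution runs under WSL 2 and Docker Desktop is reachable
wsl.exe -l -v
docker version
docker info --format '{{.OperatingSystem}}'   # typically reports "Docker Desktop"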

The AI stack is made up of the following containers:

  • Ollama - local LLM (Large Language Model) serving with both CPU and GPU options

  • A web interface for interacting with Ollama models

  • A vector database for semantic search capabilities

  • A database for storing n8n workflows and data

  • n8n - a workflow automation platform that can connect to various services

  • A low-code platform for building AI workflows
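
Once the stack files are in place (the next steps copy them over), you can confirm exactly which services and images the Compose file defines; run this from the GenAI directory:

# List the services and images declared in docker-compose.yml
docker compose config --services
docker compose config --images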

  1. Create a GenAI directory and change into it.

cd
mkdir ~/GenAI
cd ~/GenAI
  2. Copy the GenAI stack files.

cd
cd ~/GenAI
# Copy everything, including hidden files ("/." avoids matching ".." the way a ".*" glob would)
cp -rvpi ~/Workshop--LLM/GenAI/. .
ls -al
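
As a quick check that the copy worked, you can confirm the four stack files listed in the next section are present; a small sketch that only tests for those file names:

# Verify the expected stack files were copied across
for f in docker-compose.yml .env.template setup.sh configure-gpus.sh; do
  [ -e "$f" ] && echo "OK       $f" || echo "MISSING  $f"
done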

Deploy AI Stack

  1. Check you have the required resources.

watch nvidia-smi
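If you prefer a one-shot readout to the live watch view, nvidia-smi can print just the essentials:

# One-shot summary of each GPU's name, memory and utilisation
nvidia-smi --query-gpu=index,name,memory.total,memory.used,utilization.gpu --format=csv
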
  2. Deploy containers.


To make life easier, in the GenAI directory you will find four files:

  • Docker Compose configuration as docker-compose.yml

  • Environment template as .env.template

  • Setup script as setup.sh

  • GPU helper as configure-gpus.sh

The setup script will:

✅ Check for Docker and Docker Compose

✅ Install Ollama if needed

✅ Create all necessary directories

✅ Generate your .env file with secure random keys (see the sketch after this list)

✅ Create the external volume

✅ Pull all Docker images

✅ Start Ollama and pull AI models

✅ Start all services

✅ Test connections
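
For orientation, the .env generation step amounts to roughly the sketch below; the variable names here are illustrative assumptions, the real names come from .env.template and setup.sh:

# Illustrative only - the actual variable names live in .env.template / setup.sh
cp .env.template .env
echo "POSTGRES_PASSWORD=$(openssl rand -hex 16)" >> .env    # assumed variable name
echo "N8N_ENCRYPTION_KEY=$(openssl rand -hex 32)" >> .env   # assumed variable name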

  1. Run the automated setup.

cd
cd ~/GenAI

# Make scripts executable
chmod +x setup.sh configure-gpus.sh

# Run the main setup (this does everything automatically)
./setup.sh
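When setup.sh finishes, a quick way to confirm Ollama is serving and that models were pulled (11434 is Ollama's default port; adjust if your .env maps it differently):

# Confirm Ollama is up and list the models the setup pulled
ollama list
curl -s http://localhost:11434/api/tags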

  2. Configure your GPUs (optional).

If you have multiple GPUs:

# Run the GPU configuration helper
./configure-gpus.sh

This will show your GPUs and let you choose an allocation strategy.
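
To confirm a container actually sees the GPUs you allocated, run nvidia-smi inside it; "ollama" below is an assumed service name, so substitute whatever your docker-compose.yml calls it:

# Check GPU visibility from inside a container
# "ollama" is an assumed service name - use the one defined in docker-compose.yml
docker compose exec ollama nvidia-smi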

  3. Check in Docker Desktop that the containers are up and running.
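
If you prefer the command line to the Docker Desktop UI, the same check from the GenAI directory looks like this:

# CLI equivalent: list the stack's containers with their status and ports
docker compose ps
docker ps --format 'table {{.Names}}\t{{.Status}}\t{{.Ports}}'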

