A Practical Guide to Running an Agentic AI Assistant with Oracle AI Database
Key Takeaways
- PicoOraClaw is a lightweight, offline AI assistant with local inference via Ollama.
- Oracle AI Database stores sessions, memories, transcripts, prompts, and state with durable ACID-backed persistence.
- Semantic recall happens in the database using ONNX embeddings and vector search, removing the need for an external embedding API.
- The same project runs locally for development and can move to Oracle Cloud Infrastructure (OCI) when you need a managed deployment.
Local-First AI Assistant with Built-In Memory
If you want to build an AI assistant that runs locally, retains meaningful context, and can move to the cloud without rearchitecting the stack, PicoOraClaw is a strong starting point. It pairs a lightweight Go runtime with local inference via Ollama and uses Oracle AI Database as the persistent memory layer.
This matters for developers building edge AI systems, private assistants, or local-first prototypes. Instead of stitching together separate services for storage, embeddings, and retrieval, you can keep memory, state, and semantic recall within Oracle AI Database while still running a lightweight local runtime.
What is PicoOraClaw?
PicoOraClaw is a fork of PicoClaw that keeps the runtime lightweight, uses Ollama as the default inference backend, and adds Oracle AI Database for persistent memory and state.
PicoClaw is an independent open-source project initiated by Sipeed, written entirely in Go from scratch — not a fork of OpenClaw, NanoBot, or any other project.
The result is a developer-friendly architecture for assistants that retain meaningful context and retrieve it semantically, rather than relying on keyword matching. PicoOraClaw targets use cases such as edge AI, IoT, private assistants, and local-first developer workflows, where a small footprint and persistent context matter more than a cloud-only approach. See the PicoOraClaw repository.
Features
- Lightweight Go runtime for local and edge-friendly assistant workflows
- Oracle AI Database-backed memory, state, and semantic recall
- Ollama as the default local inference backend
- Support for multiple LLM providers including OpenRouter, Anthropic, OpenAI, Gemini, DeepSeek, Groq, and Zhipu
- Default: Oracle AI Database Free with Oracle AI Vector Search for semantic memory
- Optional Autonomous AI Database path for managed cloud deployment
- Graceful file-based fallback when Oracle is unavailable
Why Choose PicoOraClaw vs. Standard PicoClaw?
If you’re already familiar with PicoClaw, PicoOraClaw adds a more complete memory layer for developers who need durable context and semantic recall.
- Oracle AI Database as the persistent backend for memories, sessions, transcripts, state, notes, prompts, and configuration
- In-database ONNX embeddings and vector search for semantic memory using VECTOR_EMBEDDING() and VECTOR_DISTANCE()
- Ollama as the default local LLM backend with no cloud dependency
- One-click OCI deployment with Oracle AI Database Free, Ollama, and the PicoOraClaw gateway
- Optional OCI Generative AI integration through the included oci-genai proxy
- oracle-inspect CLI support for inspecting what the assistant stores without writing SQL
Features of PicoOraClaw
What PicoOraClaw enables:
- Unified Memory Core – PicoOraClaw uses Oracle AI Database to store sessions, transcripts, notes, prompts, configuration, and long-term memories in a single persistent system. The database is the memory substrate for long-running, context-aware assistant behavior.
- Build Fast with Modern APIs – Get started locally with a lightweight runtime, Ollama for local inference, and Oracle AI Database Free for semantic memory.
- A Robust Scaling Path – Start locally, keep the same overall architecture, and move to OCI later when you need a managed environment.
Installation – Quick Start (in 5 minutes!)
For the fastest path to a working setup, use the PicoOraClaw one-command installer. It clones, configures, and runs the application in a single step:
curl -fsSL https://raw.githubusercontent.com/oracle-devrel/oracle-ai-developer-hub/refs/heads/main/apps/picooraclaw/install.sh | bash
To control the workspace path, clone the Oracle DevRel repository directly and build from the PicoOraClaw app directory.
Follow the steps below:
Prerequisites
- Go 1.24+
- Ollama
- Docker (for Oracle Database Free)
Step 1: Build
Clone the Oracle DevRel repository, navigate to the PicoOraClaw app folder, and build the binary:
git clone https://github.com/oracle-devrel/oracle-ai-developer-hub.git
cd oracle-ai-developer-hub/apps/picooraclaw
make build
Step 2: Initialize
Initialize the application so it creates the local configuration and working directories:
./build/picooraclaw onboard
Step 3: Start Ollama and pull a model
Ollama is the default and recommended LLM backend for private local inference with no API keys and no cloud dependency.
# Install Ollama if needed: https://ollama.com/download
ollama pull qwen3:latest
Step 4: Configure for Ollama
Edit ~/.picooraclaw/config.json so PicoOraClaw points at your Ollama instance and model:
{
"agents": {
"defaults": {
"provider": "ollama",
"model": "qwen3:latest",
"max_tokens": 8192,
"temperature": 0.7
}
},
"providers": {
"ollama": {
"api_key": "",
"api_base": "http://localhost:11434/v1"
}
}
}
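After editing, it is worth confirming the file is still valid JSON before starting the assistant. A quick sanity check, assuming python3 is on your PATH (the config path comes from the step above):

```shell
# Validate the edited config; a stray comma or quote is the most common mistake.
python3 -m json.tool ~/.picooraclaw/config.json > /dev/null 2>&1 \
  && echo "config OK" \
  || echo "config has a JSON error"
```

If the check reports an error, re-open the file and look for a trailing comma or an unclosed brace near your last edit.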
Step 5: Test semantic memory
Once the binary, config, and model are ready, start the assistant and test local conversations:
# One-shot
./build/picooraclaw agent -m "Hello!"
# Interactive mode
./build/picooraclaw agent
At this stage, you have a working local AI assistant with no cloud dependency.
The default LLM backend is Ollama; an optional alternative uses OCI-hosted models. See oci-genai/README.md for related documentation.
The oci-genai module provides OCI Generative AI as an optional backend for PicoOraClaw. It runs a local OpenAI-compatible proxy that authenticates with OCI using your ~/.oci/config credentials and forwards requests to the OCI GenAI inference endpoint.
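Because both Ollama and the oci-genai proxy expose an OpenAI-compatible API, PicoOraClaw can talk to either through the same request shape. The sketch below builds and locally validates such a request; the model name and port are assumptions carried over from the Ollama config above, and the proxy's own port may differ (see oci-genai/README.md):

```shell
# Build an OpenAI-compatible chat request and sanity-check the JSON locally.
cat > /tmp/chat.json <<'EOF'
{
  "model": "qwen3:latest",
  "messages": [{"role": "user", "content": "Hello!"}]
}
EOF
python3 -m json.tool /tmp/chat.json > /dev/null && echo "payload OK"
# To send it to a running backend (Ollama shown; swap the base URL for the proxy):
#   curl -s http://localhost:11434/v1/chat/completions \
#        -H 'Content-Type: application/json' -d @/tmp/chat.json
```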
Deploying to Oracle Cloud (one-click procedure)
Click here to deploy to Oracle Cloud
This deployment provisions:
- an OCI Compute instance
- Ollama with a model preloaded for CPU inference
- Oracle AI Database Free by default, with an optional Autonomous AI Database path
- the PicoOraClaw gateway as a systemd service
You can start locally, keep the same overall architecture, and move to OCI when you need a managed environment.
After deployment, use these commands to verify setup, start chatting, and check gateway health:
# Check setup progress
ssh opc@<public_ip> -t 'tail -f /var/log/picooraclaw-setup.log'
# Start chatting
ssh opc@<public_ip> -t picooraclaw agent
# Check gateway health
curl http://<public_ip>:18790/health
Adding Oracle AI Vector Search
Oracle AI Database provides persistent storage, semantic memory and recall, and crash-safe ACID transactions, with an optional file-based storage mode.
Simply run the setup script:
./scripts/setup-oracle.sh [optional-password]
This script performs the following steps:
- Pulls and starts the Oracle AI Database Free container
- Waits for the database to be ready
- Creates the picooraclaw database user with the required grants
- Patches ~/.picooraclaw/config.json with the Oracle connection settings
- Runs picooraclaw setup-oracle to initialize the schema and load the ONNX embedding model
This step gives the assistant durable semantic memory. Instead of relying on local files or ephemeral process state, PicoOraClaw persists and retrieves meaning-based context directly through Oracle AI Vector Search.
Expected output when setup is complete:
── Step 4/4: Schema + ONNX model ─────────────────────────────────────────
Running picooraclaw setup-oracle...
✓ Connected to Oracle AI Database
✓ Schema initialized (8 tables with PICO_ prefix)
✓ ONNX model 'ALL_MINILM_L12_V2' already loaded
✓ VECTOR_EMBEDDING() test passed
✓ Prompts seeded from workspace
════════════════════════════════════════════════════════
Oracle AI Database setup complete!
Test with:
./build/picooraclaw agent -m "Remember that I love Go"
./build/picooraclaw agent -m "What language do I like?"
./build/picooraclaw oracle-inspect
════════════════════════════════════════════════════════
Test semantic memory
Use the following commands to test semantic memory:
# Store a fact
./build/picooraclaw agent -m "Remember that my favorite language is Go"
# Recall by meaning (not keywords)
./build/picooraclaw agent -m "What programming language do I prefer?"
The second command finds the stored memory via cosine similarity on 384-dimensional vectors rather than exact keyword matching.
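For intuition, cosine similarity scores how closely two embedding vectors point in the same direction, independent of their length. A toy calculation can be sketched in awk, using made-up 3-dimensional vectors in place of the real 384-dimensional embeddings:

```shell
# Two toy "embeddings" for semantically similar texts; real vectors have 384 dims.
awk 'BEGIN {
  split("0.9 0.1 0.2", a); split("0.8 0.2 0.1", b)
  for (i = 1; i <= 3; i++) { dot += a[i]*b[i]; na += a[i]^2; nb += b[i]^2 }
  # cosine = dot(a,b) / (|a| * |b|); 1.0 means identical direction
  printf "cosine similarity: %.3f\n", dot / (sqrt(na) * sqrt(nb))
}'
```

Values near 1.0 indicate similar meaning; the percentage match figures that oracle-inspect reports reflect the same kind of similarity score.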
Inspecting Oracle Data with oracle-inspect
A useful operational feature is oracle-inspect, a CLI tool that lets you inspect stored data without writing SQL.
picooraclaw oracle-inspect [table] [options]
These are the tables:
memories, sessions, transcripts, state, notes, prompts, config, meta
These are the options:
- -n <limit> — max rows (default 20)
- -s <text> — semantic search (memories only)
To list all memories:
./build/picooraclaw oracle-inspect memories
You can also perform semantic search over memories:
./build/picooraclaw oracle-inspect memories -s "what does the user like to program in"
This is a meaningful developer benefit. Oracle-backed memory is inspectable, debuggable, and operationally visible. You can understand what the assistant stores without building a separate admin layer.
Overview dashboard
Run the following command to view an overview dashboard of stored data:
./build/picooraclaw oracle-inspect
Running the command with no arguments gives you a summary view across tables, recent memory entries, transcripts, sessions, state, notes, prompts, and schema metadata.
=============================================================
PicoOraClaw Oracle AI Database Inspector
=============================================================
Table Rows
───────────────────── ────
Memories 20 ████████████████████
Sessions 4 ████
Transcripts 6 ██████
State 8 ████████
Daily Notes 3 ███
Prompts 4 ████
Config 2 ██
Meta 1 █
───────────────────── ────
Total 48
Tip: Run 'picooraclaw oracle-inspect <table>' for details
Run 'picooraclaw oracle-inspect memories -s "query"' for semantic search
List all memories
Run the following command to list all stored memories:
./build/picooraclaw oracle-inspect memories
All Memories
─────────────────────────────────────────────────────────
ID: faffd019 Vector: yes
Created: 2026-02-19 04:13 Importance: 0.9 Category: preference Accessed: 0x
Content: User prefers Oracle Database as the primary database. They work at Oracle
and prefer Oracle AI Vector Search for embeddings.
ID: 0e39036f Vector: yes
Created: 2026-02-19 04:13 Importance: 0.8 Category: preference Accessed: 0x
Content: Go is the user's primary programming language. They use Go 1.24 and target
embedded Linux devices (RISC-V, ARM64, x86_64).
Semantic search over memories
The following example shows how to perform semantic search over stored memories:
./build/picooraclaw oracle-inspect memories -s "what does the user like to program in"
Semantic Search: "what does the user like to program in"
─────────────────────────────────────────────────────────
[ 61.3% match] ID: 383ff5d3
Created: 2026-02-16 06:13 Importance: 0.7 Category: preference Accessed: 0x
Content: I prefer Python and Go for programming
[ 60.7% match] ID: 0e74a94c
Created: 2026-02-18 02:20 Importance: 0.7 Category: preference Accessed: 0x
Content: my favorite programming language is Go
For deeper inspection of sessions, transcripts, notes, config, prompts, and schema metadata, see the PicoOraClaw app in the Oracle DevRel repository.
Inspect sessions
You can inspect stored chat sessions using the following command:
./build/picooraclaw oracle-inspect sessions
Chat Sessions
─────────────────────────────────────────────────────────
Session: discord:dev-channel
Created: 2026-02-19 04:13 Updated: 2026-02-19 04:13 Messages size: 673 bytes
Session: cli:default
Created: 2026-02-16 06:12 Updated: 2026-02-18 06:07 Messages size: 2848 bytes
Inspect agent state
Inspect the agent’s stored state:
./build/picooraclaw oracle-inspect state
Agent State (Key-Value)
─────────────────────────────────────────────────────────
agent_mode = interactive
last_channel = cli
last_model = gpt-4o-mini
total_conversations = 42
user_name = jasperan
How Oracle Storage Works
The remember tool stores text along with a vector embedding using VECTOR_EMBEDDING(ALL_MINILM_L12_V2 USING :text AS DATA). The recall tool then uses VECTOR_DISTANCE() for cosine similarity search.
Independent of the storage backend, PicoOraClaw supports the following LLM providers:
- Ollama
- OpenRouter
- Zhipu
- Anthropic
- OpenAI
- Gemini
- DeepSeek
- Groq
PicoOraClaw also supports OCI Generative AI as an optional LLM backend for enterprise models via the included oci-genai proxy.
CLI Reference
The following commands cover the core PicoOraClaw workflows:
- picooraclaw onboard — initialize config and workspace
- picooraclaw agent -m "..." — one-shot chat
- picooraclaw agent — interactive chat mode
- picooraclaw setup-oracle — initialize Oracle schema and ONNX model
- picooraclaw oracle-inspect — inspect data stored in Oracle AI Database
- picooraclaw oracle-inspect memories -s "query" — semantic search over stored memories
- picooraclaw gateway — start the long-running service with channels enabled
Conclusion
PicoOraClaw is more than a lightweight assistant runtime. Combined with Oracle AI Database, it becomes a practical pattern for building assistants that retain context, retrieve facts semantically, and scale from local development to OCI without rearchitecting.
Start small, stay local, add durable semantic memory with Oracle AI Vector Search, and keep a clear path to a managed deployment model when you need it.
Frequently Asked Questions (FAQs)
What hardware do I need to run PicoOraClaw?
PicoOraClaw runs on resource-constrained environments including x86_64, ARM64, and RISC-V platforms with a very small footprint. See the project repository for exact requirements.
How does PicoOraClaw remember information?
PicoOraClaw stores memories, sessions, and related state in Oracle AI Database. It uses in-database ONNX embeddings and vector search to retrieve memory by meaning rather than exact keyword matches.
Do I need an external embedding API?
No, the Oracle-backed memory flow uses in-database embeddings.
Can I run PicoOraClaw fully offline?
Yes. Ollama as the default backend enables fully local inference, making PicoOraClaw suitable for offline or privacy-sensitive workflows.
Can I deploy PicoOraClaw to Oracle Cloud?
Yes. The OCI deployment path provisions compute, Oracle AI Database Free, Ollama, and the PicoOraClaw gateway as a systemd service, with an optional Autonomous AI Database path. Deploy here.
Which LLM providers are supported?
Ollama (default), OpenRouter, Zhipu, Anthropic, OpenAI, Gemini, DeepSeek, Groq, and optional OCI Generative AI integration through the included proxy. See the PicoOraClaw repository for details.
