
Installation

This guide covers different installation methods for the S3 Documentation MCP server.

Before installing, ensure you have:

  • S3-compatible storage with credentials (AWS S3, MinIO, Scaleway, etc.)
  • One of the following embedding providers:
      • Ollama (installed locally), or
      • an OpenAI API key

Choose your preferred embedding provider:

Option 1: Ollama

Recommended for: Local development, offline usage, privacy-conscious deployments

  1. Install Ollama from https://ollama.ai
  2. Pull the embedding model:
    ```shell
    ollama pull nomic-embed-text
    ```

Pros:

  • ✅ Free, unlimited usage
  • ✅ All data stays local
  • ✅ Works offline
  • ✅ Fast local API calls

Cons:

  • ⚠️ Requires local resources (CPU/GPU)
  • ⚠️ Slightly lower accuracy than cloud models
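Once the model is pulled, you can confirm that Ollama is reachable and the model responds before wiring it into the server. This is a quick sketch assuming Ollama's default port (11434) and its REST embeddings endpoint:

```shell
# List models available locally -- nomic-embed-text should appear
curl -s http://localhost:11434/api/tags

# Request a test embedding from the pulled model
curl -s http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
```

If the second call returns a JSON object containing an `embedding` array, the model is ready to use.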

Option 2: OpenAI

Recommended for: Production deployments, multilingual content, maximum accuracy

  1. Get an API key from OpenAI Platform
  2. Add credits to your account (very affordable: ~$0.00002/1K tokens)

Pros:

  • ✅ State-of-the-art accuracy
  • ✅ Excellent multilingual support
  • ✅ No local resources needed
  • ✅ Fast API responses

Cons:

  • ⚠️ Requires API key and credits
  • ⚠️ Data sent to OpenAI servers
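At that rate, even a large documentation set is cheap to embed. A back-of-the-envelope estimate (the 5-million-token corpus size is purely illustrative):

```shell
# Estimated cost at ~$0.00002 per 1K tokens (rate quoted above)
# for an illustrative 5-million-token documentation corpus:
awk 'BEGIN { printf "$%.2f\n", (5000000 / 1000) * 0.00002 }'
# prints $0.10
```

In other words, embedding millions of tokens costs cents, though re-indexing frequently will multiply that accordingly.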

Docker Installation

Pull the official image from Docker Hub:

```shell
docker pull yoanbernabeu/s3-doc-mcp:latest
```
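If you prefer plain `docker run` over Compose, an equivalent invocation might look like the following sketch (the port, env file, volume, and restart policy mirror the Compose setup described on this page; adjust paths to your environment):

```shell
# Run the container directly, mirroring the Compose configuration:
# expose port 3000, load credentials from .env, persist ./data,
# and point at a host-side Ollama instance.
docker run -d --name s3-doc-mcp \
  -p 3000:3000 \
  --env-file .env \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v "$(pwd)/data:/app/data" \
  --restart unless-stopped \
  yoanbernabeu/s3-doc-mcp:latest
```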

Or build from source:

```shell
git clone https://github.com/yoanbernabeu/S3-Documentation-MCP-Server.git
cd S3-Documentation-MCP-Server
docker build -t s3-doc-mcp .
```

Create a docker-compose.yml file or use the provided one:

```yaml
version: '3.8'
services:
  s3-doc-mcp:
    image: yoanbernabeu/s3-doc-mcp:latest
    container_name: s3-doc-mcp
    ports:
      - "3000:3000"
    env_file:
      - .env
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    volumes:
      - ./data:/app/data
    restart: unless-stopped
```

Run with:

```shell
docker compose up -d
```

Local Installation

Clone the repository:

```shell
git clone https://github.com/yoanbernabeu/S3-Documentation-MCP-Server.git
cd S3-Documentation-MCP-Server
```

Install dependencies:

```shell
npm install
```

Build and start:

```shell
# Production
npm run build
npm start

# Development
npm run dev
```

Configuration

After installation, configure your environment variables:

  1. Copy the example configuration:

    ```shell
    cp env.example .env
    ```
  2. Edit .env with your S3 credentials and settings

  3. See the Environment Variables page for detailed configuration options.
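As a sketch, a minimal `.env` might look like the following. Only `OLLAMA_BASE_URL` and `S3_ENDPOINT` appear verbatim on this page; the other variable names are illustrative placeholders, so consult `env.example` and the Environment Variables page for the authoritative keys:

```shell
# Illustrative .env sketch -- variable names other than OLLAMA_BASE_URL
# and S3_ENDPOINT are placeholders; check env.example for the real keys.
S3_ENDPOINT=https://s3.example.com      # required for non-AWS providers
S3_BUCKET=my-docs-bucket                # placeholder name
S3_REGION=us-east-1                     # placeholder name
S3_ACCESS_KEY=your-access-key           # placeholder name
S3_SECRET_KEY=your-secret-key           # placeholder name
OLLAMA_BASE_URL=http://localhost:11434  # host.docker.internal:11434 under Docker
```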

Verify the Installation

Test that your server is running:

```shell
curl http://localhost:3000/health
```

You should see a successful health check response.
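In scripts or CI you may want the check to fail loudly rather than print a response. curl's `-f` flag makes it exit non-zero on HTTP errors, so a sketch like this works as a gate:

```shell
# -f: fail on HTTP errors, -s: silent, -S: still show errors
curl -fsS http://localhost:3000/health || echo "server not healthy"
```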

Troubleshooting

Server won't start:

  • Check logs: docker logs s3-doc-mcp
  • Verify the .env file exists and contains valid credentials
  • Ensure port 3000 is not already in use

Ollama connection errors:

  • Verify Ollama is running: ollama list
  • Check the OLLAMA_BASE_URL in your .env file
  • For Docker, use http://host.docker.internal:11434
  • For a local installation, use http://localhost:11434

S3 connection errors:

  • Verify your S3 credentials are correct
  • Check the bucket name and region
  • For non-AWS S3, ensure S3_ENDPOINT is set correctly
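One quick way to isolate S3 credential problems is to test bucket access outside the server entirely, for example with the AWS CLI. This is a sketch: the bucket name and endpoint are placeholders, and `--endpoint-url` is only needed for MinIO, Scaleway, and other non-AWS providers:

```shell
# List the bucket to confirm credentials and endpoint are valid
# (bucket and endpoint values below are placeholders)
aws s3 ls s3://my-docs-bucket --endpoint-url https://s3.example.com
```

If this listing fails, the problem lies with your credentials or endpoint rather than with the MCP server itself.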