Installation
This guide covers different installation methods for the S3 Documentation MCP server.
Prerequisites
Before installing, ensure you have:
Required
- S3-compatible storage with credentials (AWS S3, MinIO, Scaleway, etc.)
- One of the following:
  - Docker (recommended) - Install Docker
  - Node.js >= 18 - Install Node.js
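As a quick sanity check, a sketch like the following can confirm one of the two runtimes is available (the `node_major` helper is illustrative, not part of the project):

```sh
# Check for Docker, or failing that a Node.js >= 18 (helper is illustrative).
node_major() {
  # "v18.19.0" -> 18
  echo "${1#v}" | cut -d. -f1
}

if command -v docker >/dev/null 2>&1; then
  echo "Docker found: $(docker --version)"
elif command -v node >/dev/null 2>&1 && [ "$(node_major "$(node --version)")" -ge 18 ]; then
  echo "Node.js $(node --version) is recent enough"
else
  echo "Install Docker or Node.js >= 18 first" >&2
fi
```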
Embedding Provider (choose one)
Choose your preferred embedding provider:
Option 1: Ollama (Local, Free)
Recommended for: Local development, offline usage, privacy-conscious deployments
- Install Ollama from https://ollama.ai
- Pull the embedding model:
```sh
ollama pull nomic-embed-text
```
Pros:
- ✅ Free, unlimited usage
- ✅ All data stays local
- ✅ Works offline
- ✅ Fast local API calls
Cons:
- ⚠️ Requires local resources (CPU/GPU)
- ⚠️ Slightly lower accuracy than cloud models
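Once the model is pulled, a one-off request to Ollama's REST API confirms it can serve embeddings. The helper below is a hypothetical sketch (not part of this project) using Ollama's `/api/embeddings` endpoint:

```sh
# Hypothetical smoke test: request one embedding from the local Ollama API.
ollama_embed_test() {
  curl -s http://localhost:11434/api/embeddings \
    -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
}
```

A JSON response containing an embedding array means the model is ready.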
Option 2: OpenAI (Cloud)
Recommended for: Production deployments, multilingual content, maximum accuracy
- Get an API key from OpenAI Platform
- Add credits to your account (very affordable: ~$0.00002/1K tokens)
Pros:
- ✅ State-of-the-art accuracy
- ✅ Excellent multilingual support
- ✅ No local resources needed
- ✅ Fast API responses
Cons:
- ⚠️ Requires API key and credits
- ⚠️ Data sent to OpenAI servers
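As with Ollama, a one-off API call can confirm the key works before configuring the server. This sketch assumes the key is exported as `OPENAI_API_KEY` and uses `text-embedding-3-small` purely as an example model (the model this server actually uses may differ):

```sh
# Hypothetical key check against OpenAI's embeddings endpoint.
openai_embed_test() {
  curl -s https://api.openai.com/v1/embeddings \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model": "text-embedding-3-small", "input": "hello world"}'
}
```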
Installation Methods
Method 1: Docker (Recommended)
Pull the official image from Docker Hub:
```sh
docker pull yoanbernabeu/s3-doc-mcp:latest
```
Or build from source:
```sh
git clone https://github.com/yoanbernabeu/S3-Documentation-MCP-Server.git
cd S3-Documentation-MCP-Server
docker build -t s3-doc-mcp .
```
Method 2: Docker Compose
Create a docker-compose.yml file or use the provided one:
```yaml
version: '3.8'

services:
  s3-doc-mcp:
    image: yoanbernabeu/s3-doc-mcp:latest
    container_name: s3-doc-mcp
    ports:
      - "3000:3000"
    env_file:
      - .env
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    volumes:
      - ./data:/app/data
    restart: unless-stopped
```
Run with:
```sh
docker compose up -d
```
Method 3: From Source
Clone the repository:
```sh
git clone https://github.com/yoanbernabeu/S3-Documentation-MCP-Server.git
cd S3-Documentation-MCP-Server
```
Install dependencies:
```sh
npm install
```
Build and start:
```sh
# Production
npm run build
npm start

# Development
npm run dev
```
Configuration
After installation, you need to configure your environment variables:
- Copy the example configuration:
  ```sh
  cp env.example .env
  ```
- Edit `.env` with your S3 credentials and settings
- See the Environment Variables page for detailed configuration options.
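For orientation, here is a hypothetical `.env` sketch. Only `S3_ENDPOINT` and `OLLAMA_BASE_URL` are key names this guide mentions; the rest are placeholders, so treat env.example and the Environment Variables page as the source of truth:

```sh
# Placeholder values; only S3_ENDPOINT and OLLAMA_BASE_URL are keys this
# guide mentions by name - check env.example for the full, real list.
S3_ENDPOINT=https://s3.example.com        # required for non-AWS providers
S3_ACCESS_KEY=your-access-key             # placeholder key name
S3_SECRET_KEY=your-secret-key             # placeholder key name
OLLAMA_BASE_URL=http://localhost:11434    # host.docker.internal:11434 under Docker
```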
Verification
Test that your server is running:
```sh
curl http://localhost:3000/health
```
You should see a successful health check response.
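Right after starting the container the server may still be booting, so a small retry loop can be handy. `wait_for` is an illustrative helper, not part of the project:

```sh
# Illustrative helper: retry a command until it succeeds or attempts run out.
wait_for() {  # usage: wait_for <attempts> <delay_seconds> <command...>
  attempts=$1; delay=$2; shift 2
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if "$@" >/dev/null 2>&1; then return 0; fi
    i=$((i + 1))
    sleep "$delay"
  done
  return 1
}

# Example: give the server up to ~30 seconds to come up.
# wait_for 30 1 curl -fsS http://localhost:3000/health && echo "server is healthy"
```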
Troubleshooting
Docker Container Won’t Start
- Check logs: `docker logs s3-doc-mcp`
- Verify the `.env` file exists and contains valid credentials
- Ensure port 3000 is not already in use
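To check whether something is already listening on port 3000, a bash-specific `/dev/tcp` probe works without extra tools (`port_in_use` is an illustrative helper; `ss -ltn` or `lsof -i :3000` are alternatives):

```sh
# Illustrative helper (bash-only): succeeds if something accepts connections
# on the given local TCP port.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

if port_in_use 3000; then
  echo "port 3000 is busy - stop the other service or change the mapping"
else
  echo "port 3000 is free"
fi
```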
Ollama Connection Issues
- Verify Ollama is running: `ollama list`
- Check the `OLLAMA_BASE_URL` in your `.env` file
- For Docker, use `http://host.docker.internal:11434`
- For a local installation, use `http://localhost:11434`
S3 Connection Issues
- Verify your S3 credentials are correct
- Check the bucket name and region
- For non-AWS S3, ensure `S3_ENDPOINT` is set correctly