How to Install SonarQube with Docker: Complete Guide for 2026
Install SonarQube with Docker and Docker Compose. Covers PostgreSQL setup, sonar-scanner config, production tips, and cloud alternatives.
SonarQube is one of the most widely deployed static analysis platforms, and Docker is the fastest way to get it running. Whether you need a local instance for personal projects or a production deployment for your engineering team, running SonarQube in Docker containers eliminates the complexity of manual Java installations, database configuration, and dependency management.
This guide covers everything from a single docker run command for quick evaluation to a full Docker Compose setup with PostgreSQL, persistent volumes, scanner configuration, and production hardening. By the end, you will have a working SonarQube instance analyzing your code and catching bugs, vulnerabilities, and code smells before they reach production.
If you are looking for a broader setup guide that also covers CI/CD integration and PR decoration, see our complete SonarQube setup guide.
Prerequisites
Before you begin, make sure the following tools are installed on your machine:
- Docker Engine version 20.10 or later (check with docker --version)
- Docker Compose version 2.0 or later (check with docker compose version)
- At least 4 GB of RAM allocated to Docker - SonarQube’s embedded Elasticsearch is memory-hungry
- 2 CPU cores minimum - Elasticsearch and the SonarQube compute engine both need processing power
On Linux hosts, you also need to set a kernel parameter before SonarQube will start:
# Required for Elasticsearch - set temporarily
sudo sysctl -w vm.max_map_count=524288
# Set permanently (survives reboots)
echo "vm.max_map_count=524288" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p
On macOS and Windows with Docker Desktop, this setting is handled automatically. You do not need to run the sysctl command.
Quick start with docker run
If you just want to try SonarQube quickly without setting up a database, a single docker run command gets you up and running in under two minutes. This approach uses SonarQube’s embedded H2 database, which is fine for evaluation but not supported for production.
docker run -d \
--name sonarqube \
-p 9000:9000 \
sonarqube:lts-community
This command pulls the SonarQube LTS Community Edition image, starts it in the background, and maps port 9000. Wait about 60 to 90 seconds for the application to initialize, then open http://localhost:9000 in your browser.
The default credentials are:
- Username: admin
- Password: admin
SonarQube will immediately prompt you to change the default password. Set a strong password and save it - you will need it later for generating scanner tokens.
To check the container logs during startup:
docker logs -f sonarqube
When you see the message SonarQube is operational, the server is ready to use.
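If you would rather script the wait than watch logs, SonarQube exposes a system status API you can poll. A minimal sketch, assuming curl is installed and port 9000 is mapped to localhost:

```shell
#!/bin/sh
# Poll SonarQube's status API until the server reports UP.
# During boot the status field can also read STARTING or DB_MIGRATION_RUNNING.
until curl -fsS http://localhost:9000/api/system/status 2>/dev/null | grep -q '"status":"UP"'; do
  echo "Waiting for SonarQube to become operational..."
  sleep 5
done
echo "SonarQube is ready at http://localhost:9000"
```

The same endpoint is handy in CI pipelines that start SonarQube as a service container and need to block until it is ready.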
To stop and remove the container later:
docker stop sonarqube && docker rm sonarqube
This quick-start approach is useful for a first look at SonarQube’s interface and features, but the data is stored inside the container and will be lost when the container is removed. For anything beyond a quick test, use Docker Compose with PostgreSQL.
Docker Compose setup with PostgreSQL
A production-ready SonarQube deployment needs an external PostgreSQL database. Docker Compose lets you define both services in a single file and manage them together. This is the recommended approach for any team or long-running installation.
Create a project directory and add a docker-compose.yml file:
mkdir sonarqube-docker && cd sonarqube-docker
Now create the docker-compose.yml with the following content:
services:
sonarqube:
image: sonarqube:lts-community
container_name: sonarqube
depends_on:
db:
condition: service_healthy
environment:
SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonarqube
SONAR_JDBC_USERNAME: sonar
SONAR_JDBC_PASSWORD: sonar_password
ports:
- "9000:9000"
volumes:
- sonarqube_data:/opt/sonarqube/data
- sonarqube_extensions:/opt/sonarqube/extensions
- sonarqube_logs:/opt/sonarqube/logs
networks:
- sonarnet
restart: unless-stopped
db:
image: postgres:16
container_name: sonarqube-db
environment:
POSTGRES_USER: sonar
POSTGRES_PASSWORD: sonar_password
POSTGRES_DB: sonarqube
volumes:
- postgresql_data:/var/lib/postgresql/data
networks:
- sonarnet
healthcheck:
test: ["CMD-SHELL", "pg_isready -U sonar -d sonarqube"]
interval: 10s
timeout: 5s
retries: 5
restart: unless-stopped
volumes:
sonarqube_data:
sonarqube_extensions:
sonarqube_logs:
postgresql_data:
networks:
sonarnet:
driver: bridge
There are several important details in this configuration:
- The depends_on block with condition: service_healthy ensures SonarQube does not start until PostgreSQL is ready to accept connections. Without this, SonarQube may crash on startup because the database is not available yet.
- Three named volumes persist SonarQube’s data, plugins, and logs across container restarts and upgrades. A fourth volume persists the PostgreSQL data.
- Both services share a dedicated bridge network (sonarnet) so SonarQube can reach PostgreSQL using the service hostname db.
- The restart: unless-stopped policy ensures both containers come back up after a host reboot.
Start the services:
docker compose up -d
Monitor the startup process:
docker compose logs -f sonarqube
After 60 to 120 seconds, you should see SonarQube is operational in the logs. Open http://localhost:9000, log in with admin/admin, and change your password when prompted.
Configuration: environment variables and volumes
SonarQube’s Docker image supports extensive configuration through environment variables. This section covers the most important ones for Docker deployments.
Essential environment variables
These variables are set in the environment block of your docker-compose.yml:
environment:
# Database connection (required for production)
SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonarqube
SONAR_JDBC_USERNAME: sonar
SONAR_JDBC_PASSWORD: sonar_password
# Web server settings
SONAR_WEB_PORT: 9000
SONAR_WEB_CONTEXT: /sonarqube # Access at /sonarqube instead of /
SONAR_WEB_HOST: 0.0.0.0
# JVM memory settings
SONAR_WEB_JAVAOPTS: "-Xmx512m -Xms128m"
SONAR_CE_JAVAOPTS: "-Xmx1024m -Xms128m"
SONAR_SEARCH_JAVAOPTS: "-Xmx512m -Xms512m"
# Telemetry (optional - disable if desired)
SONAR_TELEMETRY_ENABLE: "false"
The JVM memory options are split into three components. SONAR_WEB_JAVAOPTS controls the web server, SONAR_CE_JAVAOPTS controls the compute engine that processes analysis reports, and SONAR_SEARCH_JAVAOPTS controls the embedded Elasticsearch instance. Adjust these based on your server’s available memory and the size of the codebases you analyze.
Volume breakdown
Each SonarQube volume serves a specific purpose:
| Volume path | Purpose | What it stores |
|---|---|---|
| /opt/sonarqube/data | Analysis data | Elasticsearch indices and embedded database files |
| /opt/sonarqube/extensions | Plugins | Installed language plugins and community extensions |
| /opt/sonarqube/logs | Log files | Web server, compute engine, and Elasticsearch logs |
Never mount a volume to /opt/sonarqube/conf - the Docker image manages configuration internally through environment variables. Overriding the conf directory can cause startup failures.
Using an environment file
For cleaner configuration, move sensitive values to a .env file:
# .env
POSTGRES_USER=sonar
POSTGRES_PASSWORD=a_strong_password_here
POSTGRES_DB=sonarqube
SONAR_JDBC_URL=jdbc:postgresql://db:5432/sonarqube
Reference these in your docker-compose.yml:
environment:
SONAR_JDBC_URL: ${SONAR_JDBC_URL}
SONAR_JDBC_USERNAME: ${POSTGRES_USER}
SONAR_JDBC_PASSWORD: ${POSTGRES_PASSWORD}
Add .env to your .gitignore to keep credentials out of version control.
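To confirm that Compose actually picks up the .env values, you can render the fully interpolated configuration before starting anything (docker compose config substitutes variables and prints the resolved file):

```shell
# Print the resolved configuration; values from .env are substituted in.
# Grep for the JDBC settings to confirm they resolved to real values, not blanks.
docker compose config | grep SONAR_JDBC
```

An empty value in the output usually means a typo in the .env key or a missing .env file in the working directory.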
Running your first scan with sonar-scanner
With SonarQube running, the next step is scanning a project. SonarQube does not analyze code on its own - it receives results from a client-side scanner. The easiest option for Docker-based setups is running the scanner as a Docker container too.
Step 1: Create a project in SonarQube
- Log in to http://localhost:9000
- Click Create Project and select Manually
- Enter a project key (for example, my-app) and display name
- Click Set Up, then select Locally
- Generate an authentication token and copy it
Step 2: Create sonar-project.properties
In the root of the project you want to scan, create a sonar-project.properties file:
# Project identification
sonar.projectKey=my-app
sonar.projectName=My Application
sonar.projectVersion=1.0.0
# Source configuration
sonar.sources=src
sonar.sourceEncoding=UTF-8
# Exclusions
sonar.exclusions=\
**/node_modules/**,\
**/dist/**,\
**/build/**,\
**/coverage/**,\
**/*.min.js
# Coverage reports (if applicable)
sonar.javascript.lcov.reportPaths=coverage/lcov.info
Do not include sonar.host.url or sonar.token in this file - pass them as environment variables instead to avoid committing credentials.
Step 3: Run the scanner
You can run the scanner locally or via Docker.
Using the Docker-based scanner (no local install required):
docker run --rm \
-e SONAR_HOST_URL="http://host.docker.internal:9000" \
-e SONAR_TOKEN="your-token-here" \
-v "$(pwd):/usr/src" \
sonarsource/sonar-scanner-cli
The host.docker.internal hostname lets the scanner container reach the SonarQube container's published port on your host. On Linux, this hostname is not available by default; either add --network host and use http://localhost:9000, or add --add-host=host.docker.internal:host-gateway to the docker run command.
Using a locally installed scanner:
# Install on macOS
brew install sonar-scanner
# Run the scan
sonar-scanner \
-Dsonar.host.url=http://localhost:9000 \
-Dsonar.token=your-token-here
The scanner will analyze your source files, upload the results to SonarQube, and print a URL where you can view the analysis. Open the SonarQube dashboard to see metrics for bugs, vulnerabilities, code smells, coverage, and duplication.
For a deeper walkthrough of scanner options and CI/CD integration, see our complete SonarQube setup guide.
Production considerations
Running SonarQube in Docker for a team or organization requires additional hardening beyond the basic Docker Compose setup. These production considerations address memory, database tuning, reverse proxy configuration, and upgrades.
Memory and resource limits
Set explicit resource limits in your Docker Compose file to prevent SonarQube from consuming all available memory on the host:
services:
sonarqube:
image: sonarqube:lts-community
deploy:
resources:
limits:
memory: 4G
cpus: "2.0"
reservations:
memory: 2G
cpus: "1.0"
# ... rest of configuration
For the PostgreSQL container, 1 GB of memory is usually sufficient unless you have hundreds of projects:
db:
image: postgres:16
deploy:
resources:
limits:
memory: 1G
Database tuning
The default PostgreSQL configuration is conservative. For better performance with SonarQube, add tuning parameters:
db:
image: postgres:16
command:
- "postgres"
- "-c"
- "shared_buffers=256MB"
- "-c"
- "effective_cache_size=768MB"
- "-c"
- "work_mem=16MB"
- "-c"
- "maintenance_work_mem=128MB"
- "-c"
- "max_connections=100"
These values work well for a SonarQube instance serving 10 to 50 projects. Scale up shared_buffers and effective_cache_size proportionally for larger deployments.
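You can verify the overrides actually took effect by asking the running instance for its effective settings (assumes the db service name and credentials from the Compose file above):

```shell
# Query the running PostgreSQL instance for its effective configuration
docker compose exec db psql -U sonar -d sonarqube -c "SHOW shared_buffers;"
docker compose exec db psql -U sonar -d sonarqube -c "SHOW max_connections;"
```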
Reverse proxy with Nginx
For production deployments, put SonarQube behind a reverse proxy with SSL termination. Add an Nginx service to your Docker Compose file:
nginx:
image: nginx:alpine
container_name: sonarqube-proxy
ports:
- "443:443"
- "80:80"
volumes:
- ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
- ./certs:/etc/nginx/certs:ro
depends_on:
- sonarqube
networks:
- sonarnet
restart: unless-stopped
Create the nginx.conf file:
upstream sonarqube {
server sonarqube:9000;
}
server {
listen 80;
server_name sonar.yourcompany.com;
return 301 https://$server_name$request_uri;
}
server {
listen 443 ssl;
server_name sonar.yourcompany.com;
ssl_certificate /etc/nginx/certs/fullchain.pem;
ssl_certificate_key /etc/nginx/certs/privkey.pem;
client_max_body_size 50M;
location / {
proxy_pass http://sonarqube;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto https;
}
}
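Before relying on the proxy, it is worth validating the mounted configuration inside the container (assumes the nginx service name from the Compose snippet above):

```shell
# Syntax-check the mounted configuration, then reload without downtime
docker compose exec nginx nginx -t
docker compose exec nginx nginx -s reload
```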
After setting up the proxy, tell SonarQube about its public URL by adding this environment variable:
environment:
SONAR_WEB_CONTEXT: /
SONAR_CORE_SERVERBASEURL: https://sonar.yourcompany.com # the image maps this to sonar.core.serverBaseURL
Backup strategy
Back up both the PostgreSQL data and SonarQube volumes regularly:
# Backup PostgreSQL
docker exec sonarqube-db pg_dump -U sonar sonarqube > backup_$(date +%Y%m%d).sql
# Backup volumes (stop SonarQube first for consistency)
docker compose stop sonarqube
docker run --rm \
-v sonarqube-docker_sonarqube_data:/data \
-v $(pwd)/backups:/backup \
alpine tar czf /backup/sonarqube_data_$(date +%Y%m%d).tar.gz -C /data .
docker compose start sonarqube
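Backups are only useful if you can restore them. Below is a minimal restore sketch using the same container and volume names as the backup commands above; the backup file names are examples, so substitute your own dates. Note that loading a pg_dump cleanly requires recreating the database first:

```shell
#!/bin/sh
# Restore sketch - stop SonarQube so nothing writes during the restore
docker compose stop sonarqube

# Recreate the database, then load the SQL dump (example file name)
docker exec sonarqube-db psql -U sonar -d postgres -c "DROP DATABASE sonarqube;"
docker exec sonarqube-db psql -U sonar -d postgres -c "CREATE DATABASE sonarqube OWNER sonar;"
docker exec -i sonarqube-db psql -U sonar -d sonarqube < backup_20260101.sql

# Restore the data volume from the tarball (example file name)
docker run --rm \
  -v sonarqube-docker_sonarqube_data:/data \
  -v "$(pwd)/backups:/backup" \
  alpine sh -c "rm -rf /data/* && tar xzf /backup/sonarqube_data_20260101.tar.gz -C /data"

docker compose start sonarqube
```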
Upgrading SonarQube
To upgrade to a new version:
# Stop the stack
docker compose down
# Update the image tag in docker-compose.yml
# e.g., sonarqube:lts-community -> sonarqube:2025.1-community
# Pull the new image and restart
docker compose pull
docker compose up -d
SonarQube detects required database migrations when the new version starts; depending on the version, you may need to browse to http://localhost:9000/setup to confirm and trigger them. Always back up your database before upgrading. Check the SonarQube release notes for any breaking changes between versions.
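After starting the new version, you can confirm the migration finished from the command line via the system status API (the status field reads DB_MIGRATION_NEEDED or DB_MIGRATION_RUNNING while migrations are pending, and UP once the upgrade is complete):

```shell
# Check migration/readiness state after an upgrade
curl -s http://localhost:9000/api/system/status
```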
Troubleshooting common issues
SonarQube crashes on startup with “max virtual memory” error
This is the most common Docker issue. SonarQube’s embedded Elasticsearch requires vm.max_map_count to be at least 524288.
# Check the current value
sysctl vm.max_map_count
# Set it (Linux only)
sudo sysctl -w vm.max_map_count=524288
On Docker Desktop for macOS or Windows, increase the memory allocation in Docker Desktop settings to at least 4 GB.
Database connection refused
If SonarQube logs show Connection refused or FATAL: database does not exist, check these items:
- Verify PostgreSQL is running: docker compose ps db
- Confirm the database name, username, and password match between the SonarQube and PostgreSQL environment variables
- Make sure both containers are on the same Docker network
- Wait for the PostgreSQL health check to pass before SonarQube starts (use the depends_on condition shown above)
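A quick way to test the database from the host is to run the same readiness probe the health check uses, plus a real login (assumes the service name and credentials from the Compose file above):

```shell
# Probe readiness, then verify the credentials and database actually work
docker compose exec db pg_isready -U sonar -d sonarqube
docker compose exec db psql -U sonar -d sonarqube -c "SELECT 1;"
```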
SonarQube runs out of disk space
SonarQube’s Elasticsearch indices grow over time. If the Docker host runs low on disk:
# Check volume sizes
docker system df -v
# Prune unused Docker resources (caution: --volumes also deletes volumes
# not attached to any container, including a stopped stack's data)
docker system prune -a --volumes
Inside SonarQube, navigate to Administration, then Housekeeping to configure how long analysis snapshots are retained. Reducing retention from the default 30 days to 14 days significantly reduces disk usage.
Scanner cannot reach SonarQube from another container
When running the scanner as a Docker container, it cannot reach localhost on the host machine. Use one of these approaches:
- On macOS and Windows: use http://host.docker.internal:9000
- On Linux with the default bridge network: use the container’s IP address (find it with docker inspect sonarqube)
- If both containers are on the same Docker network: use the service name http://sonarqube:9000
Port 9000 is already in use
If another application is using port 9000, map SonarQube to a different host port:
ports:
- "9090:9000" # Access at http://localhost:9090
Cloud alternatives to self-hosted Docker
Running SonarQube in Docker gives you full control over your data and infrastructure, but it also means you are responsible for maintenance, updates, backups, and scaling. If you prefer a managed experience, consider these alternatives.
SonarQube Cloud
SonarQube Cloud (formerly SonarCloud) is the SaaS version of SonarQube, hosted and managed by SonarSource. It uses the same analysis engine but eliminates all infrastructure management. SonarQube Cloud is free for open-source projects and starts at $14 per month for private repositories. PR decoration and branch analysis work on all tiers, including the free plan - features that require Developer Edition ($180 per year per instance) on self-hosted SonarQube. For a detailed comparison, see our SonarQube vs SonarCloud breakdown.
CodeAnt AI
CodeAnt AI takes a different approach to code quality by combining static analysis with AI-powered code review. At $24 to $40 per user per month, CodeAnt AI provides automated PR reviews, security scanning, code quality checks, and auto-fixes with no infrastructure to manage. It connects directly to your GitHub, GitLab, or Bitbucket repositories and starts analyzing pull requests immediately. For teams that want both static analysis and AI code review in a single platform without managing Docker containers, CodeAnt AI is worth evaluating.
For a broader look at options, see our guides on SonarQube alternatives and free SonarQube alternatives. If pricing is a key factor, our SonarQube pricing breakdown covers the full cost picture across all editions.
Conclusion
Docker is the most reliable way to deploy SonarQube. A single docker run command gets you started in minutes, and a Docker Compose file with PostgreSQL gives you a production-grade setup that persists data, handles restarts, and scales with your team.
Here is a summary of what we covered:
- Quick start - a one-liner docker run command for evaluation
- Docker Compose - a complete setup with PostgreSQL, named volumes, and health checks
- Configuration - environment variables, JVM tuning, and volume management
- Scanner setup - running sonar-scanner via Docker or local install
- Production hardening - resource limits, database tuning, Nginx reverse proxy, backups, and upgrades
- Troubleshooting - solutions for the most common Docker-related SonarQube issues
For teams that want to avoid the operational overhead of self-hosting entirely, SonarQube Cloud and CodeAnt AI ($24-$40/user/month) offer managed alternatives with no Docker infrastructure required.
Once your SonarQube instance is up and running, the next step is integrating it into your CI/CD pipeline. Our complete SonarQube setup guide covers GitHub Actions integration, quality gates, PR decoration, and branch protection in detail.
Frequently Asked Questions
How do I install SonarQube with Docker?
Pull the official SonarQube image with 'docker pull sonarqube:lts-community' and run it with 'docker run -d --name sonarqube -p 9000:9000 sonarqube:lts-community'. For production use, pair it with a PostgreSQL database using Docker Compose. The web interface is available at http://localhost:9000 with default credentials admin/admin.
What is the best Docker image for SonarQube?
Use sonarqube:lts-community for production stability. The LTS tag receives long-term support with bug fixes and security patches. The community tag is the free, open-source edition. For the latest features, use sonarqube:latest-community, but LTS is recommended for production deployments.
Does SonarQube Docker require PostgreSQL?
SonarQube ships with an embedded H2 database that works for quick evaluation, but H2 is not supported for production. For any real usage, you must connect SonarQube to an external PostgreSQL database (versions 13 through 16 are supported). Docker Compose makes it easy to run both containers together.
How much memory does SonarQube Docker need?
Allocate at least 4 GB of RAM to Docker for SonarQube. The application itself uses around 2 GB, and its embedded Elasticsearch instance requires additional memory. For larger codebases or enterprise editions, allocate 6 to 8 GB. Set vm.max_map_count to at least 524288 on Linux hosts.
How do I persist SonarQube data in Docker?
Use Docker named volumes or bind mounts for three directories - /opt/sonarqube/data for analysis data, /opt/sonarqube/extensions for installed plugins, and /opt/sonarqube/logs for log files. In Docker Compose, define named volumes and map them to these paths so data survives container restarts and upgrades.
Why does SonarQube fail to start in Docker with a vm.max_map_count error?
SonarQube embeds Elasticsearch, which requires the Linux kernel parameter vm.max_map_count to be at least 524288. On Linux, run 'sudo sysctl -w vm.max_map_count=524288' to set it temporarily or add it to /etc/sysctl.conf for persistence. Docker Desktop on macOS and Windows handles this automatically.
How do I run sonar-scanner with Docker?
Use the official sonar-scanner-cli Docker image. Run 'docker run --rm -e SONAR_HOST_URL=http://host.docker.internal:9000 -e SONAR_TOKEN=your-token -v $(pwd):/usr/src sonarsource/sonar-scanner-cli' from your project root. This avoids installing the scanner locally and ensures a consistent scanner version.
Can I use Docker Compose to run SonarQube with PostgreSQL?
Yes. Create a docker-compose.yml with two services - sonarqube using the sonarqube:lts-community image and db using the postgres:16 image. Set the SONAR_JDBC_URL, SONAR_JDBC_USERNAME, and SONAR_JDBC_PASSWORD environment variables on the SonarQube service to connect to PostgreSQL. Use named volumes for both services to persist data.
How do I upgrade SonarQube in Docker?
Stop the current containers with 'docker compose down', update the image tag in your docker-compose.yml to the new version, pull the new image with 'docker compose pull', and start the containers with 'docker compose up -d'. SonarQube detects database migrations on startup; depending on the version, you may need to confirm them at the /setup page. Always back up your PostgreSQL data volume before upgrading.
How do I put SonarQube behind a reverse proxy with Docker?
Add an Nginx or Traefik container to your Docker Compose file. Configure it to proxy traffic from port 443 to SonarQube's port 9000. Set the sonar.core.serverBaseURL property in SonarQube to your public HTTPS URL. Add SSL certificates using Let's Encrypt with Certbot or mount existing certificates into the proxy container.
What is the difference between SonarQube Docker and SonarQube Cloud?
SonarQube Docker is self-hosted - you run and manage the infrastructure yourself using Docker containers. SonarQube Cloud (formerly SonarCloud) is a fully managed SaaS platform hosted by SonarSource. Cloud requires no infrastructure management, is free for public repositories, and includes PR decoration on all tiers. Docker gives you full data control and works in air-gapped environments.
Are there simpler alternatives to self-hosting SonarQube with Docker?
Yes. SonarQube Cloud offers the same analysis engine as a managed SaaS with no Docker infrastructure needed. CodeAnt AI provides AI-powered code review at $24 to $40 per user per month with zero infrastructure to manage. Both options eliminate the operational overhead of maintaining Docker containers, databases, and server updates.