Integrate Bitbucket Pipelines with SonarCloud

Suraj Batuwana
7 min read · Apr 3, 2024

Bitbucket Pipelines is an essential tool for modern software development, facilitating continuous integration and deployment (CI/CD) workflows. These pipelines automate the process of building, testing, and deploying code changes, ensuring a smooth and efficient development lifecycle. Let’s delve into a hypothetical scenario where a software team utilizes Bitbucket Pipelines to streamline their workflow.

Pipeline Configuration:

image: node:14.15.1

pipelines:
  branches:
    master:
      - step:
          name: Build and Test
          script:
            - npm install
            - npm run build
            - npm test
      - step:
          name: Deploy to Production
          script:
            - npm run deploy

Explanation:

Build and Test: Upon pushing changes to the master branch, Bitbucket triggers the first step of the pipeline. This step installs dependencies, builds the project, and runs automated tests to ensure code quality and functionality.

Deploy to Production: Assuming the build and tests are successful, the pipeline proceeds to deploy the changes to the production environment. This step could involve various deployment strategies, such as rolling updates or blue-green deployments, to minimize downtime and ensure a seamless release.
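If you want Bitbucket to track releases or require an approval before shipping, the deploy step can also be tied to a deployment environment and given a manual trigger. A minimal sketch of just that step (the environment name and the manual trigger are assumptions for illustration, not part of the configuration above):

      - step:
          name: Deploy to Production
          deployment: production   # maps the step to Bitbucket's Production deployment environment
          trigger: manual          # optional: pause the pipeline until someone approves the release
          script:
            - npm run deploy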

Benefits:

Automation: Bitbucket pipelines automate repetitive tasks, reducing manual intervention and potential errors. Developers can focus on writing code rather than managing deployments manually.

Consistency: By defining the pipeline stages in code, teams ensure consistency across environments. Everyone follows the same deployment process, mitigating configuration drift and environment inconsistencies.

Faster Feedback: Pipelines provide instant feedback on code changes. If a build or test fails, developers are immediately notified, enabling them to address issues promptly and maintain a high level of code quality.

Scalability: As the project grows, Bitbucket pipelines scale effortlessly. Whether there are ten or a hundred developers working on the project, the pipeline adapts to the increased workload, ensuring efficient CI/CD processes.

In conclusion, Bitbucket pipelines revolutionize the software development lifecycle by automating repetitive tasks, ensuring consistency, providing fast feedback, and enabling scalable CI/CD workflows. By incorporating pipelines into their workflow, development teams can deliver high-quality software faster and more reliably.

SonarCloud

Integrating Bitbucket Pipelines with SonarCloud can significantly enhance code quality by performing static code analysis on your project’s codebase. SonarCloud provides comprehensive insights into code quality, security vulnerabilities, and code smells, enabling teams to identify and address issues early in the development process. Here’s how you can seamlessly integrate Bitbucket Pipelines with SonarCloud:

1. Set up SonarCloud:

  • Begin by signing up for a SonarCloud account if you haven’t already. You can log in using your Bitbucket credentials.
  • Create a new project in SonarCloud and generate an authentication token. This token will be used to authenticate the analysis in your Bitbucket pipelines.

2. Configure Bitbucket Pipelines:

  • Navigate to your Bitbucket repository and create or edit your bitbucket-pipelines.yml file to include SonarCloud analysis.
  • Add the necessary environment variables to securely pass the SonarCloud token and project key to your pipeline. You can store these variables securely in the Bitbucket Pipelines settings.

Example Bitbucket Pipelines Configuration:

image: node:14.15.1

pipelines:
  branches:
    master:
      - step:
          name: Build and Test
          script:
            - npm install
            - npm run build
            - npm test
            # SonarCloud Analysis
            - pipe: sonarsource/sonarcloud-scan:1.2.0
              variables:
                SONAR_TOKEN: $SONAR_TOKEN
                EXTRA_ARGS: '-Dsonar.projectKey=your_project_key'

In this example, the SonarCloud analysis is integrated as a step in the Bitbucket pipeline. The SONAR_TOKEN environment variable securely passes the authentication token, and the EXTRA_ARGS parameter specifies the project key.

3. Run SonarCloud Analysis:

  • With the pipeline configured, push your changes to the repository’s master branch to trigger the pipeline.
  • Bitbucket Pipelines will execute the build and tests, and subsequently run the SonarCloud analysis.
  • The analysis results are then sent to SonarCloud for processing.

4. Review SonarCloud Results:

  • Once the analysis is complete, navigate to your SonarCloud project dashboard to view the results.
  • SonarCloud provides detailed reports on code quality, including metrics such as code duplication, code coverage, security vulnerabilities, and code smells.
  • Use these insights to identify areas for improvement and prioritize code refactoring or security fixes.

Benefits of Integrating Bitbucket Pipelines with SonarCloud:

  • Automated Code Analysis: SonarCloud seamlessly integrates with Bitbucket Pipelines to automate static code analysis, ensuring consistent code quality checks with every code change.
  • Early Issue Detection: By analyzing code as part of the CI/CD process, teams can identify and address code quality issues early in the development lifecycle, minimizing technical debt and improving overall code maintainability.
  • Actionable Insights: SonarCloud provides actionable insights into code quality, security vulnerabilities, and best practices, empowering teams to make informed decisions and continuously improve their codebase.
  • Integration Flexibility: Bitbucket Pipelines and SonarCloud offer flexible integration options, allowing teams to customize analysis settings and incorporate additional quality gates or checks as needed.
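One such additional check is SonarSource’s quality gate pipe, which fails the pipeline when the project does not pass its SonarCloud quality gate. A minimal sketch, added after the scan pipe in the same script (the pipe version shown is the one used in the sample pipeline later in this article):

            # fail the pipeline if the SonarCloud quality gate is not met
            - pipe: sonarsource/sonarcloud-quality-gate:0.1.6
              variables:
                SONAR_TOKEN: $SONAR_TOKEN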

In summary, integrating Bitbucket Pipelines with SonarCloud enhances code quality by automating static code analysis and providing actionable insights into code issues. By leveraging this integration, teams can ensure consistent code quality standards and deliver high-quality software with confidence.

How to handle “The bridge server is unresponsive” and “Analysis of JS/TS files failed”

“The bridge server is unresponsive” is a common issue encountered when using SonarQube or SonarCloud integrations in Bitbucket Pipelines. The message comes from the scanner’s JavaScript/TypeScript analyzer, whose local Node.js “bridge” process has stopped responding, often because it runs out of memory in the pipeline container. It can also surface when the connection between the pipeline and the Sonar server is interrupted or when there are issues with the Sonar server itself. Here’s how you can handle this issue:

1. Check SonarQube/SonarCloud status:

Before troubleshooting further, ensure that SonarQube or SonarCloud services are up and running. Check the status of the Sonar server by accessing its web interface or respective status pages.

2. Verify Connection Configuration:

  • Review the configuration in your Bitbucket Pipelines script (bitbucket-pipelines.yml) to ensure that the Sonar server URL, authentication credentials, and other settings are correct.
  • Make sure that the Sonar server URL specified in the pipeline configuration is accessible from the pipeline environment.
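If you are running a self-hosted SonarQube server rather than SonarCloud, the server URL is normally passed to the SonarQube scan pipe alongside the token. A minimal sketch, assuming SONAR_HOST_URL and SONAR_TOKEN are defined as secured repository variables (check the pipe’s documentation for the current version number):

            - pipe: sonarsource/sonarqube-scan:1.0.0
              variables:
                SONAR_HOST_URL: $SONAR_HOST_URL   # must be reachable from the pipeline environment
                SONAR_TOKEN: $SONAR_TOKEN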

3. Retry the connection:

  • Sometimes, the “bridge server is unresponsive” error could be transient. Retry the sonar analysis step in the Bitbucket pipeline to see if the issue resolves itself.

4. Check network connectivity:

  • Verify that there are no network issues preventing the Bitbucket Pipelines environment from connecting to the Sonar server. Check for firewall rules, proxy configurations, or network restrictions that might block the connection.
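A quick way to rule out connectivity problems is to probe the Sonar host from inside a pipeline step before the analysis runs. A minimal sketch (the URL is an example; substitute your own SonarQube server address if you are not using SonarCloud):

      - step:
          name: Check Sonar server reachability
          script:
            # simple reachability probe; fails the step if the host cannot be reached
            - curl -sSfI https://sonarcloud.io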

5. Review SonarQube/SonarCloud Logs:

  • Check the logs on the Sonar server to identify any errors or issues that might be causing it to become unresponsive. Look for relevant error messages or stack traces that could provide clues about the root cause.
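It also helps to make the scanner itself more talkative so the pipeline log shows what the analysis is doing. A minimal sketch using the verbosity flags that also appear in the sample pipeline later in this article:

            - pipe: sonarsource/sonarcloud-scan:1.2.0
              variables:
                SONAR_TOKEN: $SONAR_TOKEN
                EXTRA_ARGS: '-Dsonar.verbose=true -Dsonar.log.level=TRACE'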

6. Upgrade SonarQube/SonarCloud:

  • Ensure that you are using the latest version of SonarQube or SonarCloud. Sometimes, issues with connectivity or stability are addressed in newer versions of the software.
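For the Bitbucket integration, that usually means bumping the pipe version in bitbucket-pipelines.yml. A minimal sketch (the version numbers shown are the ones used elsewhere in this article; check the pipe’s repository for the latest release):

            - pipe: sonarsource/sonarcloud-scan:2.0.0   # was 1.2.0; newer pipe versions typically bundle newer scanner releases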

7. Contact Support:

  • If you’ve exhausted all troubleshooting steps and are still experiencing the issue, reach out to SonarQube or SonarCloud support for assistance. Provide them with relevant details such as error messages, logs, and configuration settings for faster resolution.

8. Temporary Workaround:

  • As a temporary workaround, you can consider disabling the Sonar analysis step in the Bitbucket Pipelines configuration until the issue with the Sonar server is resolved. This will allow other pipeline steps to continue running without interruption.
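Commenting the pipe out of the script keeps the rest of the step intact while the analysis is disabled. A sketch based on the earlier example configuration:

          script:
            - npm install
            - npm run build
            - npm test
            # SonarCloud analysis temporarily disabled while the server issue is investigated
            # - pipe: sonarsource/sonarcloud-scan:1.2.0
            #   variables:
            #     SONAR_TOKEN: $SONAR_TOKEN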

By following these steps, you can effectively handle the “bridge server is unresponsive” and “Analysis of JS/TS files failed” errors when integrating SonarQube or SonarCloud with Bitbucket Pipelines, ensuring smooth and reliable code analysis within your CI/CD workflow. In many cases the decisive fix is simply giving the analysis more memory, which is what the sample pipeline below does with a 2x step size, a larger Docker service, and the -Dsonar.javascript.node.maxspace setting.

Sample Bitbucket pipeline

image: maven:3.3.9

definitions:
  caches:
    sonar: ~/.sonar
  services:
    docker:
      memory: 3072
    docker-4g:
      memory: 2048
      type: docker
  steps:
    - step: &build-step
        name: SonarQube analysis
        size: 2x
        max-time: 40
        services:
          - docker-4g
        caches:
          - sonar
        script:
          - pipe: sonarsource/sonarcloud-scan:2.0.0
            variables:
              SONAR_TOKEN: $SONAR_TOKEN
              EXTRA_ARGS: '-Dsonar.log.level=TRACE -Dsonar.verbose=true -Dsonar.javascript.node.maxspace=2048 -Dsonar.sourceEncoding=UTF-8 -Dsonar.javascript.maxFileSize=12000 -Dsonar.javascript.detectBundles=false -Dsonar.javascript.node.debugMemory=true -Dsonar.exclusions=node_modules/**/* -Dsonar.test.inclusions=**/*.test.*,**/*.spec.*,**.int.* -Dsonar.cpd.exclusions=**'
          - pipe: sonarsource/sonarcloud-quality-gate:0.1.6
            variables:
              SONAR_TOKEN: $SONAR_TOKEN

clone:
  depth: full

pipelines:
  branches:
    master:
      - step: *build-step
    develop:
      - step: *build-step

If you need to use branch patterns, you can write them like this:

pipelines:
  branches:
    '*':              # for any branch
      - step: *build-step
    'features/*':     # for any branch that starts with features/
      - step: *build-step
    'bugfixs/*':      # for any branch that starts with bugfixs/
      - step: *build-step
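Pull request pipelines can be declared in the same way if you also want the analysis to run on pull requests. A minimal sketch, reusing the same *build-step anchor (assuming your SonarCloud project is set up for pull request analysis):

pipelines:
  pull-requests:
    '**':             # any pull request, regardless of source branch
      - step: *build-step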

The above Bitbucket Pipeline configuration is designed to perform SonarQube analysis on a Maven project using a Docker container. Below is a breakdown of the configuration:

1. Docker Image:

  • The pipeline is using the Maven 3.3.9 Docker image as the base image for running the pipeline steps.

2. Service Definitions:

  • Two Docker services are defined:
  • docker, with a memory limit of 3072 MB.
  • docker-4g, with a memory limit of 2048 MB, which is the service used by the SonarQube analysis step.
  • Both memory limits can be adjusted to suit your project.

3. Pipeline Steps:

  • A single step named “SonarQube analysis” is defined using the YAML anchor (&build-step) and reference (*build-step) syntax.
  • This step uses a 2x size and a maximum execution time of 40 minutes.
  • It specifies the docker-4g service for execution.
  • The script for this step involves two pipe commands:
  • The first pipe command (sonarsource/sonarcloud-scan:2.0.0) performs the SonarQube analysis.
  • The second pipe command (sonarsource/sonarcloud-quality-gate:0.1.6) checks the quality gate status.
  • The Sonar token (SONAR_TOKEN) is passed to both pipes, and additional arguments for configuring the SonarQube analysis (EXTRA_ARGS) are passed to the scan pipe.

4. Caches:

  • A cache named sonar is defined to store SonarQube analysis data in the ~/.sonar directory for reuse in future pipeline runs.

5. Clone Depth:

  • Full clone depth is specified to ensure that the entire Git history is cloned for each branch.

6. Branches:

  • Two branches (master and develop) are configured to trigger the pipeline. Both branches execute the same SonarQube analysis step defined in the build-step anchor.

In summary, this Bitbucket Pipeline configuration runs SonarQube analysis on Maven projects for both the master and develop branches using a Docker container with specified memory limits. It ensures code quality and compliance with quality gates as part of the CI/CD process.

Suraj Batuwana

Technology Evangelist and Technical Blogger with multidisciplinary skills and experience across the full spectrum of design, architecture, and development.