Sunday, December 26, 2021

Azure DevOps Terraform Integration | How do you integrate Terraform with Azure DevOps | Automate Infrastructure setup using Terraform and Azure DevOps | Remote Store in S3 Bucket

We will learn how to provision resources in the AWS cloud using Terraform and Azure DevOps. We will also learn how to store the Terraform state remotely in an AWS S3 bucket.

We will create an S3 bucket for storing the Terraform state and a DynamoDB table for state-locking capability.

We will create an EC2 instance and an S3 bucket in the AWS cloud using Terraform and Azure DevOps. Look at the diagram that describes the whole flow.


Watch the steps in action on the YouTube channel:
    Pre-requisites:
    • Azure DevOps organization
    • Terraform tasks extension added to Azure Pipelines
    • AWS service connection created in Azure DevOps for Terraform to use
    • Service connection created for connecting to GitHub
    • S3 bucket created for storing the Terraform state
    • DynamoDB table created for providing state-locking capability
    • I have provided my public repo as an example, which you can use.

    Step # 1 - Create S3 Bucket:
    Log in to AWS and go to S3. Click on Create bucket.

    Give the bucket a globally unique name.

    Block all public access and enable bucket versioning as well.

    Enable encryption.


    Step # 2 - Create DynamoDB Table
    Create a new table with LockID (type String) as the partition key.
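    With the bucket and DynamoDB table in place, your Terraform code points at them through an S3 backend block. Below is a minimal sketch of a backend.tf you could keep in the repo; the bucket name, key, region, and table name are placeholders, so substitute your own values:

```shell
# Write a minimal backend.tf (all names below are example placeholders)
cat > backend.tf <<'EOF'
terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket"   # your S3 bucket from Step 1
    key            = "dev/terraform.tfstate"       # path of the state file inside the bucket
    region         = "us-east-1"                   # region where the bucket lives
    dynamodb_table = "terraform-lock"              # your table from Step 2 (LockID partition key)
    encrypt        = true                          # encrypt the state at rest
  }
}
EOF
```

    Once this file is in the repo, terraform init configures the S3 backend, and DynamoDB provides the state lock.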



    Step # 3 - Create Service connection to connect to AWS from Azure DevOps
    Go to Azure DevOps, select your project, and open Project Settings.

    Click Service Connections




    Select AWS for Terraform as the connection type.
    Enter the Access Key, Secret Key, and region code, enter a name for the service connection, and check Grant access permission to all pipelines.


    Click Save.

    Create a service connection for connecting to GitHub as well.

    Click Save.

    Step 4 - Create a new Release Pipeline
    Click on Releases, click New, and choose New release pipeline.


    Select empty job. 

    Click on Add Artifacts


    Choose GitHub, select the GitHub service connection, and select the repo.


    Click on Add a task, search for terraform, and choose the task.


    Add a task, search for Install Terraform, and select the installer task.

    It should show something like this:
    Add another terraform task for init

    Search for terraform and add the task.

    Select AWS from the drop-down, choose init as the command, and add -reconfigure as an additional command argument. Select the AWS service connection and enter the bucket name.


    Add another task for plan and select the right values.
    Enter -out dev-plan as an additional command argument so the plan is saved to a file.


    Add another task for apply by cloning the plan task.
    Enter dev-plan as the additional argument so apply uses the saved plan file.


    Click on Save. 
    Create a release; the job should now be running.





    Log in to AWS --> S3 bucket; you should see the Terraform state info.


    How to destroy all the resources created using Terraform?

    Clone the current infra setup release pipeline.

    Modify the pipeline name.

    Modify the apply task as shown in the diagram:
    enter -destroy as an additional argument.


    Click on Create release; once the release run completes, all the resources are destroyed.


    Thursday, December 23, 2021

    Jenkins Terraform Integration | How do you integrate Terraform with Jenkins | Automate Infrastructure setup using Terraform and Jenkins | Remote Store in S3 Bucket

    We will learn how to provision resources in the AWS cloud using Terraform and Jenkins. We will also learn how to store the Terraform state remotely in an AWS S3 bucket.

    We will create an S3 bucket for storing the Terraform state and a DynamoDB table for locking capability.

    We will create an EC2 instance and an S3 bucket in the AWS cloud using Terraform and Jenkins. Look at the diagram that describes the whole flow.

    Watch these steps in action on the YouTube channel:



    Pre-requisites:
    • S3 bucket created for storing the TF state
    • DynamoDB table created for providing lock capability
    • Jenkins is up and running
    • Terraform is installed in Jenkins
    • Terraform files already created in your SCM
    • Necessary IAM role created with the right policies and attached to the Jenkins EC2 instance; see below for the steps to create the IAM role.
    I have provided my public repo as an example, which you can use. You can fork my repo and start making changes in your own copy.
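    If you want to build the repo yourself instead of forking mine, only a few Terraform files are needed. Below is a hedged sketch of a minimal main.tf; the backend values, region, AMI ID, and bucket names are placeholders, not the ones from my repo:

```shell
# Create a minimal main.tf (all IDs and names are example placeholders)
cat > main.tf <<'EOF'
terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket"  # S3 bucket created in Step 1
    key            = "jenkins/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-lock"             # DynamoDB table created in Step 2
  }
}

provider "aws" {
  region = "us-east-1"
}

# EC2 instance to be provisioned by the pipeline
resource "aws_instance" "demo" {
  ami           = "ami-0123456789abcdef0"  # placeholder AMI ID; use a real one for your region
  instance_type = "t2.micro"
}

# S3 bucket to be provisioned by the pipeline (name must be globally unique)
resource "aws_s3_bucket" "demo" {
  bucket = "my-demo-app-bucket-12345"
}
EOF
```

    Commit a file like this to your SCM and point the Jenkins pipeline at that repo.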

    Step # 1 - Create S3 Bucket:
    Log in to AWS and go to S3. Click on Create bucket.

    Give the bucket a globally unique name.

    Block all public access and enable bucket versioning as well.

    Enable encryption.


    Step # 2 - Create DynamoDB Table
    Create a new table with LockID (type String) as the partition key.



    Step # 3 - Create IAM role to provision EC2 instance in AWS



    Select AWS service and EC2 as the use case, then click Next: Permissions.


    Search for EC2 and choose AmazonEC2FullAccess as the policy, search for S3 and add AmazonS3FullAccess, then search for Dynamo and add AmazonDynamoDBFullAccess.

    Attach the three policies.

    Click Next: Tags, then Next: Review.
    Give the role a name and click on Create role.
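    If you prefer the AWS CLI over the console, the same role can be sketched as below. The trust policy is the standard one that lets EC2 assume a role; the create and attach calls need AWS credentials, so they are shown commented out:

```shell
# Trust policy allowing EC2 instances to assume the role
cat > ec2-trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Run these with valid AWS credentials (role name from this post):
# aws iam create-role --role-name my-ec2-terraform-role \
#     --assume-role-policy-document file://ec2-trust-policy.json
# aws iam attach-role-policy --role-name my-ec2-terraform-role \
#     --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess
# aws iam attach-role-policy --role-name my-ec2-terraform-role \
#     --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
# aws iam attach-role-policy --role-name my-ec2-terraform-role \
#     --policy-arn arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess
```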



    Step 4 - Assign IAM role to EC2 instance

    Go back to the Jenkins EC2 instance: select the instance, then Security, Modify IAM role.


    Choose your IAM role name, my-ec2-terraform-role, and click Save to attach that role to the EC2 instance.




    Step 5 - Create a new Jenkins Pipeline

    Give a name to the pipeline you are creating.



    Step 6 - Add parameters to the pipeline

    Check the checkbox This project is parameterized and choose Choice Parameter.


    Enter action as the name.
    Type apply and destroy as the choices, each on its own line, as shown below.


    Go to Pipeline section

    Add the pipeline code below and modify it per your GitHub repo configuration.

    pipeline {
        agent any

        stages {
            stage('Checkout') {
                steps {
                    // pull the Terraform files from SCM
                    checkout scm
                }
            }

            stage('Terraform init') {
                steps {
                    sh 'terraform init -reconfigure'
                }
            }

            stage('Terraform plan') {
                steps {
                    sh 'terraform plan'
                }
            }

            stage('Terraform action') {
                steps {
                    echo "Terraform action is --> ${params.action}"
                    // runs terraform apply or terraform destroy based on the choice parameter
                    sh "terraform ${params.action} --auto-approve"
                }
            }
        }
    }
    Click on Build with Parameters and choose apply to build the infrastructure, or choose destroy if you would like to destroy the infrastructure you have built.



    Click on Build with Parameters and
    choose apply from the dropdown.
    You should now see the console output for the apply run.



    The pipeline will look like below:


    Login to AWS console


    Log in to the S3 bucket; you should see that the Terraform state info has been added.


    How to destroy all the resources created using Terraform?

    Run the Jenkins pipeline with the destroy option.

    Monday, December 20, 2021

    Install SonarQube using Docker | Install SonarQube using Docker on Ubuntu 22.04 | Install SonarQube using Docker-Compose

    How to setup SonarQube using Docker compose?

    SonarQube is a static code analysis tool. It is an open-source, Java-based tool. SonarQube can be set up using Docker Compose with fewer manual steps.

    What is Docker Compose?
    Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration. Since Docker Compose lets you configure related containers in a single YAML file, you get the same infrastructure-as-code abilities as Kubernetes, but in a simpler system that is better suited to smaller applications that don’t need Kubernetes’ resiliency and scaling.
     
    The purpose of docker-compose is to function like the docker CLI but to issue multiple commands much more quickly. To make use of docker-compose, you encode the commands you were running before into a docker-compose.yml file.
     
    Run docker-compose up and Compose starts and runs your entire app.

    SonarQube Architecture



    SonarQube has three components:
    1. Scanner - contains the scanner and analyzer used to scan application code.
    2. SonarQube server - contains the web server (UI) and the search server.
    3. DB server - used for storing the analysis reports.

    Watch the steps on the YouTube channel:


    Pre-requisites:

    • New Ubuntu EC2 instance up and running, at least t2.medium (4 GB RAM)
    • Port 9000 opened in the security group firewall rule
    • Make sure the kernel settings below are taken care of.

    Log in to the instance where you will be installing SonarQube and run the command below to configure virtual memory permanently for SonarQube to function:
    sudo vi /etc/sysctl.conf

    Add the following lines to the bottom of that file:

    vm.max_map_count=262144
    fs.file-max=65536

    To make sure changes are getting into effect:
    sudo sysctl -p
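    You can verify that the values took effect by reading them back from the /proc filesystem:

```shell
# Read back the kernel settings applied above
cat /proc/sys/vm/max_map_count   # should print 262144 after the change
cat /proc/sys/fs/file-max        # should print 65536 after the change
```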

    Change Host Name to SonarQube
    sudo hostnamectl set-hostname SonarQube

    Perform System update
    sudo apt update

    Install Docker-Compose
    sudo apt install docker-compose -y

    Create docker-compose.yml
    This YAML file has all the configuration for installing both SonarQube and PostgreSQL:
    sudo vi docker-compose.yml 

    version: "3"
    services:
      sonarqube:
        image: sonarqube:community
        restart: unless-stopped
        depends_on:
          - db
        environment:
          SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonar
          SONAR_JDBC_USERNAME: sonar
          SONAR_JDBC_PASSWORD: sonar
        volumes:
          - sonarqube_data:/opt/sonarqube/data
          - sonarqube_extensions:/opt/sonarqube/extensions
          - sonarqube_logs:/opt/sonarqube/logs
        ports:
          - "9000:9000"
      db:
        image: postgres:12
        restart: unless-stopped
        environment:
          POSTGRES_USER: sonar
          POSTGRES_PASSWORD: sonar
        volumes:
          - postgresql:/var/lib/postgresql
          - postgresql_data:/var/lib/postgresql/data
    volumes:
      sonarqube_data:
      sonarqube_extensions:
      sonarqube_logs:
      postgresql:
      postgresql_data:


    Save the file by entering :wq!

    Now execute the compose file using Docker compose command:
    sudo docker-compose up -d 


    Make sure SonarQube is up and running by checking the logs
    sudo docker-compose logs --follow


    Once you see the startup message, that's it. SonarQube has been installed successfully. Press Ctrl+C to exit the logs.
    Now access the SonarQube UI by going to the browser and entering the public DNS name with port 9000.

    Please follow these steps for integrating SonarQube with Jenkins:
    https://www.coachdevops.com/2020/04/how-to-integrate-sonarqube-with-jenkins.html
