Saturday, December 31, 2022

How to run Ansible playbook from Jenkins pipeline job | Automate EC2 provisioning in AWS using Jenkins and Ansible Playbook | Create new EC2 instance in AWS cloud using Ansible Playbook and Jenkins Pipeline

We will learn how to create a new EC2 instance using an Ansible playbook and automate the provisioning using a Jenkins pipeline. 

Watch Steps in YouTube Channel:

Pre-requisites:

  • Ansible and Boto are installed on the Jenkins instance
  • The Ansible plug-in is installed in Jenkins. 
  • Make sure you create an IAM role with the AmazonEC2FullAccess policy and attach the role to the Jenkins EC2 instance.
  • A playbook for creating the new EC2 instance needs to be created; you can refer to my GitHub repo for a sample
Steps:

Create Ansible playbook for provisioning EC2 instance

(Sample playbook is available in my GitHub Repo, you can use that as a reference)
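As a rough sketch of what such a playbook contains, something like the below can work; the module, region, AMI ID, instance type and key pair name here are assumptions for illustration, so use the playbook in the GitHub repo as the actual reference:

```yaml
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Provision a new EC2 instance
      amazon.aws.ec2_instance:
        name: "ansible-demo"              # assumed instance name
        key_name: "my-keypair"            # assumed existing key pair
        instance_type: t2.micro
        image_id: ami-0c02fb55956c7d316   # assumed Amazon Linux 2 AMI (us-east-1)
        region: us-east-1
        state: running
```

Because the Jenkins EC2 instance has an IAM role attached (see pre-requisites), no AWS access keys need to be hard-coded in the playbook.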

Create Jenkins Pipeline 
pipeline {
    agent any

    stages {
        
        stage ("checkout") {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/akannan1087/myAnsibleInfraRepo']]])
            }
        }
        stage('execute') {
            steps {
                //to suppress warnings when you execute playbook    
                sh "pip install --upgrade requests==2.20.1"
                // execute ansible playbook
                ansiblePlaybook playbook: 'create-EC2.yml'
            }
        }
    }
}

Execute Pipeline


Pipeline Console output


Tuesday, December 27, 2022

Ansible playbook for LAMP Installation on Ubuntu | How to Install LAMP stack using Ansible on Ubuntu 18.04

LAMP Stack comprises the following open-source software applications.

    • Linux – the operating system hosting the applications.
    • Apache – Apache HTTP Server is a free and open-source cross-platform web server.
    • MySQL – an open-source relational database management system.
    • PHP – a programming/scripting language used for developing web applications.
    Watch the steps in YouTube Channel:

    Pre-requisites:
    Steps to setup SSH keys:
    1. Login to the Ansible management server/machine and create SSH keys on the Ansible host machine by executing the below command (if you already have keys created, please skip this step):
    ssh-keygen 

    Press Enter three times; you will now see the keys successfully created.
    2.  Execute the below command on Ansible management node and copy the public key content:
    sudo cat ~/.ssh/id_rsa.pub

    copy the above output.
    3. Now login into the target node where you want to install the LAMP stack and execute the below command to open the file:
    sudo vi /home/ubuntu/.ssh/authorized_keys
    Press Shift+A to append, then Enter for a new line,
        and paste the key in the above file. Please do not delete any existing values in this file.
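    Steps 1-3 above can be sketched as shell commands (the paths are the defaults used in this post; the last, commented command is meant for the target node):

```shell
# On the Ansible management node: generate a key pair if one does not exist.
# -N "" gives an empty passphrase, matching the "press Enter three times" flow.
if [ ! -f "$HOME/.ssh/id_rsa" ]; then
    ssh-keygen -t rsa -N "" -f "$HOME/.ssh/id_rsa"
fi

# Print the public key; this is the content to paste into the target node's file.
cat "$HOME/.ssh/id_rsa.pub"

# On the TARGET node: append the copied key, keeping existing entries intact, e.g.:
# echo 'ssh-rsa AAAA... user@mgmt-node' >> /home/ubuntu/.ssh/authorized_keys
```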

    4. Now go back to the Ansible management node and edit the /etc/ansible/hosts file to include the node where you will be installing the software. Make sure you add the public or private IP address of the target node as shown below:
    sudo vi /etc/ansible/hosts
    [My_Group]  
    xx.xx.xx.xx ansible_ssh_user=ubuntu ansible_ssh_private_key_file=~/.ssh/id_rsa  ansible_python_interpreter=/usr/bin/python3

    Ansible playbook for installing LAMP(Linux Apache MySQL PHP) stack on Ubuntu

    sudo vi installLAMP.yml
    ---
    - hosts: My_Group
      tasks:
        - name: Task # 1 - Update APT package manager repositories cache
          become: true
          apt:
            update_cache: yes
        - name: Task # 2 - Install LAMP stack using Ansible
          become: yes
          apt:
            name: "{{ packages }}"
            state: present
          vars:
            packages:
               - apache2
               - mysql-server
               - php
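    On Ubuntu, apt normally starts these services right after installation; if you want the playbook to assert that explicitly, a task like the following sketch (the service names are the Ubuntu package defaults) could be appended under the same play:

```yaml
        - name: Task # 3 - Ensure Apache and MySQL services are started and enabled
          become: yes
          service:
            name: "{{ item }}"
            state: started
            enabled: yes
          loop:
            - apache2
            - mysql
```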

    sudo ansible-playbook installLAMP.yml


    This is the execution result of the playbook.

    Now go to the browser and use the target node's DNS name to confirm that Apache is installed. Make sure port 80 is open in the security/firewall rules.


    Now login to the target EC2 instance and type the below commands to verify the PHP and MySQL versions:

    php --version

    mysql --version

    Ansible playbook for Tomcat Installation on Ubuntu | Ansible Tomcat Playbook on Ubuntu 18.04/20.04

    Ansible Playbook for installing Tomcat on Ubuntu 18.04

    Pre-requisites:
    Steps to setup SSH keys:
    1. Login to the Ansible management server/machine and create SSH keys on the Ansible host machine by executing the below command (if you already have keys created, please skip this step):
    ssh-keygen 

    Press Enter three times; you will now see the keys successfully created.
    2.  Execute the below command on Ansible management node and copy the public key content:
    sudo cat ~/.ssh/id_rsa.pub

    copy the above output.
    3. Now login into the target node where you want to install Tomcat and execute the below command to open the file:
    sudo vi /home/ubuntu/.ssh/authorized_keys
    Press Shift+A to append, then Enter for a new line,
        and paste the key in the above file. Please do not delete any existing values in this file.

    4. Now go back to the Ansible management node and edit the /etc/ansible/hosts file to include the node where you will be installing the software. Make sure you add the public or private IP address of the target node as shown below:
    sudo vi /etc/ansible/hosts
    [My_Group]  
    xx.xx.xx.xx ansible_ssh_user=ubuntu ansible_ssh_private_key_file=~/.ssh/id_rsa  ansible_python_interpreter=/usr/bin/python3

    5. Create a Playbook for setting up Tomcat 9

    sudo vi installTomcat.yml

    ---
    - hosts: My_Group
      tasks:
        - name: Task # 1 - Update APT package manager repositories cache
          become: true
          apt:
            update_cache: yes
        - name: Task # 2 - Install Tomcat using Ansible
          become: yes
          apt:
            name: "{{ packages }}"
            state: present
          vars:
            packages:
               - tomcat9
               - tomcat9-examples
               - tomcat9-docs
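    The tomcat9 package starts the service on installation; to make the playbook assert that explicitly, a task like this sketch could be appended under the same play:

```yaml
        - name: Task # 3 - Ensure tomcat9 service is started and enabled
          become: yes
          service:
            name: tomcat9
            state: started
            enabled: yes
```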

    6. Execute Playbook:

    sudo ansible-playbook installTomcat.yml
    This is the execution result of the Ansible playbook.


    Now access Tomcat on port 8080 on the target machine where you installed it.



    Monday, December 5, 2022

    Online DevOps Coaching on AWS and Azure Cloud by Coach AK | Feb 2023 Schedule

    Are you in IT? Tired of your work? Are you not able to make any good progress in your career? 

    Are you not having a job? Looking for a break in IT? Are you interested in learning DevOps? 
     
    Did you get laid off from your previous job due to Covid-19?
     
    If the answer is YES to any of the above questions, you are in the right place to kick-start your career in DevOps. DevOps is one of the top and hottest IT skills right now. Currently, almost all employers are struggling to find the right resources for their teams who can do DevOps and automation work. You could be that person by attending this coaching program.

    DevOps Coaching Classes schedules for Feb 2023(currently enrollment is going on)

    Date       Time                                      Type      When?
    Feb 11th   11:35 AM to 01:30 PM CST on Saturdays &
               02:00 PM to 04:00 PM CST on Sundays       Weekends  Sat/Sun
    Feb 27th   6:00 to 8:00 PM CST                       Weekdays  Mondays/Wednesdays

    DevOps Coaching Highlights:

    - Comprehensive hands-on knowledge of Git, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Puppet, Docker, Kubernetes, AWS IAM, ECR, Docker registry, and the AWS and Azure cloud platforms.

    - Coach has 23+ years of professional IT experience, 9+ years in DevOps/Cloud/Automation.

    - Many students have already been placed in reputed companies through this coaching program.

    - Working as a Sr. DevOps Coach/Architect at one of the top IT services companies in the USA.

    - Unique program...less theory, more hands-on lab exercises.

    - Resume preparation will be done with candidates personally.

    - One-to-one interview coaching.

    - Coaching is purely hands-on and 101% job relevant.

    - 100% job assistance.

    - Coached 1300+ students successfully over the past five years; many of my students have been placed with large enterprises in the DFW, Charlotte, Houston, Austin, Chicago, Florida, Seattle, Bay Area, Ohio, NJ and NY areas.

    To join DevOps Coaching classes, contact coach below:

    Contact no # : +1(469)733-5248
    Email id: devops.coaching@gmail.com
    Contact: Coach AK

    If you live in India, please contact assistant coach Gunal to learn about the program:

    Name - Gunal
    Email id: gunal.j0907@gmail.com
    Contact no: +91 87600 02237

    Thursday, November 17, 2022

    How to Deploy Springboot App into AKS cluster using Jenkins Pipeline and Kubectl CLI Plug-in | Deploy Microservices into AKS cluster using Jenkins Pipeline

    We are going to learn how to automate the build and deployment of a Springboot microservices app into an Azure Kubernetes Service (AKS) cluster using a Jenkins pipeline. 

    Sample springboot App Code:

    I have created a sample Springboot App setup in GitHub. Click here to access code base in GitHub. 

    Jenkins pipeline will:

    - Automate maven build(jar) using Jenkins
    - Automate Docker image creation
    - Automate Docker image upload into Azure container registry
    - Automate Deployments to Azure Kubernetes Cluster

    Watch Steps in YouTube Channel:

    Pre-requisites:

    1. An AKS cluster needs to be up and running. You can create the AKS cluster using any one of the below options:

    2. Jenkins instance is set up and running.
    3. Make sure the Docker, Docker Pipeline and Kubectl CLI plug-ins are installed in Jenkins.

    4. Install Docker in Jenkins and make sure Jenkins has proper permissions to perform Docker builds.
    5. Install Kubectl on the Jenkins instance.
    6. ACR is also set up in Azure cloud. 
    7. Dockerfile is created along with the application source code for the Springboot app.
    8. Modify the K8S manifest file with your ACR and image name for the AKS deployment.
    9. Install Azure CLI on your local machine. (We will be creating the AKS cluster from our local machine.)
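    As a rough sketch of what the jenkins-aks-deploy-from-acr.yaml manifest contains (the names, labels, replica count and ports below are assumptions; the image must point at your ACR repo):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: springboot-app            # assumed name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: springboot-app
  template:
    metadata:
      labels:
        app: springboot-app
    spec:
      containers:
        - name: springboot-app
          image: myacrrepo3210.azurecr.io/myacrrepo3210:latest   # your ACR/image
          ports:
            - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: springboot-app-svc        # assumed name
spec:
  type: LoadBalancer
  selector:
    app: springboot-app
  ports:
    - port: 80
      targetPort: 8080
```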

    The code for this video is here:
    Make the necessary changes in the jenkins-aks-deploy-from-acr.yaml file after you fork the repo into your account.

    Step # 1 - Create Credentials to connect to ACR from Jenkins

    Go to the Azure Portal console and open your container registry.
    Settings --> Access keys
    Get the username and password. 
    Go to Jenkins -> Manage Jenkins and create credentials.


    Enter ID as ACR and enter some text for description and Save

    Step #2 - Create Credentials for connecting to AKS cluster using Kubeconfig

    Go to Jenkins UI, click on Credentials -->


    Click on Global credentials
    Click on Add Credentials

    Choose "Secret file" from the drop-down.

    Run the below command; you should see the nodes running in the AKS cluster:

    kubectl get nodes


    Execute the below command to get the kubeconfig info and copy the entire content of the file:
    cat ~/.kube/config




    Open your text editor or notepad, copy and paste the entire content and save in a file.
    We will upload this file.

    Enter ID as K8S and choose File and upload the file and save.


    Step # 3 - Create a pipeline in Jenkins
    Create a new pipeline job.

    Step # 4 - Copy the pipeline code from below
    Make sure you change the values (registry name, credentials ID, registry URL and repo URL) as per your settings:

    pipeline {
        agent any
        tools {
            maven 'Maven3'
        }
        environment {
            // once you create ACR in Azure cloud, use that here
            registryName = "myacrrepo3210"
            // update your credentials ID after creating credentials for connecting to ACR
            registryCredential = 'ACR'
            dockerImage = ''
            registryUrl = 'myacrrepo3210.azurecr.io'
        }

        stages {
            stage('checkout') {
                steps {
                    checkout([$class: 'GitSCM', branches: [[name: '*/main']], extensions: [], userRemoteConfigs: [[url: 'check_out_from_your_repo_after_forking_my_repo']]])
                }
            }

            stage('Build') {
                steps {
                    sh 'mvn clean install'
                }
            }

            stage('Build Docker image') {
                steps {
                    script {
                        dockerImage = docker.build registryName
                    }
                }
            }

            // Uploading Docker image into ACR
            stage('Upload Image to ACR') {
                steps {
                    script {
                        docker.withRegistry("http://${registryUrl}", registryCredential) {
                            dockerImage.push()
                        }
                    }
                }
            }

            stage('K8S Deploy') {
                steps {
                    script {
                        withKubeConfig([credentialsId: 'K8S', serverUrl: '']) {
                            sh 'kubectl apply -f jenkins-aks-deploy-from-acr.yaml'
                        }
                    }
                }
            }
        }
    }

    Step # 5 - Build the pipeline


    Step # 6 - Verify deployments to AKS

    kubectl get pods

    kubectl get services

    Step # 7 - Access the Springboot app deployed in the AKS cluster
    Once the deployment is successful, go to the browser and enter the load balancer URL from the service output above.

    You should see page like below:


    Clean up the Cluster:

    To avoid charges from Azure, you should clean up unneeded resources. When the cluster is no longer needed, use the az group delete command to remove the resource group, container service, and all related resources. 

    az group delete --name myResourceGroup --yes --no-wait

    Saturday, November 5, 2022

    AWS, Azure Cloud and DevOps Coaching Online Classes | Jan 2023 Schedule

    Are you in IT? Tired of your work? Are you not able to make any good progress in your career? 

    Are you not having a job? Looking for a break in IT? Are you interested in learning DevOps? 
     
    Did you get laid off from your previous job due to Covid-19?
     
    If the answer is YES to any of the above questions, you are in the right place to kick-start your career in DevOps. DevOps is one of the top and hottest IT skills right now. Currently, almost all employers are struggling to find the right resources for their teams who can do DevOps and automation work. You could be that person by attending this coaching program.

    DevOps Coaching Classes schedules for Jan 2023(currently enrollment is going on)

    Date       Time                                      Type      When?
    Jan 26th   6:00 to 8:00 PM CST                       Weekdays  Tuesdays/Thursdays
    Jan 29th   11:35 AM to 01:30 PM CST on Saturdays &
               02:00 PM to 04:00 PM CST on Sundays       Weekends  Sat/Sun


    DevOps Coaching Highlights:

    - Comprehensive hands-on knowledge of Git, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Puppet, Docker, AWS IAM, ECR, Docker registry, and the AWS and Azure cloud platforms.

    - Coach has 22+ years of professional IT experience, 8+ years in DevOps/Cloud/Automation.

    - Many students have already been placed in reputed companies through this coaching program.

    - Working as a Sr. DevOps Coach/Architect at one of the top IT services companies in the USA.

    - Unique program...less theory, more hands-on lab exercises.

    - Resume preparation will be done with candidates personally.

    - One-to-one interview coaching.

    - Coaching is purely hands-on and 101% job relevant.

    - 100% job assistance.

    - Coached 1400+ students successfully over the past five years; many of my students have been placed with large enterprises in the DFW, Charlotte, Houston, Austin, Chicago, Florida, Seattle, Bay Area, Ohio, NJ and NY areas.

    To join DevOps Coaching classes, contact coach below:

    Contact no # : +1(469)733-5248
    Email id: devops.coaching@gmail.com
    Contact: Coach AK

    If you live in India, please contact assistant coach Gunal to learn about the program:

    Name - Gunal
    Email id: gunal.j0907@gmail.com
    Contact no: +91 87600 02237

    How to Enable Web hooks in Azure Pipeline in Azure DevOps | Enable Web hooks in Azure Pipeline in Azure DevOps | Enable Automate Build in ADO

    Webhooks allow developers to trigger jobs in a CI server (such as Jenkins or Azure DevOps) for every code change in SCM. In this article, we will learn how to trigger Azure Pipeline build jobs instantly for every code change in SCM.




    Pre-requisites:
    1. Azure Build pipeline is already configured. If you don't know how to create an Azure build pipeline, click on this link.
    2. SCM repo has been set up, either in GitHub, Bitbucket or any other SCM.

    Watch Steps in YouTube

    Steps to Enable Webhooks in Azure Build Pipeline

    Go to Azure DevOps project dash board.

    Go to Pipelines


    Click on Pipelines

    Click on Edit


    Click on Triggers tab, Click Continuous Integration checkbox to enable Webhooks.


    Click on Save. You don't have to Queue the job.

    Now go to your SCM and make a code change; you will see the pipeline job trigger immediately.
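    If your pipeline is defined in YAML instead of the classic editor, the same CI trigger can be declared in the azure-pipelines.yml file itself; a minimal sketch (the branch name is an assumption):

```yaml
trigger:
  branches:
    include:
      - main
```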

    Friday, November 4, 2022

    How to solve "No hosted parallelism has been purchased or granted" in Azure DevOps Pipeline | Azure DevOps Pipeline Error Resolution

     

    Root cause and Fix:

    Microsoft has temporarily disabled the free grant of parallel jobs for public projects and for certain private projects in new organizations. However, you can request this grant by submitting a ticket. Use the below URL to submit a request for increased parallelism in Azure DevOps. 

    Monday, October 10, 2022

    CICD Process Flow Diagram | Implement CICD using Jenkins and Other DevOps tools

    CICD Process Flow Diagram - Implement CICD using Jenkins


    What is Continuous Integration?

    Continuous integration is a DevOps software development practice where developers regularly merge their code changes into a central repository, after which automated builds and tests are run.

    The key goals of continuous integration are to find and address bugs quicker, improve software quality, and reduce the time it takes to validate and release new software updates.

    Jenkins is a popular continuous integration tool. Jenkins can integrate with other tools using plug-ins.

    How does Continuous Integration Work?

    Developers frequently commit to a shared repository using a version control system such as Git. Prior to each commit, developers may choose to run local unit tests on their code as an extra verification layer before integrating. A continuous integration service automatically builds and runs unit tests on the new code changes to immediately surface any errors.

    Benefits of Continuous Integration
    • Improve developer productivity 
    • Find bugs early in the software development stage
    • Deliver products to the marketplace sooner
    • Improve the feedback loop
    What is Continuous Delivery?

    Continuous delivery is a software development practice where code changes are automatically prepared for a release to production. Continuous delivery is the next extension of continuous integration. The delivery phase is responsible for packaging the artifact to be delivered to end users; this phase runs automated build tools to generate that artifact.

    Benefits of Continuous Delivery
    • Automate the Software Release Process
    • Improve Developer Productivity
    • Find bugs early in the software development stage
    • Deliver updates faster

    Tuesday, October 4, 2022

    How to Recover SonarQube Admin password | How to unlock SonarQube admin password in PostgreSQL

    Let's say you have set up SonarQube using Docker or Docker Compose and you have forgotten the admin password for SonarQube. This article helps you reset/recover the admin password: if you changed and then lost the admin password, you can reset it using the following steps.

    Watch Steps in YouTube channel:


    Pre-requisites:

    As we have configured SonarQube using Docker Compose, we need to login to PostgreSQL running inside the postgres Docker container and execute an update command to reset the password back to the default.

    Step 1: Login into PostgreSQL docker container

    Type the below command to see the list of containers running on your EC2 instance:

    sudo docker ps

    Copy the container ID from the above output. 

    Now login into the PostgreSQL docker container:

    docker exec -it <container_id> /bin/bash

    Step 2:  Connect to PostgreSQL database by executing below command:

    psql -p 5432 -d sonarqube -U sonar -h <container_id>

    Now enter the password for the sonarqube database.

    From my lab exercise, the password for the sonar user is admin123.

    Make sure the prompt shows sonarqube, which is your database schema inside the PostgreSQL db.

    Step 3: Execute the below query to change the admin password back to the default, which is also admin:

    update users set crypted_password='100000$t2h8AtNs1AlCHuLobDjHQTn9XppwTIx88UjqUm4s8RsfTuXQHSd/fpFexAnewwPsO6jGFQUv/24DnO55hY6Xew==', salt='k9x9eN127/3e/hf38iNiKwVfaVk=', hash_method='PBKDF2', reset_password='true', user_local='true' where login='admin';

    Step 4: Login to the SonarQube UI as admin/admin.

    Now it will immediately ask you to change the default admin password to something else:

    That's it! That is how you recover SonarQube admin password.

    References:

    https://docs.sonarqube.org/latest/instance-administration/security/
