Saturday, March 19, 2022

Install Jenkins on Ubuntu 22.04 using Docker Compose | Setup Jenkins on AWS EC2 Ubuntu instance | How to set up Jenkins in an Ubuntu EC2 instance using Docker?

Please follow the steps below to install Jenkins using Docker Compose on an Ubuntu 22.04 instance.

What is Docker Compose?
Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration.
The purpose of docker-compose is to do what the docker CLI does, but to issue multiple commands much more quickly. To use docker-compose, you encode the commands you were running before into a docker-compose.yml file.
Run docker-compose up and Compose starts and runs your entire app.
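For example, a long docker run command can be captured as a Compose service. Here is a minimal illustration (a generic nginx service chosen just for this example, not the Jenkins file used later in this post):

```yaml
# docker-compose.yml — equivalent to:
#   docker run -d --name web -p 80:80 nginx:latest
version: '3.3'
services:
  web:
    image: nginx:latest
    container_name: web
    ports:
      - 80:80
```

With this file in place, docker-compose up -d replaces the whole docker run invocation.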

Change Host Name to Jenkins
sudo hostname Jenkins
(This change lasts until reboot; to make it permanent, you can use sudo hostnamectl set-hostname Jenkins.)

Perform update first
sudo apt update

Now let's install Docker Compose:

Install Docker-Compose
sudo apt-get install docker-compose -y

Add current user to docker group
sudo usermod -aG docker $USER
Log out and log back in (or run newgrp docker) for the group change to take effect.

Create directory
mkdir ~/jenkins

Jenkins Setup

Create docker-compose.yml
This YAML file has all the configuration for installing Jenkins.
sudo vi docker-compose.yml

version: '3.3'
services:
  jenkins:
    image: jenkins/jenkins:lts
    restart: unless-stopped
    privileged: true
    user: root
    container_name: jenkins
    ports:
      - 8080:8080
    volumes:
      # Persist Jenkins data on the host
      - ~/jenkins:/var/jenkins_home
      # Let Jenkins jobs use the host's Docker daemon
      - /var/run/docker.sock:/var/run/docker.sock
      - /usr/local/bin/docker:/usr/local/bin/docker

Now bring up the services defined in the compose file:
sudo docker-compose up -d 

Make sure Jenkins is up and running:
sudo docker-compose logs --follow
The initial admin password also appears in these logs.

How to get Jenkins admin password in another way?
Identify Docker container name

sudo docker ps

Get the admin password by executing the below command:
sudo docker exec -it jenkins cat /var/jenkins_home/secrets/initialAdminPassword

Access Jenkins in web browser

Now go to the AWS console. Click EC2, then click the running instances link. Select the checkbox of the EC2 instance where you installed Jenkins, click Connect, and copy the instance's Public DNS name.

Now go to the browser and enter the public DNS name or public IP address with port 8080.

Unlock Jenkins
You will see the Unlock Jenkins screen. Enter the command below in your Ubuntu console to retrieve the initial admin password:

sudo docker exec -it jenkins cat /var/jenkins_home/secrets/initialAdminPassword

Copy the password and paste it in the browser.
Then click Install suggested plugins.
Next, create a username and password (for a test setup, something simple like admin/admin is fine).
Click Save and Finish, then click Start using Jenkins. You should now see the Jenkins dashboard.

That's it. You have setup Jenkins successfully using Docker compose. 
Please watch the steps in our YouTube channel.

Thursday, March 17, 2022

AWS, Azure Cloud and DevOps Coaching Online Classes - Jan 2023 Schedule

Are you in IT? Tired of your work? Are you not able to make any good progress in your career?

Don't have a job? Looking for a break into IT? Are you interested in learning DevOps?
Did you get laid off from your previous job due to Covid-19?
You are in the right place to kick-start your career in DevOps. DevOps is one of the top and hottest IT skills right now. Almost all employers are struggling to find the right people who can do DevOps and automation work. You could be that person by attending this coaching program.

DevOps Coaching Classes schedules for Jan 2023:

Jan 8th - 09:45 AM to 11:30 AM CST on Saturdays
10:30 AM to 12:30 PM CST on Sundays
Jan 19th - 6:00 PM to 8:00 PM CST on weekdays (Tuesdays/Thursdays)

DevOps Coaching Highlights:

- Comprehensive hands-on knowledge of Git, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Puppet, Docker, AWS IAM, ECR, Docker registry, and the AWS and Azure cloud platforms.

- The coach has 23+ years of professional IT experience, including 8+ years in DevOps/Cloud/Automation.

- Many students already got placed in reputed companies from this coaching program successfully.

- Works as a Sr. DevOps Coach/Architect in one of the top IT services companies in the USA.

- Unique program... less theory, more hands-on lab exercises.
- Resume preparation will be done with candidates personally.
- One-to-one interview coaching.
- Coaching is purely hands-on and 100% job relevant.
- 100% job assistance.

- Coached 1300+ students successfully over the past five years; many of them got placed with large enterprises in the DFW, Charlotte, Houston, Austin, Chicago, Florida, Seattle, Bay Area, Ohio, NJ and NY areas.

To join the coaching classes, please contact the coach below by email or phone:

Contact no #: +1(469)733-5248
Email id:
Contact: Coach

If you live in India, please contact assistant coach Gunal to learn about the program:

Name - Gunal
Email id:
Contact no: +91 87600 02237

Thursday, March 3, 2022

How to store Terraform state file in Azure Storage | How to manage Terraform state in Azure Blob Storage | Terraform Remote state in Azure Blob storage | Terraform backend

One of the amazing features of Terraform is that it tracks the infrastructure you provision. It does this through the means of state. By default, Terraform stores state information locally in a file named terraform.tfstate. This does not work well in a team environment: if any developer wants to make a change, they need to make sure nobody else is updating the state at the same time. You need to use remote storage to store the state file.

With remote state, Terraform writes the state data to a remote data store, which can then be shared between all members of a team. Terraform supports storing state in many ways including the below:

  • Terraform Cloud
  • HashiCorp Consul
  • Amazon S3
  • Azure Blob Storage
  • Google Cloud Storage
  • Alibaba Cloud OSS
  • Artifactory or Nexus 
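Whichever backend you choose, it is configured the same way: a backend block inside the terraform block. As an illustrative sketch, an Amazon S3 backend might look like this (the bucket and key names are made-up examples):

```hcl
terraform {
  backend "s3" {
    bucket = "my-tfstate-bucket"     # example bucket name, not from this tutorial
    key    = "prod/terraform.tfstate" # path of the state object inside the bucket
    region = "us-east-1"
  }
}
```

We will configure the equivalent Azure Blob Storage backend below.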

We will learn how to store the state file in Azure Blob storage. We will create an Azure storage account and a container for it.

Watch the steps on our YouTube channel:



Logging into the Azure Cloud

Log in to the Azure cloud using the Azure CLI:

az login

Enter your Microsoft username and password to log in to Azure.


terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "=2.63.0"
    }
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "demo-rg" {
  name     = "demo-resource-group"
  location = "eastus"
}

terraform init
terraform plan
This will show that one resource will be created.

terraform apply
This will create a local Terraform state file (terraform.tfstate) on your machine.
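The state file itself is plain JSON. An abridged sketch of its top-level structure (state format version 4; the values are illustrative, and real entries carry more fields):

```json
{
  "version": 4,
  "serial": 1,
  "outputs": {},
  "resources": [
    {
      "mode": "managed",
      "type": "azurerm_resource_group",
      "name": "demo-rg"
    }
  ]
}
```

Terraform uses this record to map your configuration to the real resources it manages.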

How to store Terraform state file remotely?

Step # 1 - Configure Azure storage account

Before you use Azure Storage as a backend, you must create a storage account. We will create it using a shell script. The variable values below are examples; the storage account name must be globally unique:

# Example values - adjust as needed
RESOURCE_GROUP_NAME=tfstate
STORAGE_ACCOUNT_NAME=tfstate$RANDOM
CONTAINER_NAME=tfstate

# Create resource group
az group create --name $RESOURCE_GROUP_NAME --location eastus
# Create storage account
az storage account create --resource-group $RESOURCE_GROUP_NAME --name $STORAGE_ACCOUNT_NAME --sku Standard_LRS --encryption-services blob
# Create blob container
az storage container create --name $CONTAINER_NAME --account-name $STORAGE_ACCOUNT_NAME

This should have created the resource group, storage account and container; you can verify them in the Azure portal.

Step # 2 - Configure terraform backend state 

To configure the backend state, you need the following Azure storage information which we created above:

    • resource_group_name: name of the resource group under which all resources will be created.
    • storage_account_name: The name of the Azure Storage account.
    • container_name: The name of the blob container.
    • key: The name of the state store file to be created.
Create file
We need to create a backend configuration file:

terraform {
  backend "azurerm" {
    resource_group_name  = "tfstate"
    storage_account_name = "<storage_acct_name>"
    container_name       = "tfstate"
    key                  = "terraform.tfstate"
  }
}

terraform init --reconfigure

Type yes when prompted to copy the existing state to the new backend.

This should have created a state file called terraform.tfstate in the container inside Azure storage.

You can view the remote state file in the Azure portal under the storage container.

This is how you can store Terraform state information remotely in Azure storage.

Now let's make changes to create more resources:

resource "azurerm_container_registry" "acr" {
  name                = "myacr563123"
  resource_group_name = azurerm_resource_group.demo-rg.name
  location            = azurerm_resource_group.demo-rg.location
  sku                 = "Standard"
  admin_enabled       = false
}

terraform plan

terraform apply --auto-approve

This will update the Terraform state file remotely in the Azure blob container.
