Saturday, December 23, 2023

DevOps Bootcamp Mar 2024 Schedule | DevOps & AWS Azure Cloud Coaching by Coach AK | DevOps and Cloud Computing Online Classes |

Demand for DevOps skills in the IT market is expected to grow by 35% by 2024. Getting a DevOps education now is a great investment in your future, and it will pay off fast!

You are in the right place to kick-start your career in DevOps. DevOps is one of the hottest IT skills right now. Almost all employers are struggling to find the right people for their teams who can do DevOps and automation work. You could be that person by attending this coaching program.



DevOps Coaching schedule Feb/Mar 2024 (promotions are available, please contact Amy)
Mar 09th - Weekends (Saturdays/Sundays): 09:45 AM to 11:25 AM CST on Saturdays; 10:35 AM to 12:30 PM CST on Sundays
Mar 11th - Weekdays (Mondays/Wednesdays): 6:00 PM to 8:00 PM CST

DevOps Coaching Highlights:
Comprehensive hands-on knowledge of Git, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Docker, Docker registry, Kubernetes, Helm, Prometheus, and the AWS and Azure cloud platforms.

To join DevOps Coaching classes, please contact Amy below:
Contact no# : +1(940) 344-5011
WhatsApp #: +1 (940) 344-5011
Email id: contact.devopscoaching@gmail.com
Contact Name: Amy

Tuesday, December 19, 2023

How to trigger a Jenkins job from another Jenkins job | Jenkins job Integrating another Jenkins Job | Jenkins Pipeline job triggering another Jenkins Job

A Jenkins job can be triggered in many different ways. This article provides steps to trigger a Jenkins job from another Jenkins job.


Pre-requisites:

Scenario #1 (post-build) - How to trigger a Jenkins job from another freestyle job?

1. Log in to the Jenkins instance.
2. Open any existing freestyle build job.
3. Click on Configure.



4. Go to Post-build Actions.

5. Add a post-build action --> click on Build other projects.

6. Select the job (project) that you want to trigger by typing its name, and also check Trigger only if build is stable.


7. Save the job.
8. Build the job now. Once the current job finishes, it will trigger the next job immediately.
Check the console output of the current job; you will see that it triggered the second job.

9. Go to the second job.
Check its console output. You will see that the second job was triggered by the first build job.
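If you manage freestyle jobs as code with the Job DSL plugin (an assumption -- the steps above use the UI), the same post-build trigger can be sketched as follows; 'firstJob' and 'mySecondJob' are placeholder job names:

```groovy
// Hypothetical Job DSL sketch (requires the Job DSL plugin);
// equivalent to checking "Build other projects" in the UI.
job('firstJob') {
    steps {
        shell('echo "building firstJob"')
    }
    publishers {
        // 'SUCCESS' threshold ~ "Trigger only if build is stable"
        downstream('mySecondJob', 'SUCCESS')
    }
}
```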

Scenario #2 (pre-build) - How to trigger a Jenkins job from another freestyle job?

1. Open your freestyle build job.
2. Click on Configure.
3. Click on Build Triggers.
4. Check Build after other projects are built.
    


Select the source job, which will be built first; once its build is stable, it will trigger this job. Also check Trigger only if build is stable.

5. Save the job. 

6. Run the first job. Once that job succeeds, it will trigger this job.


Scenario #3 (pipeline) - How to trigger any Jenkins job from a pipeline job:

pipeline {
    agent any
    stages {
        stage('Trigger Another Job') {
            steps {
                build job: 'mySecondJob', wait: false
            }
        }
    }
}
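The `build` step also lets the upstream pipeline wait for the downstream result and pass parameters. A sketch, assuming 'mySecondJob' defines a string parameter named ENV (a hypothetical name):

```groovy
pipeline {
    agent any
    stages {
        stage('Trigger And Wait') {
            steps {
                script {
                    // wait: true blocks until the downstream job finishes;
                    // propagate: false keeps this job green even if it fails
                    def result = build job: 'mySecondJob',
                        wait: true,
                        propagate: false,
                        parameters: [string(name: 'ENV', value: 'dev')]
                    echo "Downstream finished with: ${result.result}"
                }
            }
        }
    }
}
```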



Watch Steps in YouTube channel:

Friday, December 15, 2023

Different types of GitHub runners | GitHub runners various types | Self hosted vs GitHub Hosted runner


A GitHub runner is a machine (physical or virtual) that runs GitHub Actions workflows. GitHub Actions is a CI/CD (Continuous Integration/Continuous Deployment) and automation service provided by GitHub. It allows you to define workflows in your GitHub repositories to automate build, test, and deployment processes. GitHub offers the following types of runners:

  • GitHub-hosted runners
  • Self-hosted runners
  • Larger runners
  • Ephemeral self-hosted runners (on-demand)

GitHub-hosted runners:

  • These are runners provided by GitHub, and they are hosted on GitHub's infrastructure.
  • GitHub provides various virtual machine configurations for different operating systems and environments.
  • They are automatically scaled based on the demand, and you don't have to manage their infrastructure.
  • GitHub-hosted runners have time and resource limitations, and you may need to consider these limitations based on your project's needs.
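A job selects a hosted runner through its runs-on key; a matrix can run the same job on each OS image GitHub provides. A minimal sketch:

```yaml
name: hosted-runner-demo
on: [push]
jobs:
  smoke:
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest, macos-latest]
    # one job instance per hosted OS image
    runs-on: ${{ matrix.os }}
    steps:
      - run: echo "Running on ${{ matrix.os }}"
```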


Self-hosted runners:

A self-hosted runner differs from the default GitHub-hosted runners in that it runs on infrastructure that you control. Self-hosted runners can be physical, virtual, in a container, on-premises, or in a cloud.
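A job opts into self-hosted runners through its runs-on labels. A minimal sketch, assuming you have already registered a Linux x64 self-hosted runner:

```yaml
jobs:
  build:
    # routes the job to any registered runner carrying all of these labels
    runs-on: [self-hosted, linux, x64]
    steps:
      - run: echo "Running on my own infrastructure"
```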


GitHub-hosted large runners:

GitHub offers customers on GitHub Team and GitHub Enterprise Cloud plans a range of managed virtual machines with more RAM, CPU, and disk space. These runners are hosted by GitHub and have the runner application and other tools preinstalled.


Ephemeral self-hosted runners:

GitHub Actions now supports ephemeral (i.e. single job) self-hosted runners to make autoscaling your runners easier. After a job is run, ephemeral runners are automatically unregistered from the service, allowing you to do any required post-job management. Ephemeral runners are a good choice for self-managed environments where you need each job to run on a clean image.
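Registration is done with the config.sh script shipped with the runner package; passing --ephemeral makes the runner unregister itself after one job. The URL and token below are placeholders:

```shell
# Placeholder values: substitute your repo/org URL and a fresh registration token.
./config.sh --url https://github.com/OWNER/REPO \
            --token <REGISTRATION_TOKEN> \
            --ephemeral   # runner is removed after completing a single job
./run.sh
```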

    Monday, December 11, 2023

    How to Implement CICD Pipeline using GitHub Actions | GitHub Actions Tutorials | GitHub Actions CICD Pipeline | Build Java WAR file using GitHub Actions CICD Workflow

    What is GitHub Actions?

    • GitHub Actions is a CI/CD platform that helps you automate tasks in the software development lifecycle.
    • It allows you to automate various tasks in your software development workflow by defining workflows using YAML files.
    • GitHub Actions is event-driven, i.e., when some event happens, you can trigger a series of commands.
    • GitHub Actions goes beyond just DevOps and lets you run workflows when other events happen in your repository.
    • GitHub provides Linux, Windows, and macOS virtual machines to run your workflows, or you can host your own self-hosted runners in your own data center or cloud infrastructure.

    GitHub Actions Workflow:

    A workflow is a series of actions initiated once a triggering event occurs. For example, the triggering event can be a commit pushed to a GitHub repository, the creation of a pull request, or another workflow completing successfully. The event is what triggers the workflow.

    Your workflow contains one or more jobs which can run in sequential order or in parallel. Each job will run inside its own virtual machine runner, or inside a container, and has one or more steps that either run a script that you define or run an action, which is a reusable extension that can simplify your workflow.

    Workflows are defined by a YAML file checked in to your repository and will run when triggered by an event in your repository, or they can be triggered manually, or at a defined schedule.

    Advantages of using GitHub Actions:

    GitHub Actions offers several advantages for automating workflows in your software development process:

    Integration with GitHub:

    GitHub Actions is tightly integrated into the GitHub platform. This integration makes it easy to define, manage, and execute workflows directly within your repositories.

    YAML-based Configuration:

    Workflows are defined using YAML files, providing a simple and human-readable syntax. This makes it easy to understand, version, and share your workflow configurations.

    Diverse Triggers:

    GitHub Actions supports a variety of triggers for workflow execution, such as pushes, pull requests, issue comments, and scheduled events. This flexibility allows you to tailor workflows to your specific needs.
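    The triggers listed above all map to keys under `on:` in the workflow file. An illustrative sketch combining several of them:

```yaml
on:
  push:
    branches: [ "master" ]
  pull_request:
  issue_comment:
    types: [created]
  schedule:
    - cron: '0 6 * * 1'   # every Monday at 06:00 UTC
  workflow_dispatch:       # manual trigger from the Actions tab
```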

    Parallel and Sequential Jobs:

    Workflows can include multiple jobs that run in parallel or sequentially. This enables you to optimize build and test times by parallelizing tasks or organizing them in a specific order.
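    Ordering is expressed with `needs:`; jobs without it run in parallel. A minimal sketch:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: echo "build"
  lint:
    runs-on: ubuntu-latest   # no needs, so it runs in parallel with build
    steps:
      - run: echo "lint"
  test:
    needs: build             # sequential: starts only after build succeeds
    runs-on: ubuntu-latest
    steps:
      - run: echo "test"
```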

    Reusable Actions:

    GitHub Actions promotes code reuse through reusable actions. Actions are modular units of code that encapsulate a specific task and can be shared across different workflows and repositories.

    Supports a wide range of platforms and languages:

    GitHub Actions supports a wide range of platforms and languages. This means that users can use the same automation tool for different projects and languages, which can simplify their workflows and reduce the need for multiple tools.

    GitHub-hosted Runners:

    GitHub provides virtual machines (runners) for executing workflows. These runners are pre-configured with various tools and environments, reducing the need for managing your own infrastructure.

    Self-hosted Runners:

    While GitHub provides hosted runners, you can also use self-hosted runners on your own infrastructure for greater control over the execution environment.

    Community Actions:

    GitHub Actions has a marketplace where you can find and share actions created by the community. This makes it easy to leverage existing solutions for common tasks in your workflows.

    Secure:

    GitHub Actions allows you to securely store and use secrets (e.g., API keys, access tokens) in your workflows, ensuring sensitive information is protected.
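    Secrets defined in the repository settings are exposed to workflows via the `secrets` context; `API_TOKEN` and `deploy.sh` below are hypothetical names:

```yaml
steps:
  - name: Call an API with a stored secret
    run: ./deploy.sh                        # hypothetical script
    env:
      API_TOKEN: ${{ secrets.API_TOKEN }}   # value is masked in logs
```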

    Sample GitHub Actions Workflow YAML for creating a WAR file using Maven

    You will create the file .github/workflows/build.yaml inside the GitHub repo where your Java code is.

    name: Build a WAR file using Maven
    on:
      push:
        branches: [ "master" ]
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
        - uses: actions/checkout@v3
        - name: Set up JDK 11
          uses: actions/setup-java@v2
          with:
            distribution: 'adopt'
            java-version: '11'
        - name: Build with Maven
          run: mvn clean install -f MyWebApp/pom.xml
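    To keep the WAR after the run finishes, a follow-up step can publish it as a build artifact. A sketch, assuming Maven places the WAR under MyWebApp/target/:

```yaml
    - name: Upload WAR artifact
      uses: actions/upload-artifact@v3
      with:
        name: mywebapp-war
        path: MyWebApp/target/*.war
```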

    Watch Steps in YouTube channel:

    Tuesday, December 5, 2023

    DevOps Interview Preparation Courses Offered by Coach AK - Please text or WhatsApp Coach AK on +1 (469)733-5248

      

    Coach AK offers the following services worldwide, online, for $299 for new students and $249 for past students.

    • Resume preparation services 
    • Resume,  interview preparation services and mock interviews
    • Resume and interview preparation services, provide DevOps questions and Answers with mock interview

    Click here for the DevOps Interview preparation class schedule


    Thursday, November 30, 2023

    Azure DevOps Pipeline Optimization Best Practices | Optimizing Azure DevOps pipelines

    Optimizing Azure DevOps pipelines is crucial for achieving faster and more efficient software delivery. Here are some best practices and strategies for optimizing Azure DevOps pipelines:

    1. Parallel Jobs and Stages:

    • Parallelization: Break down your pipeline into parallel jobs and stages to execute tasks concurrently, reducing overall pipeline execution time.
    jobs:
    - job: Build
      pool:
        vmImage: 'windows-latest'
      steps:
        - script: echo "Building..."
    - job: Test
      pool:
        vmImage: 'windows-latest'
      steps:
        - script: echo "Testing..."

    2. Agent Pools and Agents:
    • Agent Pools: Distribute builds across multiple agent pools to utilize available resources effectively. Configure agent capabilities to match job requirements.

    3. Artifact Caching:

    • Cache Dependencies: Utilize caching to store and retrieve build artifacts between different pipeline runs, reducing the time spent on redundant build steps.
    steps:
    - task: Cache@2
      inputs:
        key: 'node | "$(Agent.OS)" | package-lock.json'
        path: '**/node_modules'

    4. Incremental Builds:

    • Trigger on Changes: Set up your pipeline to trigger builds only for changes in relevant branches. Use CI triggers to avoid unnecessary builds.
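    Branch and path filters in azure-pipelines.yml keep CI from running on irrelevant changes; the branch and path names below are illustrative:

```yaml
trigger:
  branches:
    include:
      - main
      - release/*
  paths:
    exclude:
      - docs/*
      - README.md
```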

    5. Artifact Promotion:

    • Promote Artifacts: Promote artifacts from one environment to another instead of rebuilding them. This helps maintain consistency across environments and reduces build times.

    6. Use YAML Pipelines:

    • YAML Syntax: Use YAML-based pipelines for better version control and code review. YAML pipelines are more maintainable and offer a clearer representation of your CI/CD process.

    7. Job and Step Conditions:

    • Conditions: Use conditions to selectively execute jobs or steps based on criteria such as branch names, variable values, or expressions.

    jobs:
    - job: Deploy
      condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))
      steps:
      - script: echo "Deploying..."

    8. Agent Clean-Up:

    • Clean Workspace: Include steps to clean up the agent workspace at the end of each build to avoid accumulation of unnecessary artifacts and files.
    steps:
    - script: echo "Build steps..."
    - task: DeleteFiles@1
      inputs:
        contents: '**'
        cleanTargetFolder: true

    9. Multi-Stage Docker Builds:

    • Multi-Stage Builds: Utilize multi-stage Docker builds to create smaller and more efficient Docker images, reducing image size and improving deployment speed.
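    A multi-stage Dockerfile sketch for a Maven-built WAR; the image tags and paths are illustrative:

```dockerfile
# Stage 1: build with the full Maven/JDK image
FROM maven:3.9-eclipse-temurin-11 AS build
WORKDIR /app
COPY pom.xml .
COPY src ./src
RUN mvn -q package

# Stage 2: only the WAR is copied into the slim runtime image
FROM tomcat:9-jdk11
COPY --from=build /app/target/*.war /usr/local/tomcat/webapps/ROOT.war
```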

    10. Azure Container Registry (ACR) Tasks:

    • ACR Build and Push: Use Azure Container Registry (ACR) Tasks for building and pushing Docker images directly within the pipeline, reducing the need for external scripts. ACR Tasks can be driven from the pipeline via the Azure CLI (placeholders in angle brackets):

    - task: AzureCLI@2
      inputs:
        azureSubscription: '<AzureServiceConnection>'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          az acr build --registry <ACRName> --resource-group <ResourceGroupName> --image <ImageName> --file <DockerfilePath> .

    11. Deployment Strategies:

    • Deployment Strategies: Choose appropriate deployment strategies such as rolling deployments, canary releases, or blue-green deployments based on your application's requirements.

    12. Automated Testing:

    • Automated Tests: Integrate automated tests into your pipeline to catch issues early. Azure DevOps supports various testing frameworks and test runners.

    13. Parameterize Pipelines:

    • Pipeline Parameters: Parameterize your pipelines to make them more flexible and reusable across different environments or scenarios.
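    Runtime parameters are declared at the top of the YAML pipeline and referenced with template expression syntax; the parameter name and values below are illustrative:

```yaml
parameters:
  - name: environment
    type: string
    default: dev
    values: [ dev, qa, prod ]

steps:
  - script: echo "Deploying to ${{ parameters.environment }}"
```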

    14. Infrastructure as Code (IaC):

    • IaC: Treat your infrastructure as code. Use Azure Resource Manager (ARM) templates or Terraform scripts for defining and deploying infrastructure.

    15. Use Deployment Gates:

    • Gates: Implement deployment gates to add quality checks before promoting changes to the next environment. Gates can include approvals, automated tests, or custom conditions.

    Optimizing Azure DevOps pipelines is an iterative process. Regularly review and enhance your pipeline configurations to incorporate new best practices and improvements. Consider the specific needs and constraints of your projects when implementing optimizations.

    Automate Azure App Service setup using Ansible and Azure DevOps pipeline | How to integrate Ansible with Azure DevOps | How to Create WebApp in Azure Cloud using Ansible

    Ansible is an open-source, configuration management tool that automates cloud provisioning, configuration management, and application deploy...