Here are the top 10 DevOps Tools to focus on to put your DevOps learning on a faster track and kick start your career quickly as a successful Cloud engineer or DevOps engineer in about 10 to 12 weeks from now.
Finally, having some scripting knowledge in languages such as Python, YAML, and Ruby is also good, and some hands-on cloud experience with AWS and Azure will be extremely helpful.
Learning DevOps involves a combination of theoretical knowledge and hands-on experience. Here are five effective ways to learn DevOps:
Sign up for DevOps Courses and Tutorials:
Platforms like www.coachdevops.com, Udemy, and Pluralsight offer comprehensive DevOps courses. Look for courses that cover version control, CI/CD, containerization, infrastructure as code, and other key DevOps concepts. Many of these platforms provide hands-on labs and projects to reinforce your learning.
Books and Documentation:
Read authoritative books on DevOps, such as "The Phoenix Project" by Gene Kim, Kevin Behr, and George Spafford or "Site Reliability Engineering" by Niall Richard Murphy and others. Additionally, explore documentation for popular DevOps tools like Jenkins, Docker, Kubernetes, and Terraform. These resources provide in-depth insights into best practices and real-world implementations.
Hands-On Projects:
Learning by doing is crucial in DevOps. Work on real projects that involve setting up CI/CD pipelines, automating infrastructure, and deploying applications. GitHub is a valuable resource for finding open-source projects and contributing to them. Create your own projects to apply and reinforce your knowledge. You can also join the course offered by Coach AK.
Networking and Community Involvement:
Join DevOps communities, forums, and social media groups. Participate in discussions, ask questions, and share your experiences. Attend local meetups, conferences, and webinars to connect with professionals in the field. Networking with others in the DevOps community can provide valuable insights, tips, and opportunities for collaboration.
Certifications:
Consider pursuing certifications that validate your DevOps skills. Certifications such as AWS Certified DevOps Engineer, Docker Certified Associate, Terraform and Kubernetes certifications are widely recognized in the industry. While certifications alone may not make you an expert, they can provide a structured learning path and help you showcase your skills to employers.
Kubernetes is an open-source container orchestration platform that eliminates many of the manual processes involved in deploying and scaling containerized applications. We will learn how to set up a Kubernetes cluster on Ubuntu 18.04.
You can set up a Kubernetes cluster in many ways. One of them is to use kubeadm.
kubeadm is a tool that provides kubeadm init and kubeadm join as best-practice "fast paths" for creating Kubernetes clusters.
Watch the steps in YouTube video:
Prerequisites:
1. Ubuntu instance with 4 GB RAM - Master Node - (with ports open to all traffic)
2. Ubuntu instance with at least 2 GB RAM - Worker Node - (with ports open to all traffic)
Kubernetes Setup using Kubeadm
########## Start - Execute the below commands on both master and worker nodes ##########
Log in to both instances and execute the commands below: sudo apt-get update -y && sudo apt-get install apt-transport-https -y
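The remaining bootstrap commands are not shown in this excerpt; on Ubuntu 18.04 with kubeadm they typically look like the sketch below. The package repository and the Flannel pod network are assumptions here, so verify them against the Kubernetes documentation for your version:

```shell
# On both nodes: install a container runtime plus kubelet, kubeadm, and kubectl
sudo apt-get install -y docker.io
curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
echo "deb https://apt.kubernetes.io/ kubernetes-xenial main" | sudo tee /etc/apt/sources.list.d/kubernetes.list
sudo apt-get update -y && sudo apt-get install -y kubelet kubeadm kubectl

# On the master only: initialize the cluster and set up kubeconfig for kubectl
sudo kubeadm init --pod-network-cidr=10.244.0.0/16
mkdir -p $HOME/.kube
sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
sudo chown $(id -u):$(id -g) $HOME/.kube/config

# On the master: install a pod network add-on (Flannel is used here as an example)
kubectl apply -f https://raw.githubusercontent.com/coreos/flannel/master/Documentation/kube-flannel.yml

# On the worker: join the cluster using the exact command printed by 'kubeadm init', e.g.:
# sudo kubeadm join <master-ip>:6443 --token <token> --discovery-token-ca-cert-hash sha256:<hash>

# On the master: verify that both nodes are registered
kubectl get nodes
```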
Once the worker has joined, running kubectl get nodes on the master should display both the master and worker nodes.
This means the Kubernetes cluster, with both master and worker nodes, is set up successfully and running!
Deploy Nginx on a Kubernetes Cluster
Let us run an app to make sure deployments work on the Kubernetes cluster. We will do this on the master node. The command below creates a deployment:
kubectl create deployment nginx --image=nginx
View the deployment:
kubectl get deployments

Expose the deployment as a service:
kubectl create service nodeport nginx --tcp=80:80
kubectl get svc
Run the above command to see a summary of the service and the exposed NodePort.
Now take the public DNS of the master or any worker node and access the app in a browser on the exposed port.
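As a quick check from the command line, you can look up the assigned NodePort and curl it. The port 30080 below is only an assumption; use whatever port `kubectl get svc` actually reports for your cluster:

```shell
# Show the nginx service and the NodePort Kubernetes assigned to it
kubectl get svc nginx

# Access nginx via the node's public DNS and the exposed NodePort
# (replace <node-public-dns> and 30080 with your actual values)
curl http://<node-public-dns>:30080
```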
Building pipelines in Azure DevOps is really easy; you can migrate your web applications from anywhere into Azure Cloud by using Azure Pipelines. We are going to migrate a Java web app hosted in Bitbucket into Azure Cloud. Since it is a Java web app, we need to create a WebApp in Azure Cloud.
WebApp is a Platform-as-a-Service capability provided by Azure to deploy any kind of app, including web apps, APIs, and mobile apps.
How are we going to do this?
1. Create a Webapp in Azure Cloud.
2. Create Azure pipeline in Azure DevOps
3. Configure the pipeline to check out code from Bitbucket and deploy it to the WebApp in Azure Cloud
4. Run the apps
You need to create a WebApp in Azure Cloud. WebApp is an App Service (Platform as a Service) provided by Azure Cloud to migrate any web application.
Once you sign in to Azure Portal, you need to create an app service which is a WebApp.
Click on + Add or click on Create app service (Web App).
Click on Web App. Choose your Azure subscription, usually Pay-As-You-Go or a Free Trial subscription. Create a new resource group (if this is your first App Service; otherwise you can use an existing group).
Enter an App Service name (it should be globally unique).
Publish: Code. Runtime stack: Java 17. Java web server stack: Tomcat 10.0. Operating system: Linux. Region: Central US, or wherever you are based.
Enter a Linux plan name. For SKU and size, choose Dev/Test and use 1 GB or 1.75 GB memory.
Click on Apply, then click on Create. It will take a bit of time to create the App Service.
Once the WebApp is created, go to the resource, click on the WebApp name, and click on the URL.
You should see the App Service home page, something like below:
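If you prefer the command line, the same WebApp can be created with the Azure CLI. The resource names below are placeholders, and the runtime string format varies by CLI version (check `az webapp list-runtimes` for the exact value on your machine):

```shell
# Create a resource group (name and region are placeholders)
az group create --name myResourceGroup --location centralus

# Create a Linux App Service plan; B1 is an example Dev/Test-tier SKU
az appservice plan create --name myLinuxPlan --resource-group myResourceGroup \
  --is-linux --sku B1

# Create the WebApp with a Java 17 / Tomcat 10.0 runtime (name must be globally unique)
az webapp create --name my-unique-webapp-name --resource-group myResourceGroup \
  --plan myLinuxPlan --runtime "TOMCAT:10.0-java17"
```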
Since our WebApp is up and running, we can now start creating the pipeline to migrate our app to Azure Cloud.

2. Creating pipelines in Azure DevOps
1. Now go to your Azure DevOps project page --> https://dev.azure.com/ and select the project dashboard you already created.
Click on Pipelines, Builds.
2. Click on New pipeline and click on "Use the classic editor" to create a pipeline without YAML.
3. Select Bitbucket Cloud as the source repository and choose to continue with username and password.
4. Enter your Bitbucket username and password and click on Authorize.
Choose your repo by clicking on the ... dots, then click Continue.
5. Since our application is a Java stack, type Java and choose Azure WebApp for Java.
6. Modify the Maven goal to clean install, and choose pom.xml by clicking on the three dots (...).
7. Leave the default value for the Copy Files to staging folder step.
8. Leave the default value for the Publish Artifact: Drop step.
9. Click on the Stop Azure WebApp step and enter the Azure WebApp details, i.e. where you would like to deploy your app in Azure. Select your subscription (Free Trial, for example) from the drop-down and click on the Authorize button. Make sure your popup blocker is disabled.
10. Do the same in the Deploy Azure WebApp and Start Azure WebApp steps.
Enable Webhooks in Pipeline
11. Click on Triggers and enable the checkbox for Continuous Integration.
12. Now click on Save and Queue
13. If your configuration is correct, you should see the build succeed and be able to access the app in Azure. Click on Pipelines and then the pipeline name.
You will see that Azure DevOps has started a build.
14. Make sure the build succeeds; all steps should be green as shown below.
After successful build, you can check the output by accessing the URL of WebApp:
https://myAzureWebAppUrl
You can watch the above steps in action in YouTube video:
Clean-up Resources:
After successfully finishing the labs, make sure to delete all the resources created in Azure portal to avoid charges from Microsoft.
Select the resource group in Azure portal, delete the resource group. This will ensure that all the resources will be deleted under the resource group.
Puppet is a configuration management tool, similar to Ansible, Chef, and SaltStack. Puppet can also be used for automating infrastructure.
Puppet is based on a client/server model. The server automates tasks on nodes/servers that have a client (agent) installed. The Puppet agent sends facts to the Puppet master and requests a catalog at a certain interval (30 minutes by default). Once it receives a catalog, the Puppet agent applies it to the node by checking each resource the catalog describes, making the changes needed to attain the desired state. The Puppet master controls the configuration information; each managed agent node requests its own configuration catalog from the master.
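As a concrete illustration of the desired-state model described above, a minimal Puppet manifest might look like the hypothetical example below (the ntp package is just an illustration, not part of this setup):

```puppet
# site.pp - hypothetical example manifest
node default {
  # Desired state: the ntp package is installed
  package { 'ntp':
    ensure => installed,
  }

  # Desired state: the ntp service is running and starts on boot
  service { 'ntp':
    ensure  => running,
    enable  => true,
    require => Package['ntp'],   # install the package before managing the service
  }
}
```

When an agent applies the catalog compiled from this manifest, it only makes changes if the node has drifted from this declared state.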
We will see how to set up a Puppet master on Ubuntu 18.04.
Prerequisites:
A new Ubuntu instance (medium size) for the Puppet master, with port 8140 open.
Steps:
First, let us see how to install Puppet 6.x on Ubuntu 18.04.
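The package installation itself is not shown in this excerpt; before the service can be started, it typically looks like the sketch below. The release package name is an assumption for Puppet 6 on Ubuntu 18.04 (bionic), so verify it against the Puppet documentation:

```shell
# Download and install the Puppet 6 apt repository package (bionic = Ubuntu 18.04)
wget https://apt.puppetlabs.com/puppet6-release-bionic.deb
sudo dpkg -i puppet6-release-bionic.deb
sudo apt-get update -y

# Install the Puppet server package
sudo apt-get install puppetserver -y
```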
Steps for Puppet Master Installation

1. Modify the Puppet master's hosts file to add the hostname of the Puppet master:
sudo vi /etc/hosts
Add the Puppet master's IP address with puppet next to it, as shown below.

Enable and start the Puppet server:
sudo systemctl enable puppetserver.service
(the above command makes the service start when the Ubuntu instance boots)
sudo systemctl start puppetserver.service
(the above command starts the server and may take some time)
sudo systemctl status puppetserver.service
Now press q to exit the status view.

This confirms that the Puppet master is installed successfully. Verify which version of Puppet is installed by executing the command below:
apt policy puppetserver
2. You need to install the aws-sdk-core and retries gems as root (or superuser): sudo /opt/puppetlabs/puppet/bin/gem install aws-sdk-core retries
Done installing documentation for retries after 0 seconds
6 gems installed
3. Also install the AWS SDK for accessing resources in AWS: sudo /opt/puppetlabs/puppet/bin/gem install aws-sdk -v 2.0.42
Done installing documentation for retries after 0 seconds
4 gems installed
4. Now you can install the puppetlabs-aws module: sudo /opt/puppetlabs/bin/puppet module install puppetlabs-aws
That's it. The Puppet master is set up successfully!
You can watch the above steps in YouTube as well.
Please find below steps for creating pipelines using Jenkins.
What is Pipeline in Jenkins?
- Pipelines are better than freestyle jobs; you can implement much more complex tasks with pipelines than with freestyle jobs.
- You can see how long each stage takes to execute, so you have more visibility and control than with freestyle jobs.
- A pipeline is a Groovy-based script with a set of integrated plug-ins for automating builds, deployments, and test execution.
- Pipeline defines your entire build process, which typically includes stages for building an application, testing it and then delivering it.
- You can use the Snippet Generator to generate pipeline code for the stages where you don't know how to write the Groovy code.
- Pipelines are of two types: scripted pipelines and declarative pipelines.
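To illustrate the difference: this tutorial uses the scripted form (a `node { ... }` block), while a minimal declarative pipeline follows the structure sketched below. The stage contents here are only placeholders:

```groovy
// Minimal declarative pipeline - note the pipeline/agent/stages structure
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building...'   // placeholder step
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...'    // placeholder step
            }
        }
    }
}
```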
Pre-requisites:
Install plug-ins:
1. Install the Pipeline Stage View, Deploy to container, Slack, Nexus Artifact Uploader, and SonarQube plug-ins (skip any that are already installed).

Steps to Create a Scripted Pipeline in Jenkins
1. Login to Jenkins
2. Create a New item
3. Give name as MyfirstPipelineJob and choose pipeline
4. Click ok. Pipeline is created now
5. Under Build Triggers, click on Poll SCM and set the schedule as
H/02 * * * *
6. Go to the Pipeline definition section and click on the Pipeline Syntax link. Under the Sample Step drop-down, choose checkout: Check out from version control. Enter the Bitbucket repository URL and choose the Bitbucket username/password from the drop-down. Scroll down, click on Generate Pipeline Script, and copy the code.
7. Now copy the pipeline code below into the Pipeline section of the job, stage by stage.
8. Change the Maven3, SonarQube, and Nexus URL variables and the Slack channel name as per your settings.
9. For the Nexus Upload stage, you need to change the Nexus URL and the credentials ID for Nexus (which you can grab from the Credentials tab after login).
10. For the Dev Deploy stage, copy the credentials ID used for connecting to Tomcat.
Pipeline Code:
node {
    def mvnHome = tool 'Maven3'

    stage ('checkout') {
        // copy the code here which you generated from step #6
    }

    stage ('build') {
        sh "${mvnHome}/bin/mvn clean install -f MyWebApp/pom.xml"
    }

    stage ('DEV Approve') {
        echo "Taking approval from DEV Manager"
        timeout(time: 7, unit: 'DAYS') {
            input message: 'Do you want to deploy?', submitter: 'admin'
        }
    }

    stage ('Slack notification') {
        slackSend(channel: 'channel-name',
            message: "Job is successful, here is the info - Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]' (${env.BUILD_URL})")
    }
}