
Yeshwanth Gali

AWS (Amazon Web Services) Developer

Phoenix, AZ, USA

Careers

Sr DevOps Engineer

AirTerra

Full-time contract, 06/2022 - 03/2023
  1. Responsibilities:
  2.  Created Terraform modules to provision VPCs, subnets, Cloud Composer, Kubernetes clusters, ELBs, security groups, Cloud SQL read replicas, and Firestore databases.
  3.  Integrated Datadog with services in GKE such as Kong, HAProxy, and GLB, and wrote alerts and dashboards for different services using Terraform.
  4.  Integrated the Postgres service with Cloud SQL and implemented read replicas to replicate an existing database from one region to another.
  5.  Wrote Terraform modules to create sandbox projects in Playground, integrated them with Firestore for data storage, and configured billing budgets, alerts, and threshold values using Terraform.
  6.  Configured Panorama to set up firewall rules for the VPC network, and implemented VPC Service Controls policies to allow only allowlisted IP addresses through the Compute network.
  7.  Implemented Kubernetes clusters and created pods, namespaces, workloads, deployments, services, labels, health checks, and Ingress resources and controllers by writing YAML files.
  8.  Used Bitbucket pipelines to drive all microservice builds out to the Docker registry and deploy them to Kubernetes, creating and managing pods with Google Kubernetes Engine.
  9.  Integrated Google Cloud Functions (GCF) with Bitbucket using a Python script to trigger Bitbucket pipelines automatically via Cloud Scheduler.
  10.  Configured SFTP and integrated it with GCF via a Python script to trigger alerts and send notification emails to the respective groups.
  11.  Maintained cloud infrastructure by creating multiple GCP projects for customers, managed access to GCP resources through users and groups in the IAM console, and supported customers in deploying services to GKE and pushing images to Artifact Registry.
  12.  Provisioned lower, staging, and production environment projects with multi-region resources, and configured pod autoscaling in GKE to handle peak loads and scale back down afterward.
  13.  Set up GCP firewall rules to allow or deny traffic to and from VM instances based on the specified configuration, and used GCP Cloud CDN (content delivery network) to serve content from GCP cache locations, drastically reducing latency and improving user experience.
  14.  Integrated GCS buckets with files.com over Secure File Transfer Protocol (SFTP), mapped service accounts to files.com, and secured those service accounts by allowing access only through VPC Service Controls.
  15.  Wrote build pipeline scripts to deploy multiple services through Bitbucket pipelines, and created application alerts and warnings using Terraform modules, surfaced through Datadog.
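
An illustrative sketch of the Cloud Scheduler-to-Bitbucket automation described in item 9. The workspace, repository, and branch names are hypothetical, and the authenticated HTTP POST itself is elided; this only shows how the standard Bitbucket Pipelines "run pipeline" request could be assembled inside a Cloud Function:

```python
import json

# Hypothetical names for illustration; real values would come from config.
BITBUCKET_API = "https://api.bitbucket.org/2.0"

def build_pipeline_trigger(workspace: str, repo_slug: str, branch: str):
    """Build the URL and JSON body for Bitbucket's run-pipeline endpoint."""
    url = f"{BITBUCKET_API}/repositories/{workspace}/{repo_slug}/pipelines/"
    payload = {
        "target": {
            "type": "pipeline_ref_target",
            "ref_type": "branch",
            "ref_name": branch,
        }
    }
    return url, json.dumps(payload)

def trigger_pipeline(event, context):
    """Cloud Function entry point, invoked on a Cloud Scheduler schedule.

    The authenticated POST (with an app-password Authorization header)
    is left out; urllib.request or the requests library would send it.
    """
    url, body = build_pipeline_trigger("my-workspace", "my-service", "main")
    # urllib.request.urlopen(...) would go here.
    return url
```
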
Sr Cloud Engineer

Fifth Third Bank

Full-time contract, 08/2020 - 07/2022
  1.  Created backups of Amazon Machine Images (AMIs) of EC2 instances using Packer, backed up critical business data for disaster recovery (DR), and upgraded to new instance types for better performance.
  2.  Integrated AWS DynamoDB with AWS Lambda to store item values and back up DynamoDB streams. Automated backups of data in EBS and instance stores to S3 buckets, created AMI backups of mission-critical production servers from the AWS CLI, and used AWS Data Pipeline to configure data loads from S3 into Redshift.
  3.  Created Terraform modules to provision custom-sized VPCs, subnets, EC2 instances, Lambda functions, ELBs, and security groups. Worked on a tagging standard for proper identification and ownership of EC2 instances and other AWS services such as CloudFront, CloudWatch, RDS, S3, Route 53, SNS, SQS, and CloudTrail.
  4.  Used other AWS services such as EBS, Auto Scaling groups, load balancers, CloudWatch, and IAM to install, configure, and troubleshoot various Amazon images during physical-to-cloud server migrations.
  5.  Assigned agents to Amazon Connect routing profiles, updated queues, and created contact flows.
  6.  Integrated Lambda functions, DynamoDB tables, and Lex bots into Amazon Connect contact flows, and updated the utterances for the Lex bot intents.
  7.  Migrated call flows from Cisco call centers to Amazon Connect cloud contact centers, and wrote Lambda functions in Python and basic Node.js to communicate with the Lex bot.
  8.  Worked with key Terraform features such as execution plans, resource graphs, and change automation, and wrote Terraform templates for AWS infrastructure to build staging and production environments so the testing department could work without interruption.
  9.  Used Kubernetes to deploy applications, load-balance inbound and outbound requests, scale, and manage Docker containers with multiple namespaced versions.
  10.  Implemented Kubernetes clusters and created pods, replication controllers, namespaces, deployments, services, labels, health checks, and Ingress resources and controllers by writing YAML files, integrating them with Weave, Flannel, and Calico SDN networking.
  11.  Used Jenkins pipelines to drive all microservice builds out to the Docker registry and deploy them to Kubernetes, creating and managing pods. Built advanced Jenkins pipelines with scripted syntax to trigger remote jobs on other Jenkins masters, and automated the deployment of Java and .NET applications with Jenkins.
  12.  Managed Docker containers on a cluster hosted on serverless infrastructure with AWS ECS, distributing application traffic with ELB, using CloudFront to distribute content to edge locations, and using CloudWatch to set alarms and notifications.
  13.  Implemented Ansible Tower to manage complex network deployments, adding control, knowledge, and delegation to Ansible-powered environments.
  14.  Wrote Ansible playbooks with a Python SSH wrapper to manage configurations of AWS nodes, tested playbooks on AWS instances using Python, and ran Ansible scripts to provision development servers.
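
A minimal sketch of the DynamoDB-stream backup step from item 2. Only the record-flattening logic is shown (attribute types S and N); the boto3 call that writes the resulting lines to an S3 object is omitted, and the event shape follows the documented DynamoDB Streams Lambda payload:

```python
import json

def stream_records_to_backup_lines(event):
    """Flatten DynamoDB stream INSERT/MODIFY records into JSON lines
    suitable for an S3 backup object. Handles only S (string) and
    N (number) attribute types for brevity."""
    lines = []
    for record in event.get("Records", []):
        if record.get("eventName") not in ("INSERT", "MODIFY"):
            continue  # skip REMOVE and unknown events
        image = record["dynamodb"].get("NewImage", {})
        item = {}
        for key, typed in image.items():
            if "S" in typed:
                item[key] = typed["S"]
            elif "N" in typed:
                item[key] = float(typed["N"])  # streams encode numbers as strings
        lines.append(json.dumps(item, sort_keys=True))
    return "\n".join(lines)
```
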
Sr DevOps Engineer

FanDule

Full-time contract, 01/2020 - 08/2020
  1. My major duties included planning, developing, and assisting the migration of the client's on-premises infrastructure to Microsoft Azure, and designing and implementing a hybrid on-premises-to-cloud migration and management strategy for the new hybrid cloud solution across single and multiple data centers.
  2.  Managed private cloud and hybrid cloud configurations and practices in Windows Azure, SQL Azure, and Azure web and database deployments, and upgraded and migrated web applications to the latest .NET Framework versions and Azure platforms.
  3.  Created Azure Automation assets and graphical and PowerShell runbooks to automate specific tasks. Deployed Azure AD Connect and configured the ADFS installation using Azure AD Connect.
  4.  Created ARM templates for the Azure platform and migrated on-premises workloads to Windows Azure using Azure Site Recovery, Azure Backup, and other Azure services.
  5.  Prepared capacity and architecture plans to create the Azure cloud environment hosting migrated IaaS VMs and PaaS role instances for refactored applications and databases.
  6.  Implemented high availability with the Azure Classic and Azure Resource Manager deployment models, and worked on Azure access controls (RBAC) to manage privileges on Azure resources.
  7.  Created clusters in Google Cloud and managed them using Kubernetes (k8s), using Jenkins to deploy code to Google Cloud and create new namespaces.
  8.  Extensively used Google Stackdriver to monitor logs from both GKE and GCP instances and configured Stackdriver alerts for several scenarios. Hands-on experience with Google Cloud components, Google Container Builder, GCP client libraries, and the Cloud SDK.
  9.  Hands-on experience with GCP, BigQuery, and GCS buckets; analyzed data in Google Cloud Storage using BigQuery.
  10.  Involved in the CI/CD process using Git, Nexus, Jenkins job creation, and Maven builds; created Docker images and used them to deploy to gcloud clusters.
  11.  Provided the permissions and required access for all Pub/Sub topics and sinks to push/write data to Stackdriver. Set up alerting and monitoring with Stackdriver in GCP, created custom log-based metrics with Stackdriver Logging, and built charts and alerts from those metrics.
  12.  Developed Docker images to support the development and testing teams and their pipelines: Jenkins distributed builds, Selenium and JMeter images, and Elasticsearch, Kibana, and Logstash (ELK & EFK).
  13.  Set up the build environment integrated with Git and Jira to trigger builds using webhooks and slave machines, integrating Docker container-based test infrastructure into the Jenkins CI test flow.
  14.  Worked on container management with Docker by writing Dockerfiles, set up automated builds on Docker Hub, and wrote Docker Compose files for multi-container provisioning and to build, run, tag, and publish Docker containers to Azure Container Registry.
  15.  Designed strategies for optimizing all aspects of the continuous integration, release, and deployment processes using container and virtualization techniques such as Docker and Kubernetes. Set up Docker to automate container deployment through Jenkins, worked with Docker Hub, and built and maintained various images, primarily for middleware installations. Worked in all areas of Jenkins: setting up CI for new branches, build automation, plugin management, securing Jenkins, and configuring master/slave setups.
  16.  Integrated Git with Jenkins using the GitHub plugin to automate source code checkout by providing the URL and credentials of the Git repository.
  17.  Configured and managed an ELK stack: set up Elasticsearch to collect, search, and analyze log files from across the servers, and integrated the application with the monitoring tool New Relic for complete insight and proactive monitoring.
  18.  Replaced Splunk logging and analytics with an automated ELK cluster, increasing data-capture capacity and reducing costs. Installed and configured the ELK stack in both legacy and Docker Swarm modes, and pipelined application logs from the app server to Elasticsearch through Logstash.
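
ARM templates like those in item 4 are plain JSON documents. As a hedged illustration, the fragment below renders a minimal template for a single storage account from Python; the resource name, API version, and SKU are illustrative, not taken from the actual project:

```python
import json

def storage_account_template(name: str, location: str = "eastus"):
    """Return a minimal ARM deployment template as a Python dict."""
    return {
        "$schema": ("https://schema.management.azure.com/schemas/"
                    "2019-04-01/deploymentTemplate.json#"),
        "contentVersion": "1.0.0.0",
        "resources": [
            {
                "type": "Microsoft.Storage/storageAccounts",
                "apiVersion": "2022-09-01",  # illustrative API version
                "name": name,
                "location": location,
                "sku": {"name": "Standard_LRS"},
                "kind": "StorageV2",
            }
        ],
    }

if __name__ == "__main__":
    # Emit the template JSON, ready for `az deployment group create`.
    print(json.dumps(storage_account_template("stmigrationdemo"), indent=2))
```
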
DevOps Engineer

EBYTE SOFTWARE SOLUTIONS

Full-time contract, 01/2019 - 12/2019
  1. Responsibilities:
  2.  Configured AWS Route 53 to manage DNS zones globally, create record sets, set up DNS failover and health checks of domains, and assign domain names to ELB and CloudFront.
  3.  Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancing, and Auto Scaling groups, and maintained access to AWS resources through users and groups in the IAM console.
  4.  Performed automated deployments on AWS by creating IAM roles, used the CodePipeline plugin to integrate Jenkins with AWS, and created EC2 instances to provide virtual servers.
  5.  Implemented a zero-downtime deployment process in WebLogic using Python and shell scripts, and added it to the continuous automated process using Jenkins.
  6.  Set up all the upstream and downstream jobs in Jenkins, and worked with Jenkins pipelines to bring all workable tasks into continuous deployment.
  7.  Developed build and deployment scripts using Ant as the build tool and automated build and deploy processes with Jenkins to promote releases from one environment to another.
  8.  Used Git for source code version control and integrated it with Jenkins for the CI/CD pipeline, code-quality tracking, and user management, working with the build tools Ant and Gradle and writing pom.xml build scripts.
  9.  Wrote scripts with Ant and automated the build and deploy process using Jenkins to move from one environment to another; also edited existing Ant files to fix errors.
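
The zero-downtime WebLogic rollout in item 5 boils down to updating managed servers in batches so part of the cluster always stays up. A minimal sketch of that batching step (server names are hypothetical; the WLST/shell stop-deploy-start calls driven from Jenkins are stubbed out as comments):

```python
def rolling_batches(servers, batch_size=1):
    """Split a cluster's managed servers into deploy batches so at most
    `batch_size` servers are down at once and the rest keep serving."""
    if batch_size < 1 or batch_size >= len(servers):
        raise ValueError("batch size must leave at least one server up")
    return [servers[i:i + batch_size]
            for i in range(0, len(servers), batch_size)]

def rolling_deploy(servers, batch_size=1):
    """Drive the rollout batch by batch; the actual WLST/shell calls
    are represented by the comment below."""
    for batch in rolling_batches(servers, batch_size):
        for server in batch:
            pass  # stop server, deploy new build, start, wait for health check
```
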
DevOps Engineer

Hi-Tech Pharmaceuticals

Full-time contract, 07/2016 - 01/2019
  1. Responsibilities:
  2.  Responsible for installing and configuring Jenkins to support various Java builds, and for Jenkins plugins that automate continuous builds and publish Docker images to the Nexus repository.
  3.  Created artifact documents from the source code and internal deployments in the Nexus repository, and implemented a disaster recovery project on AWS using various DevOps automations for CI/CD.
  4.  Performed automated deployments on AWS by creating AWS IAM roles, used the CodePipeline plugin to integrate Jenkins with AWS, and created EC2 instances to provide virtual servers.
  5.  Installed and configured Jenkins and created parameterized jobs to kick off builds for different environments. Managed the team's source repository through Git and the continuous integration system using Jenkins.
  6.  Implemented a continuous delivery pipeline with Docker, Jenkins, and GitHub, supporting various Java builds and Jenkins plugins to automate continuous builds and publish Docker images to the Nexus repository.
  7.  Used Git for source code version control and integrated it with Jenkins for the CI/CD pipeline, code-quality tracking, and user management, using Maven as the build tool and writing Maven pom.xml build scripts.
  8.  Designed and implemented Git metadata, including elements, labels, attributes, triggers, and hyperlinks, and performed necessary day-to-day Git support for different projects.
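
Publishing Docker images to Nexus from Jenkins (items 2 and 6) needs a consistent, registry-safe image tag. A small sketch of how such a tag might be composed from typical Jenkins environment values; the registry host and application name are hypothetical:

```python
import re

def nexus_image_tag(registry: str, app: str, branch: str,
                    build_number: str, commit: str) -> str:
    """Compose a Docker image reference for a Nexus Docker registry.

    Branch names are sanitized because Docker tags only allow
    letters, digits, '_', '.', and '-'."""
    safe_branch = re.sub(r"[^A-Za-z0-9_.-]", "-", branch)
    tag = f"{safe_branch}-{build_number}-{commit[:7]}"
    return f"{registry}/{app}:{tag}"
```

For example, `nexus_image_tag("nexus.example.com:8083", "orders-api", "feature/login", "42", "9f8e7d6c5b")` yields `nexus.example.com:8083/orders-api:feature-login-42-9f8e7d6`.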

Completed assessment
03/27/2023
Skill Verification
Skills
Azure, AWS, Git (version control), Python, JavaScript, Google Cloud, Java, Ruby, Cloud Computing
Experience: Senior-level, 8+ years
Hourly rate: $75/hr
