Sai is a Senior DevOps Engineer with over 6 years of professional experience building and supporting cloud-hosted environments. He has worked extensively with AWS and related services and has also supported Azure. He is strong in shell scripting and has worked extensively with Jenkins, Ansible, Kubernetes, and related tools. Sai is a passionate DevOps engineer who stays current with new tools and trends.
Project Description: This project involves creating and maintaining infrastructure for a prediction-model-based application.
AKS, Azure ML, Databricks, and ACR were the key Azure cloud components used to build this application, which powered an online B2B portal available to external and internal vendors to get products approved.
Environment: MVC, Web API, PowerShell, VMware, SQL Server 2016, jQuery, Microsoft Azure Cloud, Azure Active Directory, Kafka, Terraform, XL Release, XL Deploy, Dependency Injection, WCF, OWASP security, and risk mitigations
• Work with the operations team to implement and streamline the Confidential Azure cloud environment, following best practices for remediation.
• Gather requirements from the stakeholders about existing subscriptions to apply the security measures.
• Develop an automation system using PowerShell scripts and JSON templates to remediate Azure services (an illustrative remediation sketch follows this list).
• Responsible for configuring Kafka consumer and producer metrics to visualize Kafka system performance (a metrics sketch follows this list).
• Work as a build and release engineer, deploying services through VSTS pipelines; create and maintain pipelines to manage the IaC for all applications.
• Monitor the quality of automated build plans to support delivery to PROD and non-PROD environments.
• Work on creating backup methodologies with PowerShell scripts for Azure services such as Azure SQL Database, Key Vault, storage blobs, App Services, etc.
• Use Splunk APM for log aggregation and analysis across application servers, and integrate Splunk with single sign-on (SSO) authentication and the ServiceNow ticketing tool.
• Develop and install servers through Azure Resource Manager (ARM) templates or the Azure portal; migrate on-premises virtual machines to an ARM subscription with Azure Site Recovery.
• Design and develop Azure DevOps pipelines to manage resources across multiple subscriptions.
• Work on Terraform templates for provisioning virtual networks, subnets, VM scale sets, load balancers, and NAT rules, and use the terraform graph command to visualize execution plans.
• Configure BGP routes to enable ExpressRoute connections between on-premises data centers and Azure cloud.
• Work with version control, build and release management, and deployment of solutions to the Dev, QA, and Prod environments, leveraging Azure DevOps/VSTS CI/CD principles and processes and toolsets including Visual Studio, AKS (Azure Kubernetes Service), Application Insights, and Log Analytics.
• Work extensively on building and installing servers, creating resources through Azure Resource Manager (ARM) templates or the Azure portal and provisioning them with Terraform templates; also work on virtual networks, Azure custom security, endpoint security, and firewalls.
• Integrate Ansible with Jenkins to provide automation and continuous integration/continuous deployment, and write Ansible playbooks in YAML to automate server configuration. Implement Jenkins workflows and plugins for repeatable deployments of multi-tier applications, artifacts, and services to Docker.
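The remediation automation above was built with PowerShell scripts and JSON templates; purely as an illustration of the idea, here is a minimal Python sketch using the azure-identity and azure-mgmt-storage SDKs to find storage accounts that still allow plain-HTTP traffic and enforce HTTPS-only. The subscription ID is a placeholder, and the HTTPS-only check stands in for whatever remediation rules the actual system applied.

```python
# Minimal sketch: scan a subscription's storage accounts and remediate any that
# allow plain-HTTP traffic. Assumes azure-identity and azure-mgmt-storage are
# installed; SUBSCRIPTION_ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

for account in client.storage_accounts.list():
    if not account.enable_https_traffic_only:
        # The resource group name is embedded in the resource ID:
        # /subscriptions/<sub>/resourceGroups/<rg>/providers/...
        resource_group = account.id.split("/")[4]
        client.storage_accounts.update(
            resource_group,
            account.name,
            StorageAccountUpdateParameters(enable_https_traffic_only=True),
        )
        print(f"Remediated {account.name}: HTTPS-only traffic enforced")
```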
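For the Kafka consumer and producer metrics item, a minimal sketch of how consumer-side statistics (such as per-partition lag) can be surfaced with the confluent-kafka Python client; the broker address, consumer group, topic, and 15-second interval are placeholders, and a real setup would forward these numbers to the monitoring backend rather than print them.

```python
# Minimal sketch: surface librdkafka consumer statistics (e.g. per-partition lag)
# so they can be pushed to a dashboard. Broker, group, and topic are placeholders.
import json
from confluent_kafka import Consumer

def on_stats(stats_json: str) -> None:
    # librdkafka emits a JSON statistics blob; pick out consumer lag per partition.
    stats = json.loads(stats_json)
    for topic, tdata in stats.get("topics", {}).items():
        for partition, pdata in tdata.get("partitions", {}).items():
            lag = pdata.get("consumer_lag", -1)
            if lag >= 0:
                print(f"kafka.consumer.lag topic={topic} partition={partition} value={lag}")

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",   # placeholder broker
    "group.id": "portal-consumers",        # placeholder consumer group
    "statistics.interval.ms": 15000,       # emit statistics every 15 s
    "stats_cb": on_stats,
})
consumer.subscribe(["approvals"])          # placeholder topic

while True:
    msg = consumer.poll(1.0)               # stats_cb is invoked from poll()
    if msg is None or msg.error():
        continue
    # ... normal message handling would go here ...
```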
Project Description: Worked as part of Microsoft Dynamics 365 for Operations – Lifecycle Services (LCS). Dynamics 365 for Operations Trials is a feature that helps new customers try Dynamics 365 for Operations for the first time. LCS is a Microsoft Azure-based collaboration portal that provides a unifying, collaborative environment along with a set of regularly updated services that help manage the application lifecycle of implementations.
Environment: Git, Jenkins, Docker, DTR, UCP, Kubernetes, Apache Mesos, DataStax Enterprise, Vault, Marathon-LB, OpenStack, Platform9, JIRA.
• Deployed Docker UCP, Mesos, and Kubernetes orchestration clusters across environments using Ansible and supported an end-to-end Container-as-a-Service (CaaS) platform.
• Onboarded multiple application teams to the CaaS platform to deploy their container applications using CI/CD tools, helping them set up application DNS, monitoring solutions, and centralized logging.
• Maintained the Jenkins environment on Docker, configured multiple plugins, provisioned build agents on demand using the Yet Another Docker plugin, and scanned images using the Anchore plugin.
• Responsible for managing the company's internal private Docker Trusted Registry (DTR) with 3,000+ repositories and around 1M images. Used IAM to create new accounts, roles, groups, and policies, and developed critical modules such as generating Amazon Resource Names (ARNs) and integration points with S3, DynamoDB, RDS, Lambda, and SQS (an IAM sketch follows this list).
• Created functions and assigned roles in AWS Lambda to run Python scripts, and used AWS Lambda with Java to perform event-driven processing (a minimal Python handler sketch follows this list).
• Delivered business value by automating the AEM server build-out with the DevOps Automation team, saving hundreds of man-hours.
• Trained and led a support team of six providing 24×7 service for CaaS platforms and CI/CD tools during APJ hours.
• Architected and deployed a DataStax Enterprise Cassandra environment on Apache Mesos.
• Developed Ansible playbooks to automate the manual process of installing clusters and services, and to automate maintenance tasks such as backups and restores.
• Provisioned and supported instances in OpenStack and Synergy clusters.
• Effectively communicated with critical teams such as Cyber Security, SAP, and multiple application teams, understanding their service requirements and onboarding them to the CaaS platform.
• Enabled SSL encryption for communication between databases and applications.
• Integrated Portworx and Istio service mesh in orchestration clusters.
• Troubleshot build and release job failures and worked with engineers on resolution.
• Configured DataStax Cassandra with OpsCenter, backups, disaster recovery, and multi-datacenter deployments.
• Used Git as the code repository, Jenkins for builds and continuous deployment, Docker UCP and Mesos for orchestration, Vault for secret management, DTR as the private trusted image registry, ELK for monitoring, Ansible for configuration management, and DataStax for the database.
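As an illustration of the IAM automation mentioned in the DTR bullet, a minimal boto3 sketch that creates a group and a role with a trust policy and attaches a managed policy; the group and role names and the EC2 trust principal are hypothetical.

```python
# Minimal boto3 IAM sketch: group, role with a trust policy, and a managed
# policy attachment. All names here are placeholders for illustration.
import json
import boto3

iam = boto3.client("iam")

GROUP_NAME = "caas-operators"   # hypothetical group
ROLE_NAME = "caas-app-role"     # hypothetical role

iam.create_group(GroupName=GROUP_NAME)

# Trust policy allowing EC2 instances to assume the role (assumption for the example).
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
role = iam.create_role(
    RoleName=ROLE_NAME,
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.attach_role_policy(
    RoleName=ROLE_NAME,
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)
print("Created role ARN:", role["Role"]["Arn"])
```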
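For the Lambda bullet, a minimal, hypothetical Python handler for event-driven processing: it reacts to S3 object-created events and records each object in a DynamoDB table. The table name, key names, and record shape are assumptions for the example.

```python
# Hypothetical S3-triggered Lambda handler: records each newly created object
# in a DynamoDB table. Table and attribute names are placeholders.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("object-audit")  # placeholder table name

def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        table.put_item(Item={
            "object_key": key,                 # placeholder partition key
            "bucket": bucket,
            "event_time": record["eventTime"],
        })
    return {"processed": len(records)}
```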
Project Description: VPCx is the central public cloud team of J&J, catering to the needs of 125+ J&J group companies in a self-service model. The project involves the development of the in-house software XBOT, which ensures compliance with J&J IT standards in the public cloud across 600+ AWS accounts.
Environment: DevOps, AWS Cloud, Python programming, Python Flask framework for web development, Unix shell scripting, AngularJS, Ajax, Jenkins, Git, Stash, Jira, AWS Boto SDK, AWS CloudFormation, AWS CLI, Jinja templating, YAML, Python unit testing, Mock, Moto (mock for Boto), Amazon DynamoDB, MySQL, Oracle, MS SQL Server.
• Programming experience with S3, EC2, RDS, Redshift, Athena, CloudFormation, and VPC services.
• Developed a frontend web app using the Python Flask framework, jQuery, Ajax, and AngularJS.
• Deployed applications to public cloud platforms such as AWS, Azure, and Google Cloud Platform.
• Created CloudFormation scripts to set up networking and create IAM groups, roles, and users.
• Implemented disaster recovery and high-availability systems in AWS.
• Architected and designed serverless application CI/CD using the AWS Serverless Application Model (SAM).
• Designed and developed a tool to monitor the server build process across AWS platforms; it is a web app, so end users can log in and check EC2 server build status (a minimal Flask sketch follows this list).
• Reviewed code developed by peer team members and followed the PEP 8 Python standard.
• Created a pipeline in Jenkins to build and test XBOT code (a minimal Moto-based unit-test sketch follows this list).
• Architected and designed serverless monitoring for each account's resources using AWS.
• Worked with automated testing tools such as Selenium and JMeter.
• Architected and designed a source code framework that lets customers focus on development instead of deployment.
• Worked with various J&J customers to provide architectural solutions for their on-premises model.
• Created a web application for customers to view application data, alarms, tickets, and logins for their own accounts.
• Implemented Service-Oriented Architecture (SOA) and REST.
• Worked with the Sqoop tool to import data from Oracle to EMR HDFS.
• Worked with Hive on AWS EMR, creating external tables and Hive partitions to analyze data imported from AWS S3 and write results back to S3.
• Worked with Oracle, PostgreSQL, and MySQL databases on AWS to migrate data from on-premises J&J systems to AWS databases.
• Worked with AWS DMS (Database Migration Service) to migrate data from on-premises to the AWS cloud.
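For the server-build monitoring web app, a minimal Flask-plus-boto3 sketch that exposes EC2 instance build states as a JSON endpoint; the tag filter, route path, and returned fields are assumptions, and the real tool also handled login and per-account views.

```python
# Minimal Flask + boto3 sketch: expose EC2 build status as a JSON endpoint.
# The "Project" tag filter and route path are assumptions for the example.
import boto3
from flask import Flask, jsonify

app = Flask(__name__)
ec2 = boto3.client("ec2")

@app.route("/build-status")
def build_status():
    response = ec2.describe_instances(
        Filters=[{"Name": "tag:Project", "Values": ["xbot"]}]
    )
    instances = []
    for reservation in response["Reservations"]:
        for inst in reservation["Instances"]:
            instances.append({
                "id": inst["InstanceId"],
                "state": inst["State"]["Name"],
                "launched": inst["LaunchTime"].isoformat(),
            })
    return jsonify(instances)

if __name__ == "__main__":
    app.run(debug=True)
```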
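Since the environment lists Moto (mock for Boto) for Python unit testing, here is a minimal sketch of a Moto-backed test that exercises a toy S3 upload helper without touching real AWS; the helper, bucket name, and the mock_aws decorator (the moto 5.x form; older releases used per-service decorators such as mock_s3) are illustrative.

```python
# Hypothetical helper plus a Moto-backed unit test; no real AWS calls are made.
import boto3
from moto import mock_aws  # moto >= 5.0; earlier versions used mock_s3


def upload_report(bucket: str, key: str, body: bytes) -> None:
    """Toy helper under test: writes an object to S3."""
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)


@mock_aws
def test_upload_report():
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="xbot-reports")          # placeholder bucket
    upload_report("xbot-reports", "daily.txt", b"ok")
    obj = s3.get_object(Bucket="xbot-reports", Key="daily.txt")
    assert obj["Body"].read() == b"ok"
```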
Project Description: Worked with various healthcare vendors of Microsoft; as part of this engagement, we integrated Microsoft Teams into the healthcare vendors' applications for virtual visits.
Environment: Git, Jenkins, Docker, DTR, UCP, Kubernetes, Apache Mesos, DataStax Enterprise Cassandra, Vault, Marathon-LB, OpenStack, Platform9, JIRA
• Implemented this feature end to end using .NET Core 3.1.
• Worked on Azure Active Directory and Azure Key Vault for authentication and secure secret storage.
• Worked on authenticating the user who signs in from the client application.
• Worked on Application Insights and Log Analytics to store logs.
• Worked on the Microsoft Graph API to create and delete Microsoft Teams meetings for virtual visit appointments (an illustrative sketch follows this list).
• Created build and release pipeline in Azure DevOps to deploy client apps as app services.
• Created a Logic App for workflows that run on a schedule.
• Wrote Azure Functions for serverless database access, with serverless Azure API Management acting as the API gateway for the API endpoints.
• Used React with TypeScript and Fluent UI on the frontend for the appointment page.
• Used Azure Communication services to send SMS to the patient
• Used Cosmos DB to store data in the database.
• Used Azure Service Bus for queueing purposes.
• Used build and release pipelines in Azure DevOps to deploy Azure Functions, Logic Apps, and web apps.
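The Graph API work above was implemented in .NET Core 3.1; purely to illustrate the shape of the call, here is a Python sketch that acquires an app-only token with MSAL and creates an online meeting for a given organizer. The tenant, client, and user IDs and the meeting fields are placeholders, and in a real tenant the app would need the appropriate OnlineMeetings application permission or access policy.

```python
# Illustrative Python analogue of the Graph API call used to create a Teams
# meeting for a virtual visit; the production code was .NET Core 3.1.
import requests
import msal

TENANT_ID = "<tenant-id>"          # placeholders
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
USER_ID = "<organizer-user-id>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.post(
    f"https://graph.microsoft.com/v1.0/users/{USER_ID}/onlineMeetings",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    json={
        "subject": "Virtual visit",                  # placeholder meeting fields
        "startDateTime": "2024-01-15T14:00:00Z",
        "endDateTime": "2024-01-15T14:30:00Z",
    },
)
resp.raise_for_status()
print("Join URL:", resp.json()["joinWebUrl"])
```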