Harsh | DevReady


Harsh

Vancouver, Canada

Harsh has over 10 years of work experience focused on backend development (using cloud infrastructure such as AWS, Azure, and Salesforce), ETL pipelines (built on the AWS stack), and front-end web application development (using React, deployed on on-premise infrastructure and via the Salesforce CRM community builder). He is highly adept at back-end data engineering, most recently focusing on Snowflake, SQL, Azure, and DBT, and is advanced in cloud-based database work, with particular strength in Azure and Python. In addition to earning his Master's degree in Computer Science, he has explored machine learning, modeling, and deployment, and has completed the DeepLearning.AI Coursera certification.

Skills

Charted against years of experience (1 to 10+) in the original page: AWS, Azure, Flask, Active Directory, ChartIO, SQL, Java, Snowflake, Nginx, Apache, Lambdas, Python, Kubernetes, NLP, NodeJS, DBT, LogicApp, JWT, Ruby, Machine Learning, PowerBI, APIs, Heroku, Datadog, Salesforce, Spring Boot, Looker, ExpressJS, Jenkins, React, Tableau, Redux, SQL DB, MLFlow, Redshift.

Developer Personality

Trait spectra charted in the original page (scale 100-50-0-50-100): Independent vs. Collaborative, Trailblazer vs. Conservative, Generalist vs. Specialist, Planner vs. Doer, Idealist vs. Pragmatist, Abstraction vs. Control.

Feature Experience

Charted on a Moderate / Extensive / Expert scale in the original page: Data Ingestion, Microservices, Data Architecture, Cloud Data Engineering.

Cultural Experience

Charted on a Moderate / Extensive / Expert scale in the original page: E-commerce, Healthcare, Finance, SaaS.

Portfolio

Zocdoc

Senior Data Engineer


Work Experience: 2020-2022

Built an ETL platform to stream application data (produced in the staging environment) into Snowflake, the analytics platform. Internal stakeholders leverage the data to support their user stories in Tableau and Looker.

  • Developed ETL pipelines to stream patient and provider event and monolith data, leveraging AWS (Lambda/SQS) for ingestion, TeamCity for deployment, Ansible for configuration management, Jenkins for continuous integration, and DBT as the transformation engine (see the sketch after this list).
  • Integrated a third-party ETL solution (Stitch, to stream Salesforce and Google Ads data), a monitoring platform (Datadog, to check the runtime health of the ETL pipelines and the underlying hardware), and a visualization tool (Looker, run as a self-hosted service).
  • Built standalone tools for reprocessing missing data, synchronizing multiple warehouses, and automating the secure export of sensitive data into a third-party storage environment.
  • Provided consulting to the marketing, business operations, business intelligence, and data science teams by designing short-term solutions using ad-hoc queries on Snowflake/Redshift and long-term solutions leveraging the Databricks stack (MLflow, Databricks Jobs).
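
As an illustration of the ingestion pattern in the first bullet, here is a minimal Python sketch of an AWS Lambda handler that drains an SQS event source and lands newline-delimited JSON in S3, where a downstream COPY INTO / Snowpipe job can load it into Snowflake. The bucket and prefix names are placeholders, not details from the engagement.

    import json
    import uuid

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical names, not taken from the actual pipeline.
    STAGING_BUCKET = "analytics-staging"
    STAGING_PREFIX = "events/patient_provider/"

    def handler(event, context):
        """Lambda entry point wired to an SQS event source mapping."""
        rows = []
        for record in event.get("Records", []):
            body = json.loads(record["body"])  # one application event per SQS message
            rows.append(json.dumps(body))

        if not rows:
            return {"written": 0}

        # Land the batch as newline-delimited JSON for the downstream Snowflake load.
        key = f"{STAGING_PREFIX}{uuid.uuid4()}.jsonl"
        s3.put_object(
            Bucket=STAGING_BUCKET,
            Key=key,
            Body=("\n".join(rows) + "\n").encode("utf-8"),
        )
        return {"written": len(rows), "key": key}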

Cervello A.T. Kearney Inc.

Consultant Data Engineer


Work Experience: 2018-2020

Designed a scaled web-scraping solution for an eCommerce client using the Luminati proxy manager, pre-processed the scraped data, modeled an NLP solution to extract key phrases with the Microsoft Cognitive Services API, and developed a classification model using an RNN with GloVe and Word2Vec embeddings. Built a Salesforce community for a financial institution using Lightning Web Components to help target prospective users.

  • Developed a Flask app, fronted by Nginx and a WSGI server, to expose the scraper and NLP services; deployed the services on an Azure Kubernetes cluster and scheduled the end-to-end NLP solution through a Logic App workflow and Azure Functions (see the sketch after this list).
  • Ingested data into Power BI to display product metrics generated by the NLP model, which increased client revenue by 30%.
  • Designed a supply-chain inventory optimization solution: orchestrated the pipeline with Azure Data Factory, landed un-partitioned data in Blob Storage, persisted partitioned data in Azure Data Lake, and developed the ETL in Databricks.
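
A minimal sketch of how such a Flask service might expose a key-phrase endpoint. The extractor below is a frequency-count placeholder standing in for the Microsoft Cognitive Services call, and the route name and port are assumptions; in deployment the app would sit behind Nginx and a WSGI server (for example Gunicorn) on the Kubernetes cluster.

    from collections import Counter

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def extract_key_phrases(text, top_n=5):
        """Placeholder for the real key-phrase service: returns the most frequent long tokens."""
        tokens = [t.lower().strip(".,!?") for t in text.split() if len(t) > 3]
        return [word for word, _ in Counter(tokens).most_common(top_n)]

    @app.route("/key-phrases", methods=["POST"])
    def key_phrases():
        payload = request.get_json(force=True)
        phrases = extract_key_phrases(payload.get("text", ""))
        return jsonify({"key_phrases": phrases})

    if __name__ == "__main__":
        # Local debugging only; production traffic is fronted by Nginx and served by a WSGI server.
        app.run(host="0.0.0.0", port=8000)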

Geisinger.edu

Data Engineer

Work Experience: 2017-2018

Designed an application for a healthcare institute: a React/Redux front end to render patients' historical visits, with backend services on ExpressJS exposing APIs written in Spring Boot.

  • Designed the user authentication process by configuring a hybrid mechanism with JWT and Apache, leveraging Active Directory (see the sketch after this list).
  • Developed and optimized an NLP data pipeline using the cTAKES library to predict diseases based on patient records.
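
The production services were written in ExpressJS and Spring Boot; the Python sketch below (using PyJWT) illustrates only the token issue/verify half of such a hybrid flow. The secret, claims, and expiry are illustrative, and the Apache and Active Directory configuration is not shown.

    import datetime

    import jwt  # PyJWT

    SECRET = "replace-with-a-managed-secret"  # illustrative only
    ALGORITHM = "HS256"

    def issue_token(username, ttl_minutes=30):
        """Issue a short-lived JWT after the user has been validated against Active Directory."""
        now = datetime.datetime.now(datetime.timezone.utc)
        claims = {
            "sub": username,
            "iat": now,
            "exp": now + datetime.timedelta(minutes=ttl_minutes),
        }
        return jwt.encode(claims, SECRET, algorithm=ALGORITHM)

    def verify_token(token):
        """Verify signature and expiry; raises jwt.InvalidTokenError on failure."""
        return jwt.decode(token, SECRET, algorithms=[ALGORITHM])

    if __name__ == "__main__":
        token = issue_token("jdoe")
        print(verify_token(token)["sub"])  # prints "jdoe"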

CloudPlus Inc.

Data Engineer


Work Experience: 2016-2017

Built an AWS Lambda service in NodeJS to collect user survey data from the Delighted platform; the service was Dockerized on an EBS container (see the sketch after the bullets below).

  • Designed a Ruby service to stream weekly user activity data from Woopra, Delighted, CustomerIO, and Recurly; automated it with Rake jobs and deployed it on Heroku.
  • Developed an AWS Lambda service in NodeJS to identify daily malicious uploads on the server and used AWS Scanner to remove the identified malware.
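
The collection service itself was written in NodeJS; the Python sketch below only illustrates the pattern: a scheduled Lambda pulling recent survey responses from Delighted's REST API and writing them to S3. The endpoint path, environment variable names, auth scheme, and bucket are assumptions, not details from the role.

    import json
    import os
    import time

    import boto3
    import requests

    s3 = boto3.client("s3")

    # Assumed configuration; endpoint path, variable names, and bucket are illustrative.
    DELIGHTED_URL = os.environ.get("DELIGHTED_URL", "https://api.delighted.com/v1/survey_responses.json")
    DELIGHTED_API_KEY = os.environ["DELIGHTED_API_KEY"]
    SURVEY_BUCKET = os.environ.get("SURVEY_BUCKET", "survey-data")

    def handler(event, context):
        """Scheduled Lambda (e.g. a CloudWatch Events rule): fetch the last day of survey responses."""
        since = int(time.time()) - 24 * 60 * 60
        resp = requests.get(
            DELIGHTED_URL,
            params={"since": since, "per_page": 100},
            auth=(DELIGHTED_API_KEY, ""),  # assumed: API key as the basic-auth username
            timeout=30,
        )
        resp.raise_for_status()
        responses = resp.json()

        key = f"delighted/{since}.json"
        s3.put_object(Bucket=SURVEY_BUCKET, Key=key, Body=json.dumps(responses).encode("utf-8"))
        return {"count": len(responses), "key": key}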

Previous Roles


Work Experience: 2011-2015
  • Data Engineer Intern at comScore, Inc. (2015)
  • Backend Engineer at Headstrong (2011-2014)
