Harsh | DevReady


Harsh

Vancouver, Canada

Harsh has over 10 years of work experience focused on back-end development (on cloud infrastructure such as AWS, Azure, and Salesforce), ETL pipelines (built on the AWS stack), and front-end web application development (using React, deployed on on-premise infrastructure and the Salesforce CRM community builder). He is highly adept at back-end data engineering, most recently focusing on Snowflake, SQL, Azure, and DBT. Harsh is highly advanced at cloud-based database work and is strong with Azure and Python. In addition to earning his Master's degree in Computer Science, he has explored machine learning modeling and deployment, and completed the DeepLearning.AI Coursera certification.

 

Hire Harsh
Skills
AWS
Azure
Luminati
Active Directory
Redash
SQL
Java
NCUT
Snowflake
RNN
Apache
ChartIO
Greenplum
Sobel
Python
GloVe
cTAKES
QuickSight
DBT
Word2Vec
NLP
Lambdas
Machine Learning
Flask
JWT
NodeJS
Datadog
Nginx
APIs
Ruby
Looker
WSGI
Spring Boot
Heroku
Jenkins
Kubernetes
ExpressJS
Tableau
LogicApp
React
SQL DB
PowerBI
Redux
MLFlow
Salesforce
Redshift
Developer Personality

Independent vs. Collaborative

Trailblazer vs. Conservative

Generalist vs. Specialist

Planner vs. Doer

Idealist vs. Pragmatist

Abstraction vs. Control
Feature Experience (Moderate / Extensive / Expert scale)

Data Ingestion

Microservices

Data Architecture

Cloud Data Engineering
Cultural Experience (Moderate / Extensive / Expert scale)

E-commerce

Healthcare

Finance

SaaS
Portfolio

Zocdoc

Senior Data Engineer

Work Experience : 2020-2022

Built an ETL platform to stream application data (produced in the staging environment) into Snowflake, an analytics platform where internal stakeholders leverage the data to build user stories in Tableau or Looker.

● Developed ETL pipelines to stream patient and provider event/monolith data, leveraging AWS (Lambda/SQS), TeamCity for deployment, Ansible for configuration management, Jenkins for continuous integration, and DBT as the transformation engine
● Integrated a third-party ETL solution (Stitch, to stream Salesforce and Google Ads data), a monitoring platform (Datadog, to check the runtime health of the ETL pipelines and the underlying hardware), and a visualization tool (Looker, as a self-hosted service)
● Built standalone tools for reprocessing missing data, synchronizing multiple warehouses, and automating the secure dumping of sensitive data into a third-party storage environment
● Provided consulting to the marketing, business operations, business intelligence, and data science teams by designing short-term solutions using ad-hoc queries on Snowflake/Redshift and long-term solutions leveraging the Databricks stack, such as MLflow and Databricks Jobs
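As an illustration of the kind of event streaming described above, here is a minimal sketch of an AWS Lambda handler consuming an SQS batch; the event fields and row shape are hypothetical, not the actual Zocdoc schema, and a real pipeline would load the rows into Snowflake rather than return them.

```python
import json

def transform_event(record: dict) -> dict:
    """Flatten a raw SQS message body into a row shape suitable for a
    staging table (column names here are purely illustrative)."""
    body = json.loads(record["body"])
    return {
        "event_id": body["id"],
        "event_type": body.get("type", "unknown"),
        "payload": json.dumps(body.get("data", {})),
    }

def handler(event, context):
    """AWS Lambda entry point: transform each SQS record in the batch."""
    rows = [transform_event(r) for r in event.get("Records", [])]
    # In a real pipeline these rows would be staged to S3 and loaded into
    # Snowflake (e.g. via COPY INTO); here we just return them.
    return {"rows": rows, "count": len(rows)}
```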
Cervello A.T. Kearney Inc.

Consultant Data Engineer

Work Experience : 2018-2020

● Designed a scaled scraping solution for an e-commerce client using the Luminati proxy manager, pre-processed the scraped data, modeled an NLP solution to extract key phrases with the Microsoft Cognitive API, and developed a classification model using RNNs, GloVe, and Word2Vec.
● Developed a Flask app behind Nginx and WSGI as a load balancer to expose the scraper and NLP services, deployed the services on an Azure Kubernetes cluster, and scheduled the entire NLP solution through a Logic App workflow/Azure Function.
● Ingested data into Power BI to display product metrics generated by the NLP model, which increased client revenue by 30%.
● Designed a supply chain inventory optimization solution: orchestrated the pipeline using Azure Data Factory, staged un-partitioned data in Blob Storage, persisted partitioned data in Azure Data Lake, and developed the ETL through Databricks.
● Built a Salesforce community for a financial institution using Lightning Web Components to help target prospective users.
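The GloVe/Word2Vec classification work above rests on comparing word vectors. A minimal, dependency-free sketch of cosine similarity over a toy embedding table (the words and vectors are invented for illustration; real embeddings come from a trained model):

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Tiny hypothetical embedding table standing in for GloVe/Word2Vec vectors.
embeddings = {
    "shirt": [0.9, 0.1, 0.0],
    "tshirt": [0.85, 0.2, 0.05],
    "laptop": [0.0, 0.1, 0.95],
}

def nearest(word, table):
    """Return the most similar other word under cosine similarity."""
    return max((w for w in table if w != word),
               key=lambda w: cosine(table[word], table[w]))
```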

Geisinger.edu

Data Engineer

Work Experience : 2017-2018

● Designed an application for a healthcare institute in React/Redux to render patients' historical visits, with back-end services in ExpressJS exposing the APIs written in Spring Boot.
● Designed the user authentication process by configuring a hybrid mechanism with JWT and Apache, leveraging Active Directory.
● Developed and optimized an NLP data pipeline using the cTAKES library to predict diseases based on patient records.
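The JWT leg of the hybrid authentication above can be sketched with the standard library alone. This minimal HS256 sign/verify pair is illustrative only; a production system would also check expiry, issuer, and audience claims, typically via an established JWT library.

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: str) -> str:
    """Create a minimal HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_token(token: str, secret: str):
    """Return the payload if the signature checks out, else None."""
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return None
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(
        hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(expected, sig):
        return None
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```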

CloudPlus Inc.

Data Engineer

Work Experience : 2016-2017

● Built an AWS Lambda service in NodeJS to collect user survey data from the Delighted platform; the service was dockerized in an EBS container.
● Designed a Ruby service to stream weekly user activity data from Woopra, Delighted, CustomerIO, and Recurly; automated the service with rake jobs and deployed it on Heroku.
● Developed an AWS Lambda service in NodeJS to identify daily malicious uploads on the server and used AWS Scanner to remove the identified malware.
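The malicious-upload scan above can be sketched as a content-digest check against a blocklist. The function names and data are illustrative (the original service was NodeJS, and a real scanner inspects content far more deeply than a hash lookup):

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 hex digest of file content."""
    return hashlib.sha256(data).hexdigest()

def flag_malicious(uploads: dict, blocklist: set) -> list:
    """Return the names of uploads whose content digest is blocklisted.

    `uploads` maps file name -> raw bytes; `blocklist` is a set of
    known-bad SHA-256 hex digests.
    """
    return [name for name, data in uploads.items()
            if digest(data) in blocklist]
```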

ComScore

Data Engineer Intern

Work Experience : 2008-2009

● Automated manual extracts of monthly reports on Greenplum, improving performance by 90%
● Built a tool to auto-generate customized reports with digital marketing metrics to analyze the performance of client websites

Publications

Automated Segmentation in Ultrasound Images

Publication : 2008-2009

Preprocessed the images to remove noise using the anisotropic diffusion technique, then sharpened image edges using unsharp filtering. Segmented the processed images using the NCut and Sobel edge detection techniques. Generated probability maps using the flood-fill technique and then scribbled the images using an automatic scribbling technique.
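The Sobel step mentioned above computes a gradient magnitude at each pixel. A minimal plain-Python sketch on a nested-list grayscale image (a real pipeline would use NumPy or OpenCV, and would also handle the image borders):

```python
def sobel_magnitude(img):
    """Return the Sobel gradient magnitude of a 2D grayscale image
    given as nested lists; border pixels are left at 0."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```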

