Ravi | DevReady

Ravi

Reading, England

Ravi is a BI & Data Analytics Consultant with 14+ years of experience in architecture, data analysis, business analysis, and the design and delivery of successful Business Intelligence and Analytics, cloud data warehouse, and data visualization projects, working across diverse data sources and platforms ranging from enterprise data warehouses to big data and cloud. He has delivered large Business Intelligence solutions, data warehouses, data visualizations, data pipelines, and software applications in the Media, Entertainment & Gaming, Retail, Banking, Insurance, and Pharma sectors, and provides hands-on business and data architecture, data analysis, data profiling, data migration, data integration, and metadata management services. He is passionate about working with data and skilled at turning it into actionable insights.

Ravi is an AWS Certified Solutions Architect and a Certified Informatica Developer, and holds the Hands-On Snowflake – Web UI Essentials, Microsoft Certified: Azure Fundamentals, and SnowPro Core certifications.

Skills (chart: years of experience, 1 to 10+)

Azure, Alteryx, SAS ETL, Dashboards, SharePoint Server, Power BI, Redshift, Snowflake, SQL Server, WhereScape, JSON, Tableau, Oracle, Google BigQuery, Informatica PowerCenter, Python, AWS, JIRA, Confluence
Developer Personality (profile chart)

Independent vs. Collaborative; Trailblazer vs. Conservative; Generalist vs. Specialist; Planner vs. Doer; Idealist vs. Pragmatist; Abstraction vs. Control
Feature Experience (moderate / extensive / expert scale): Databases, Snowflake, Cloud Platforms, Power BI
Cultural Experience (moderate / extensive / expert scale): Agile - Scrum, Gaming, Consulting, Media
Portfolio

Allianz Insurance (EU)

BI & Data Engineer / Architect

Work Experience: 2021

Program: SAS DECOM and Data Migration

• Worked on the E2E decommissioning of the SAS application and the data migration from SQL Server and Exadata onto Snowflake.
• Worked with the product owner to model Snowflake as the data lake / data nexus.
• As part of data onboarding, worked with cross-functional teams to set up and integrate Snowflake with cloud services such as Azure.
• Created RBAC on databases and warehouses, and created resource monitors for warehouse and storage management.
• Implemented dynamic masking policies on sensitive data.
• Created automated processes to extract data via APIs, external storage, and on-premises databases.
• As part of data onboarding, designed and developed Snowflake data pipelines using Azure services such as Blob Storage and Azure Data Factory.
• Implemented solutions using the Snowflake features Snowpipe and Tasks, and integrated Snowpipe with notification services to automate the ELT process (see the sketch after this entry).
• Performed data analysis across the different sources to understand the data and its grain in each table, in support of the data model and data-exchange views.
• Worked with source system owners to define the source data extraction strategy and to understand the source data and enrichment rules.
• Built the enrichment views and integrated secure views with different BI tools.
• Implemented historization and change capture of the data using Streams.
• Used Snowflake stored procedures to automate the data calculation process.
• Created the technical documentation and slides for the solutions.
• Presented the solutions to the architecture and approval forums to obtain sign-off for development and deployment to production.
• Worked closely with the program manager to help achieve the program goals.
• Followed an Agile process and used JIRA to track all tasks.

Technologies: Snowflake, Azure, SQL Server, Data Vault 2.0, WhereScape, Power BI, JIRA, Confluence.
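As context for the masking and Snowpipe bullets above, here is a minimal, hypothetical sketch (assumed object names, credentials, and file format, not the project's actual DDL) of how a dynamic masking policy and an auto-ingest pipe can be created through the Snowflake Python connector:

```python
# Hypothetical sketch only: object names, credentials, and file format are assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",          # assumed account / credentials
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

ddl_statements = [
    # Dynamic data masking: only a privileged role sees the raw value.
    """
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Snowpipe with auto-ingest; for Azure Blob the pipe also references a
    # notification integration (name assumed here) wired to Event Grid.
    """
    CREATE OR REPLACE PIPE customer_pipe
      AUTO_INGEST = TRUE
      INTEGRATION = 'AZURE_NOTIF_INT'
      AS COPY INTO customers FROM @azure_blob_stage/customers/
         FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """,
]

cur = conn.cursor()
for stmt in ddl_statements:
    cur.execute(stmt)
cur.close()
conn.close()
```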


Billigence

BI & Data Engineer / Architect


Work Experience: 2020-2021

Project: Reporting Automation (Financial)
• Worked on the automation of financial reports.
• Gathered requirements and knowledge transfer (KT) from the finance users on reports previously produced manually, with manual calculations, pivots, and aggregations.
• Automated the manual process by developing Alteryx workflows that generate Excel outputs with all the manual calculations and styling applied (an illustrative sketch follows this entry).
• Extracted the data from the financial applications using Alteryx.
• Used Alteryx transformations such as Multi-Row Formula, Join Multiple, Formula, Union, and Text to Columns.
• Extracted data from files, transformed it using Alteryx functions, and loaded the data back into formatted files.
• Tested the Alteryx workflows by editing the data in the source files.
Technologies: Alteryx.
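The Alteryx workflows themselves are visual and are not reproduced here; purely as an illustrative analogue (hypothetical file and column names, not the client's actual logic), the kind of manual pivot/aggregation step that was automated looks roughly like this in pandas:

```python
# Illustrative analogue only: file names and columns below are assumptions.
import pandas as pd

# Extract: read the raw financial export.
raw = pd.read_excel("financial_export.xlsx")

# Transform: the kind of calculation and pivot previously done by hand.
raw["net_amount"] = raw["gross_amount"] - raw["tax"]
summary = raw.pivot_table(
    index="cost_centre",
    columns="month",
    values="net_amount",
    aggfunc="sum",
)

# Load: write the formatted summary back to Excel for the finance users.
summary.to_excel("financial_summary.xlsx")
```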


News UK

BI & Data Engineer / Architect

Work Experience: 2019-2020

Project: UCP (Unified Customer Profile)
• Worked with the 3rd-party marketing/campaign team.
• Worked with data users and key stakeholders to gather requirements for the long-term objectives.
• Understood the requirements and analysed data quality, data availability, and data integrity to fulfil the user requirements.
• Worked extensively on campaign analytics and designed and developed the data model for campaign analytics.
• Presented various data points on the procedure for collecting data from 3rd-party systems to achieve data correctness in the presentation layer.
• Led the global development team to build/migrate the unified customer DW, which was spread across different products.
• Designed the extraction of audit data from Git into JSON files and processed the data into the DW (see the sketch after this entry).
• Worked closely with the development team building data pipelines in Redshift and Google BigQuery to extract and load the data into various dimensions and facts.
• Focused heavily on building analytics on clickstream data and the events captured from the streams, across different products.
• Created Tableau dashboards using stacked bars, bar graphs, bullet charts, and Gantt charts demonstrating key information for decision making.
• Worked extensively with Calculations, Actions, and Parameters; created trend lines, groups, hierarchies, and sets to build detail-level summary reports.
• Created dashboards in Tableau Desktop and published them onto Tableau Server.
• Created drill-down analysis for dashboards.
• Created conditional filters to filter the data on dashboards and Parameters for the desired functionality.
• Involved in resolving reporting and design issues throughout the reporting life cycle.
• Published completed dashboards to the server and scheduled extract refreshes.
• Developed SQL queries to fulfil data and reporting requirements, including identifying the relevant tables and columns.
• Monitored the project processes, making periodic changes and guaranteeing on-time delivery.
• Documented the whole process of working with Tableau Desktop and evaluating business requirements.

Technologies: SAS ETL, Python, AWS S3 (file storage & transfer), JSON, Redshift, Google BigQuery, Tableau, JIRA, Confluence.
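As a rough sketch of the Git-audit extraction described above (the repository path and output file are assumptions, not the project's actual extractor), one way to dump commit metadata to JSON for warehouse loading:

```python
# Sketch with assumed paths: export git commit audit data to a JSON file
# suitable for loading into the DW.
import json
import subprocess

def export_git_audit(repo_path: str, out_file: str) -> None:
    # Unit-separator character is unlikely to appear in commit metadata.
    sep = "\x1f"
    fmt = sep.join(["%H", "%an", "%aI", "%s"])  # hash, author, ISO date, subject
    raw = subprocess.run(
        ["git", "-C", repo_path, "log", f"--pretty=format:{fmt}"],
        capture_output=True, text=True, check=True,
    ).stdout
    records = []
    for line in raw.splitlines():
        commit, author, date, subject = line.split(sep, 3)
        records.append(
            {"commit": commit, "author": author, "date": date, "subject": subject}
        )
    with open(out_file, "w") as f:
        json.dump(records, f, indent=2)

export_git_audit(".", "git_audit.json")
```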


Electronic Arts (EA)

Snowflake DW & BI Architect

Work Experience: 2011-2019

Worked as Snowflake Cloud DW Architect, Data Architect, Lead BI / Data Engineer (Snowflake), Senior Business Intelligence Engineer, and Delivery Manager.

Projects: Payroll Data Consolidation, Mobile Sales Tax Reporting BI Solution, Project Veracruz, Global Sales Warehouse Implementation (Snowflake Cloud Data Warehouse implementation).

Responsibilities:
• Estimated, designed, developed, and delivered the on-premises data warehouse and reporting solution onto the cloud (AWS & Snowflake Cloud Data Warehouse).
• Integrated Snowflake with the AWS VPC for private link access.
• Implemented Snowflake SSO integration for secure access.
• Determined the optimal approach for obtaining data from diverse source system platforms and moving it to the data analytics environment.
• Assisted and coached ETL specialists and data engineers during solution implementation.
• Worked closely with the Data Warehouse development team and business representatives to analyse needs and to develop and deliver the architectural requirements, ensuring scalability, long-term usability, and meaningful insights from the data.
• Estimated system capacity to meet near- and long-term processing and business requirements.
• Developed the data movement system architecture, including data quality rule engines, metadata management, and data migration solutions using the Snowflake Cloud Data Warehouse.
• Defined and implemented processes to reinforce quality, transparency, and automation in ELT processes while adopting the new technology.
• Conducted impact assessments and determined the size of effort based on requirements.
• Designed the E2E data architecture and data movement process by utilizing Snowflake capabilities such as transient and temporary tables, clone and swap, Time Travel, and dynamic column creation (see the sketch after this entry).
• Designed, developed, and implemented the solution for building the Global Sales warehouse on the Snowflake cloud data warehouse.
• Ensured effective access security by creating a robust method using object-level roles.
• Ensured data security at the BI level by taking regional and transactional accounts and business units into consideration.
• Created alerts to identify long-running queries and compute and storage usage, and integrated the alerts with a Slack channel.
• Worked with the Snowflake product support team and the in-house network team to implement SSO for Snowflake access.
• Ensured quality through code reviews and knowledge sharing.
• Developed storytelling dashboards in Tableau Desktop and published them onto Tableau Server, allowing end users to understand the data on the fly using quick filters for on-demand information.
• Scheduled data refreshes on Tableau Server in daily, weekly, and monthly increments based on business change, ensuring that views and dashboards displayed the changed data accurately.
• Tested dashboards to ensure the data matched the business requirements and to catch any changes in the underlying data.
• Participated in meetings, reviews, and user group discussions, and communicated with stakeholders and business groups.
Environment: Informatica PowerCenter, Snowflake Cloud Data Warehouse, S3, Oracle, Python, Tableau, JIRA, Confluence, Agile / Scrum.
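A minimal sketch (assumed table, schema, and connection names) of the clone-and-swap pattern referenced above: a new load is built against a zero-copy clone and swapped in atomically, with Time Travel available to query the pre-swap state.

```python
# Assumed object names and credentials; illustrates clone-and-swap plus Time Travel.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="SALES_DW", schema="MART",
)
cur = conn.cursor()

# 1. Zero-copy clone of the live fact table as a staging copy.
cur.execute("CREATE OR REPLACE TABLE fact_sales_stage CLONE fact_sales")

# 2. Load the increment into the staging copy while the live table keeps serving queries.
cur.execute("INSERT INTO fact_sales_stage SELECT * FROM raw_sales_increment")

# 3. Atomically swap staging and live once validated.
cur.execute("ALTER TABLE fact_sales SWAP WITH fact_sales_stage")

# Time Travel: the pre-swap state stays queryable within the retention window.
cur.execute("SELECT COUNT(*) FROM fact_sales AT(OFFSET => -600)")  # 10 minutes ago
print(cur.fetchone()[0])

cur.close()
conn.close()
```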


Parexel, Hyderabad, India

BI Delivery Lead


Work Experience: 2011

Organization: GSS Info Tech, India
Client: Parexel, India
BI Delivery Lead
Project: CMDM Framework

Responsibilities:
• Worked on setting up the delivery team offshore.
• Worked on setting up the process for delivering tasks from offshore.
• Worked on requirement gathering and requirement analysis.
• Interacted with the onsite business and technical teams in various forums to discuss the delivery model and the status of tasks.
• Provided inputs and shared knowledge with other team members.
• Created mappings employing various transformations (filters, joiners, lookups, SQL overrides, etc.) to load data from multiple databases into warehouses.
Environment: Informatica PowerCenter 9, Informatica B2B, Oracle, Agile / Scrum.

