Rama | DevReady


Rama

Waterloo, Ontario, Canada

Rama is a Senior Cloud Architect with over 20 years of professional experience in the Information Technology field. He has expertise in designing, analyzing, and integrating different systems, and in architecting enterprise data warehouse solutions. He has worked across many technologies and platforms as a Developer, SME, Data Warehouse Architect, Big Data Architect, and Cloud Architect in financial services, investment banking, the public sector, telecommunications, retail, and pharma. He has a solid foundation in Hadoop, YARN, Hive, Pig, Sqoop, and Flume, and in migrating and implementing big data cloud solutions on AWS and Google Cloud. Rama has always been intrigued by new technologies and is driven to integrate them with one another, which he sees as the key to building modern data analytics platforms.

 

Skills

Rated by years of experience, on a scale of 1 to 10+:
Airflow
Azure
VPC
Big Data
AWS
QlikView
SQL Developer
SQL Server
Data Warehouse
TOAD
SharePoint
EMR
Cloud Technologies
Redshift
Python
OBIEE
SQL
Aurora
Jenkins
Data Lakes
AWS S3
Netezza
Unit Testing
MS Office Suite
RDBMS
DynamoDB
GitHub
Route 53
RDS
Hadoop
Unix Shell
Informatica Designer
Yarn
Databricks
CloudWatch
EC2
Sqoop
Control-M
Informatica PowerCenter
T-SQL
CloudFormation
Lambda
Hive
Tableau
ETL
Virtual Machines
Power BI
Athena
Enterprise Edition
Visual Source Safe
Oracle
Flume
Snowflake
QuickSight
Windows XP
DB2
Storage Gateway
Sybase
AWS Transfer Family
Flat Files
Glue
PL/SQL
Iway ETL
JavaScript
Machine Learning
AI
Talend
Spark
Kafka
HBase
Developer Personality

Rated on sliding scales between paired traits: Independent / Collaborative, Trailblazer / Conservative, Generalist / Specialist, Planner / Doer, Idealist / Pragmatist, Abstraction / Control.
Feature Experience

Rated from Moderate to Extensive to Expert: Data Ingestion / Integration, Cloud Migration, Data Lakes / Architecture, Data Analytics / Visualization.
Cultural Experience

Rated from Moderate to Extensive to Expert: Agile - Scrum, Financial/Banking, Telecommunications, Government.
Portfolio

Tek Systems

Enterprise Cloud Architect

Work Experience : 2021-2022

Designed and led the architecture to migrate a legacy Oracle data warehouse to the Snowflake Cloud Data Warehouse hosted on the AWS platform for Caterpillar Insurance.

  • Led the design and development of cloud infrastructure and cloud services
  • Participated in proofs of concept/technology as needed to ensure optimized architecture and service selections
  • Worked closely with the on-premise Enterprise Architecture team to ensure alignment during migrations
  • Advised application architects and developers on the design of scalable, highly available, secure application solutions that leverage cloud services
  • Continuously managed, monitored, and updated architecture models as business needs evolved and additional cloud services became available
  • Provided cloud technical leadership and brought innovative cloud technologies to the enterprise
  • Utilized cost/benefit models to map architecture choices to business outcomes and associated KPIs
  • Supported the transfer of knowledge to the broader IT organization
  • Translated organizational level business requirements and use cases into cloud capabilities
  • Designed solution-independent architecture models that mapped business services to cloud application stacks
  • Maintained technical-level relationships with third-party product and platform partners
  • Participated in optimizing solution architectures to manage costs and identify cost control mechanisms
  • Developed and managed enterprise cloud governance KPIs and maintained the reference architecture model

NucleusTeq

Enterprise Cloud Architect

Work Experience : 2019-2021

Designed and led the architecture to migrate a legacy Netezza data warehouse to the Snowflake Cloud Data Warehouse hosted on the Azure platform for the PetSmart client.

  • Led a team of onshore and offshore architects heading modern data analytics platforms, and streamlined the enterprise big data platform in the Azure cloud.
  • Led migration of Virtual Machines to Azure Virtual Machines for multiple global business units.
  • Prepared a capacity and architecture plan to create the Azure cloud environment to host migrated IaaS VMs and PaaS role instances for refactored applications and databases.
  • Created recommendations on how to duplicate a subset of on-premises machines to the Azure Infrastructure as a Service (IaaS) offering for use in disaster recovery. This analysis included the specifics of synchronizing on-premises data with SQL Server and SharePoint instances hosted in VMs.
  • Performed client acceptance and prototyping using Azure Compute and SQL Azure instances.
  • Worked in the Data Factory editor to create linked services, tables, datasets, and pipelines by specifying JSON definitions for these artifacts.
  • Built ETL integration pipelines using Informatica to consume data from Azure Data Lake and load it into Snowflake (a PySpark equivalent is sketched after this entry).
  • Worked extensively with CI/CD (DevOps) pipelines and concepts, including Azure Resource Manager; extensively used Jenkins, GitHub, and Azure-native agile tools such as Azure Boards, Pipelines, Repos, and Artifacts.
  • Extensively used T-SQL/ANSI SQL and Azure Synapse.
  • Built a solution using the Azure managed services StorSimple and Blob Storage to archive on-premises data to the cloud.
  • Configured VMs in availability sets using the Azure portal to provide resiliency for IaaS-based solutions, and scale sets using Azure Resource Manager to manage network traffic.
  • Migrated the Netezza-based enterprise data warehouse to the Snowflake data warehouse.
  • Performed a PoC to spin up EC2 instances using Terraform.

Environment: Informatica PowerCenter 10.x, SQL Data Warehouse, Power BI, ADLS, Azure HDInsight, Databricks, Python
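
The environment above lists Databricks and Python alongside Informatica. Purely as an illustration of the ADLS-to-Snowflake load described in the bullets (the production pipeline itself was built in Informatica), a minimal PySpark sketch might look like the following; the storage account, credentials, schema, and table names are hypothetical.

    # Minimal PySpark sketch of an ADLS Gen2 -> Snowflake load on Databricks,
    # using the Spark-Snowflake connector. All names and credentials below are
    # hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("adls_to_snowflake").getOrCreate()

    # Read curated Parquet data from Azure Data Lake Storage Gen2.
    sales_df = spark.read.parquet(
        "abfss://curated@examplestorageacct.dfs.core.windows.net/sales/"
    )

    # Light transformation: standardize a column name and add a load timestamp.
    sales_df = (
        sales_df.withColumnRenamed("cust_id", "customer_id")
                .withColumn("load_ts", F.current_timestamp())
    )

    # Connection options for the Spark-Snowflake connector.
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "etl_user",
        "sfPassword": "********",  # use a secret scope in practice
        "sfDatabase": "EDW",
        "sfSchema": "SALES",
        "sfWarehouse": "LOAD_WH",
    }

    # Append the transformed data into the target Snowflake table.
    (sales_df.write
             .format("snowflake")
             .options(**sf_options)
             .option("dbtable", "FACT_SALES")
             .mode("append")
             .save())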

Wipro Limited (Client: BNSF, Citi Bank, AT&T)

Cloud Migration Architect

Work Experience : 2018-2019

Designed AWS architectures to build S3-based data lakes and serverless architectures using Lambda, Glue, Athena, and QuickSight for data analytics and visualizations.

  • Architected and designed Ab Initio-to-Spark migration frameworks and worked on new initiatives to migrate on-premises data warehouse solutions to AWS Redshift/Snowflake.
  • Designed and set up a data lake in AWS S3 using the Lake Formation service and loaded data using Glue jobs/PySpark scripts (Parquet file format, with column-level access implemented); see the sketch after this entry.
  • Designed end-to-end security of the AWS ecosystem by introducing key-based encryption, security groups, column-level access, well-tuned roles, and bucket policies.
  • Built serverless architectures using Python Lambda functions to transform and enrich the data.
  • Applied Redshift features such as workload management, compression, and query optimization; loaded data into Redshift using PySpark scripts/Glue jobs.
  • Designed and managed the cloud infrastructure using AWS services including EC2, S3, RDS, VPC, Route 53, CloudWatch, CloudTrail, CloudFormation, and IAM, which enabled automated operations.
  • Deployed and automated EC2 instances using a CloudFormation template that integrated application software installation and configuration via user data.
  • Provided strategic guidance regarding business, product, and technical challenges in an enterprise environment.
  • Migrated load processes from an appliance DWH system (Netezza) to AWS Redshift.

Environment: Ab Initio, AWS S3, Snowflake, Python, Redshift, Power BI, ADLS, EMR, Glue, Athena, QuickSight
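
As a rough illustration of the Glue/PySpark loads into the S3 data lake described above, a minimal Glue job sketch might look like the following; the catalog database, table, columns, and bucket path are hypothetical, and the actual jobs are not reproduced here.

    # Minimal AWS Glue PySpark job sketch: read a raw-zone table from the Glue
    # Data Catalog, apply a light mapping, and write Parquet to a curated S3
    # data lake path. Database, table, column, and bucket names are hypothetical.
    import sys
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    sc = SparkContext()
    glue_context = GlueContext(sc)
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Source: a raw-zone table registered in the Glue Data Catalog.
    raw = glue_context.create_dynamic_frame.from_catalog(
        database="raw_zone", table_name="orders"
    )

    # Keep, rename, and retype only the columns the curated zone needs.
    curated = ApplyMapping.apply(
        frame=raw,
        mappings=[
            ("order_id", "string", "order_id", "string"),
            ("order_ts", "string", "order_ts", "timestamp"),
            ("amount", "double", "amount", "double"),
        ],
    )

    # Sink: Parquet files in the curated zone of the S3 data lake.
    glue_context.write_dynamic_frame.from_options(
        frame=curated,
        connection_type="s3",
        connection_options={"path": "s3://example-datalake/curated/orders/"},
        format="parquet",
    )

    job.commit()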

Scotia Bank

Big Data Solution Architect

Work Experience : 2017-2018

Worked on the Net Promoter System, which is used to collect B2B information; based on the information provided, an external system engages customers in a survey for feedback on their latest transactions, whether at the branch level, wired, or wireless.

Responsibilities

  • Designed Talend processes to extract, transform, and load the source data into the target data warehouse and data lake.
  • Developed infrastructure on AWS using services such as EC2, S3, and RDS.
  • Performed data migration to the AWS cloud.
  • Developed Informatica mappings to consume data from AWS S3 and transform it for loading into AWS Redshift.
  • Worked on a framework in which multiple sources ingest their data into a landing zone within an enterprise data lake, where it is staged and consumed as per business needs (see the sketch after this entry).
  • Configured and tuned production and development Hadoop environments with the various intermixing Hadoop components.
  • Involved in designing a system that sources data from different systems, transforms it, and integrates data in and out of the Hortonworks big data ecosystem.
  • Developed high-performance data processing pipelines on the big data platform.
  • Coordinated with the project manager to estimate cost and resources for completion of ETL/BI projects.
  • Responsible for implementing Cloudera Hadoop deployments, including deploying, configuring, and managing Hive, Spark, and Impala on the AWS cloud.
  • Responsible for designing and implementing the data lake.
  • Created architecture and detailed design diagrams and documentation in cooperation with BI and ETL architects.
  • Communicated with technical and non-technical audiences and with stakeholders at different levels.

Environment: Informatica PowerCenter, iWay ETL, AWS EC2, S3, RDS, Talend, Hortonworks, Hadoop, Hive, Sqoop, Flume, Spark, SQL Developer, Kafka, Control-M, Unix, HBase.
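
As a simplified sketch of the landing-zone pattern referenced above, a PySpark job that picks up raw files from a landing area and publishes them as a Hive staging table might look like this; the paths, column names, and table names are hypothetical.

    # Minimal PySpark sketch of the landing-zone pattern: pick up raw delimited
    # files dropped by a source system and publish them as a Hive table in the
    # staging layer of the data lake. Paths and names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("landing_zone_ingest")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read the latest raw files from the landing zone.
    raw_df = (
        spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("/data/landing/nps_feedback/")
    )

    # Basic standardization before exposing the data for consumption.
    clean_df = raw_df.dropDuplicates().na.drop(subset=["survey_id"])

    # Publish to the staging layer as a partitioned Hive table.
    spark.sql("CREATE DATABASE IF NOT EXISTS staging")
    (clean_df.write
             .mode("overwrite")
             .partitionBy("survey_date")
             .saveAsTable("staging.nps_feedback"))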

Rogers

Senior Solution Designer

Work Experience : 2016-2017

The Solaris project at Rogers was intended to migrate different streams from legacy systems to the Hadoop ecosystem.

Responsibilities

  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Worked with Source Analyzer, Data Warehouse Designer, Repository Manager, Workflow Monitor, Mapping Designer, Mapplet Designer, and Transformation Developer in Informatica Designer.
  • Developed ETL mappings to perform tasks such as validating file formats, business rules, database rules, and statistical operations.
  • Used Informatica Designer to create reusable transformations for use in Informatica mappings and mapplets.
  • Transformed data from source to target tables using Informatica.
  • Migrated data from legacy Oracle to AWS Redshift.

Environment: AWS, Informatica PowerCenter 9.6.1, Informatica BDM 10.1.0, Oracle, Informatica IDQ 9.x, Control-M, Unix, Windows 7 Enterprise Edition, Tableau.

Ministry of Transportation

ETL Systems / Data Architect Consultant

Work Experience : 2016

RUSMOD (Road User Safety Modernization) is a project to extract, load, and transform data from different sources into the Enterprise Data Warehouse (EDW) to perform financial analysis on different modules, such as beginners' licenses for individuals and certified training institutes.

Responsibilities

  • Gathered requirements from business analysts and business users and prepared project plans to deliver data on time.
  • Created complex mappings and configured workflows, worklets, and sessions to transport data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Used mapplets and reusable transformations to prevent redundant transformation usage and to aid maintainability.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Configured pushdown optimization to improve performance.
  • Created SQL scripts to compare data before and after the Informatica upgrade.
  • Created various UNIX shell scripts, along with awk commands, to automate file archival and FTP and to analyze data issues at the file level (a Python equivalent is sketched after this entry).
  • Worked on a project implementing data quality checks to verify business rules utilizing PL/SQL and shell scripts.
  • Extensively used the Control-M scheduler for scheduling the UNIX shell script jobs and the Informatica weekly jobs.
  • Maintained documentation for the corporate data dictionary with attributes, table names, and constraints.

Environment: Informatica PowerCenter 9.x, Oracle 10g/9i, SAS Data Integration, Oracle EDQ 11g R1, OBIEE 11g, MS SQL Server, SQL Developer, Unix Shell, Windows XP, MS Office Suite, Control-M Scheduler, Tableau
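
The file archival and FTP automation above was implemented in UNIX shell with awk. Purely as an illustration, an equivalent sketch in Python could look like the following; the directories, host, and credentials are hypothetical.

    # Illustrative Python equivalent of the shell-based file archival / FTP
    # automation (the original was implemented in UNIX shell with awk).
    # Paths, host, and credentials are hypothetical placeholders.
    import gzip
    import shutil
    from ftplib import FTP
    from pathlib import Path

    LANDING_DIR = Path("/data/rusmod/incoming")
    ARCHIVE_DIR = Path("/data/rusmod/archive")
    FTP_HOST = "ftp.example-edw.local"

    def archive_and_send(pattern: str = "*.dat") -> None:
        """Compress processed files, push them over FTP, and remove originals."""
        ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
        with FTP(FTP_HOST) as ftp:
            ftp.login(user="etl_user", passwd="********")
            for src in sorted(LANDING_DIR.glob(pattern)):
                # Compress the processed file into the archive directory.
                archived = ARCHIVE_DIR / (src.name + ".gz")
                with src.open("rb") as fin, gzip.open(archived, "wb") as fout:
                    shutil.copyfileobj(fin, fout)
                # Push the compressed file to the downstream FTP location.
                with archived.open("rb") as payload:
                    ftp.storbinary(f"STOR {archived.name}", payload)
                # Remove the original once it has been archived and sent.
                src.unlink()

    if __name__ == "__main__":
        archive_and_send()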

 


Federated Co-operatives Limited (FCL)

Lead Data Developer / Consultant

Work Experience : 2015-2016

This project was an end-to-end implementation of IBM's industry-standard model for retail data warehouses (RDW). The primary duties on this project were data architecture and a POC on big data technologies.

Responsibilities

  • Researched new technologies and approaches for presenting key business insights by analyzing big data.
  • Defined and led the realization of the data strategy roadmap for the Collaboration portfolio, including data modeling, implementation, and data management.
  • Assessed and implemented the proper role for machine learning in the big data solution.
  • Collaborated with customer teams to formulate the problem, recommend a solution approach, and design a data architecture.
  • Developed conceptual, logical, and physical designs for various data types and large volumes.
  • Defined and developed guidelines, standards, and processes to ensure the highest data quality and integrity in the data stores residing on the data lake.
  • Researched and suggested new toolsets/methods to improve data ingestion, storage, and data access in the analytics platform.

Environment: Informatica PowerCenter 9.1.0, SSIS, SQL Server, Netezza, Unix Shell, QlikView, Hortonworks Hadoop, Sqoop, Flume, Hive, Tableau

Roche Canada

Data Warehouse Architect

Work Experience : 2015

Roche Canada is one of the leading Swiss-based pharmaceutical companies, focused on the Canadian territory. Existing projects performed ETL operations using QlikView, and the company decided to use Informatica as its ETL tool instead of QlikView. Worked as an ETL Systems Architect to define the process and guidelines for migrating ETL logic from QlikView to Informatica PowerCenter.

Responsibilities

  • Analyzed the existing systems and the business users' requirements for tactical and strategic needs.
  • Created the dimensional data model (star schema) and enhanced it to meet the growing requirements driven by the various corporate groups accessing the data.
  • Created the ETL architecture and provided strategy related to data cleansing, data quality, and data consolidation.
  • Interacted with the business users on a regular basis to consolidate and analyze the requirements and present the design results to them.
  • Involved in gathering and analyzing the requirements and preparing business rules.
  • Led a team of five developers onsite and coordinated activities with the offshore team in California.
  • Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic.
  • Created Enterprise Data Warehouse architecture and standards, trained two architects in dimensional modeling and data quality, and created shared master data sets and data models.
  • Mentored DAs on dimensional models, SCDs, MDM, and standards, and resolved multiple issues.
  • Worked with business users to analyze requirements, standards, presentations, and documents.

Environment: Informatica Power Center 9.1.0, SSIS, SQL Developer, Unix Shell, Windows XP and MS Visual Source Safe, QlikView

Ministry of Transportation

Informatica ETL Consultant

Work Experience : 2014-2015

Roche Canada is one of the leading Swiss-based pharmaceutical companies, focused on the Canadian territory. Existing projects performed ETL operations using QlikView, and the company decided to use Informatica as its ETL tool instead of QlikView. Worked as an ETL Systems Architect to define the process and guidelines for migrating ETL logic from QlikView to Informatica PowerCenter.

Responsibilities

  • Analyzed the existing systems and the business users' requirements for tactical and strategic needs.
  • Created the dimensional data model (star schema) and enhanced it to meet the growing requirements driven by the various corporate groups accessing the data.
  • Created the ETL architecture and provided strategy related to data cleansing, data quality, and data consolidation.
  • Interacted with the business users on a regular basis to consolidate and analyze the requirements and present the design results to them.
  • Involved in gathering and analyzing the requirements and preparing business rules.
  • Led a team of five developers onsite and coordinated activities with the offshore team in California.
  • Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic.
  • Created Enterprise Data Warehouse architecture and standards, trained two architects in dimensional modeling and data quality; created shared master data sets and data models.
  • Mentored DAs on dimensional models, SCDs, MDM, and standards, and resolved multiple issues.
  • Worked with business users to analyze requirements, standards, presentations, and documents.

Environment: Informatica Power Center 9.1.0, SSIS, SQL Developer, Unix Shell, Windows XP and MS Visual Source Safe, QlikView

Bell Canada

Informatica Lead Consultant

Work Experience : 2012-2014

Bell Canada is one of the leading telecommunications companies in Canada. This project dealt with hardware performance data for wireless devices; the goal was to extract, load, and transform data into an Oracle database and generate reports for senior management.

Responsibilities

  • Involved in analysis of end-user requirements and business rules based on given documentation and worked closely with tech leads and Business analysts in understanding the current system.
  • Assisted Business Analyst in documenting business requirements, technical specifications, and implementation of various ETL standards in the mappings
  • Analyzed the business requirements and was involved in writing Test Plans and Test Cases.
  • Involved in the design of the data warehouse using star-schema methodology and converted data from various sources into Oracle tables.
  • Reviewed and created new ETL routines and deployed them into the test and production environments.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
  • Provided support with extended hours, including after-hours and weekends, to maintain the stability of the application and the business.
  • Used Informatica Power Center Workflow manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.

Environment: Informatica PowerCenter 9.1.0, Oracle 10g/9i, Oracle EDQ 11g R1, OBIEE 11g, MS SQL Server, SQL Developer, Unix Shell, Windows XP, MS Office Suite, Control-M Scheduler, Tableau, MS Visual Source Safe

Credit Suisse

ETL Systems Lead/Analyst

Work Experience : 2006-2012

Credit Suisse is a Swiss-based investment bank providing advisory services to many financial institutions across the globe. It has operations in 32 countries.

Responsibilities:

  • Responsible for requirements gathering and for functional and technical specifications for mappings.
  • Participated in status review weekly team meetings.
  • Analyzed the specifications and identified the source data that needed to be moved to the data warehouse.
  • Worked with Source Analyzer, Data Warehouse Designer, Repository Manager, Workflow Monitor, Mapping Designer, Mapplet Designer, and Transformation Developer in Informatica Designer.
  • Developed ETL mappings to perform tasks such as validating file formats, business rules, database rules, and statistical operations.
  • Used Informatica Designer to create Reusable transformations to be used in Informatica mappings and Mapplets.
  • Provided 24×7 support for production operations (incident break/fix, change, service request, project, and databases)
  • Handled tickets using incident management through Peregrine ServiceCenter.
  • Ensured all production changes complied with change management policies and procedures.
  • Developed troubleshooting documents that listed all support-related activities and the issues encountered day to day.
  • Developed end-to-end ETL process to extract, transform and load data into the datamart and downstream feeds using Informatica tools.
  • Migrated Mappings and Folders from PowerCenter 8 to PowerCenter 8.6.
  • Extensively used Informatica to load data to Flat Files and Oracle.
  • Developed source and target definitions to extract data from flat files and relational sources into the data warehouse.
  • Created and ran workflows that included sessions, command tasks, email tasks, decision tasks, Event Raise and Event Wait tasks, and timers in Informatica Workflow Manager.
  • Involved in error handling (ignoring or rejecting bad records to a flat file, loading the records, and reviewing them).
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.
  • Tested the target data against source system tables by writing PL/SQL procedures.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle PL/SQL.
  • Scheduled workflows using Control-M job scheduling.
  • Used shell scripts to automate the execution of mappings (a Python equivalent is sketched after this list).
  • Provided production support and UAT support.
  • Created test cases and was involved in Unit and System testing for Mappings and Stored Procedures.
  • Coordinated with offshore team in India.
  • Created and executed test cases for Unit and System Testing.
  • Troubleshot issues and provided both immediate and long-term resolutions.
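
The mapping-execution automation above was done with shell scripts around Informatica's pmcmd command-line utility. A rough Python equivalent might look like the following; the Integration Service, domain, folder, and workflow names are hypothetical.

    # Rough Python equivalent of the shell scripts that automated mapping
    # execution via Informatica's pmcmd utility. Service, domain, folder, and
    # workflow names are hypothetical placeholders.
    import subprocess
    import sys

    def start_workflow(folder: str, workflow: str) -> int:
        """Start an Informatica workflow via pmcmd and wait for completion."""
        cmd = [
            "pmcmd", "startworkflow",
            "-sv", "INT_SVC_PROD",   # Integration Service name
            "-d", "Domain_Example",  # Informatica domain
            "-u", "etl_user",
            "-p", "********",        # use pmpasswd / env variables in practice
            "-f", folder,
            "-wait",                 # block until the workflow completes
            workflow,
        ]
        result = subprocess.run(cmd, capture_output=True, text=True)
        print(result.stdout)
        if result.returncode != 0:
            print(result.stderr, file=sys.stderr)
        return result.returncode

    if __name__ == "__main__":
        sys.exit(start_workflow("F_RISK_DM", "wf_load_risk_datamart"))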