Mukkanti is a Google Cloud Certified Data Engineer with a demonstrated 12-year history of working in the information technology and services industry. He has strong hands-on experience with on-premises (Cloudera and Hortonworks) and cloud (GCP, Azure, AWS) technologies, and has worked on large-scale batch and real-time data pipelines. He is adept at designing and implementing big data solutions using Hadoop, HDFS, MapReduce, Hive, Spark, Kafka, Cassandra, HBase, Python, and Scala to solve big data problems in retail, banking, telecommunications, insurance, entertainment, security, and other industries.
Work with on-site and internal teams to gather data requirements for a project that evaluates the effectiveness of promotions and offers for a major Canadian supermarket chain. Design and develop data pipelines to meet those requirements and transform the legacy data into a Standard Data Model (a representative pipeline is sketched below).
Technologies: TDD, EAD, GCP, Spark, Apache Beam, BigQuery, Cloud Functions, Python, API, KMS, Airflow, Source Repositories
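A minimal sketch of the kind of Beam pipeline this entry describes: read legacy promotion records, map them onto a standard model, and load them into BigQuery. The bucket path, field names, and table are hypothetical placeholders, not the actual project artifacts.

```python
# Hedged Apache Beam sketch: legacy promo CSV -> standard model -> BigQuery.
# All paths, fields, and the table spec below are illustrative assumptions.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def to_standard_model(record):
    """Map one legacy CSV row onto the standard data model."""
    promo_id, store_id, redeemed = record.split(",")
    return {
        "promo_id": promo_id,
        "store_id": store_id,
        "redeemed": int(redeemed),
    }


with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "ReadLegacy" >> beam.io.ReadFromText(
            "gs://legacy-bucket/promos.csv", skip_header_lines=1
        )
        | "Standardize" >> beam.Map(to_standard_model)
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "project:promotions.offer_effectiveness",
            schema="promo_id:STRING,store_id:STRING,redeemed:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```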
Work as a Senior Data Engineer to implement the eSentire AWS Anomaly Detection Data platform. eSentire is a Managed Detection and Response (MDR) service provider that keeps organizations safe from constantly evolving cyberattacks.
Technologies: AWS, Kinesis Firehose, S3, Glue, Redshift
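As a hedged illustration of the ingest side of such a platform, the snippet below pushes a JSON event into a Kinesis Data Firehose delivery stream that delivers to S3 (with Glue and Redshift downstream). The stream name, region, and event shape are hypothetical.

```python
# Hedged sketch: write one telemetry event to a Firehose delivery stream.
# Stream name, region, and event fields are illustrative assumptions.
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

event = {"sensor_id": "telemetry-001", "login_failures": 7}

firehose.put_record(
    DeliveryStreamName="anomaly-detection-ingest",
    # Firehose expects raw bytes; a trailing newline keeps records
    # line-delimited once they land in S3.
    Record={"Data": (json.dumps(event) + "\n").encode("utf-8")},
)
```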
Work with internal and client team members to implement the Azure Analytical Data platform. AEG is an American worldwide sporting and music entertainment presenter that wanted to leverage the Azure platform for analytics and insights about customers, ticket sales, marketing, etc.
Technologies: Git, Azure Data Lake, NiFi, Spark, Databricks, Python, Airflow
Led a team of 3 developers at the client location to help deliver near-real-time data for a dynamic dashboard used by Rogers’ senior management to gain instant insight into the overall status of wireless and cable operations, network stability, department-wise sales targets, and customer query categories, so they could make proactive, preemptive decisions (a representative aggregation is sketched below).
Technologies: PySpark, Python, Batch, SQL, Hadoop Data Lakes, Spark, Hive, Unix Shell
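A hedged PySpark sketch of the kind of aggregation that could feed such a dashboard: hourly counts of customer queries by category. The table and column names are hypothetical.

```python
# Hedged sketch: hourly query counts per category for a dashboard feed.
# Table and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dashboard-feed").getOrCreate()

queries = spark.table("data_lake.customer_queries")

hourly = (
    queries
    # Truncate each query timestamp to the hour it arrived in.
    .withColumn("hour", F.date_trunc("hour", F.col("created_at")))
    .groupBy("hour", "category")
    .agg(F.count("*").alias("query_count"))
)

# Refresh the table the dashboard reads from.
hourly.write.mode("overwrite").saveAsTable("dashboard.query_counts_hourly")
```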
Led a team of 5 members to deliver AML (Anti-Money Laundering) use cases. A Hadoop analytical platform was built to capture transactions and validate the data by running various scenarios (e.g., a customer cannot deposit more than $10K per day into their account) to flag fraudulent transactions; one such rule is sketched below. When a transaction was flagged, an alert was generated and the responsible team investigated whether it was legitimate or whether a case should be filed against the customer.
Technologies: Kafka, Spark, HBase, Cassandra NoSQL, SQL, Bucketing, Hive, Hadoop Analytical Platform
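A minimal PySpark sketch of the daily-deposit scenario stated above, assuming hypothetical table and column names:

```python
# Hedged sketch of the stated AML rule: flag customers whose total
# deposits exceed $10,000 in a single day. Names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("aml-scenarios").getOrCreate()

deposits = spark.table("transactions.deposits")

alerts = (
    deposits
    # Sum each customer's deposits per calendar day.
    .groupBy("customer_id", F.to_date("txn_ts").alias("txn_date"))
    .agg(F.sum("amount").alias("daily_total"))
    # Keep only the days that breach the $10K threshold.
    .filter(F.col("daily_total") > 10_000)
)

# Persist alerts for the investigation team to review.
alerts.write.mode("append").saveAsTable("aml.daily_deposit_alerts")
```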
Worked as a big data engineer to build data pipelines for CCAR applications. The Comprehensive Capital Analysis and Review (CCAR) is an annual exercise by the Federal Reserve to ensure that institutions have well-defined, forward-looking capital planning processes that account for their unique risks, and that they hold sufficient capital to continue operations through times of economic and financial stress. The primary users for this project are the U.S. Federal Reserve and the internal risk management compliance team.
Technologies: DevOps, CI/CD, PySpark, Oracle, HiveQL, Git, QA
Worked as an ETL Developer to build ETL flows for the Customer Behavior Discovery Analytical Platform, Vodafone's customer analytics platform for analyzing customer data and making decisions. Vodafone drives analytics such as new activations, deactivations, and customer usage based on geographical locations, promotions, plans, etc.
Technologies: ETL, Informatica, Bteq, FastLoad, Unit Testing
Worked as an ETL Developer to build ETL flows for the Marketing Campaign Management Analytical Platform, developed to analyze the effectiveness of marketing campaigns for the incentives and offerings Vodafone provided. The ETL process extracted data from SQL Server and Oracle sources and loaded it into a Teradata data warehouse (one such hop is sketched below). The data was used to analyze customer opinions, loyalty, customer profiles, and customer satisfaction.
Technologies: ETL, SQL Server, Oracle, Informatica, Bteq, Teradata
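A hedged Python sketch of one extract-and-load hop of the kind described above; in production this ran through Informatica and BTEQ, so the libraries, connection details, and table names here are purely illustrative.

```python
# Hedged sketch: extract campaign responses from SQL Server and
# bulk-insert them into a Teradata staging table. All connection
# strings, credentials, and table/column names are placeholders.
import pyodbc
import teradatasql

# Extract from the SQL Server source.
src = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-host;"
    "DATABASE=campaigns;UID=etl_user;PWD=***"
)
rows = src.cursor().execute(
    "SELECT campaign_id, customer_id, response FROM dbo.campaign_responses"
).fetchall()

# Load into the Teradata staging table.
with teradatasql.connect(host="td-host", user="etl_user", password="***") as tgt:
    with tgt.cursor() as cur:
        cur.executemany(
            "INSERT INTO stg.campaign_responses "
            "(campaign_id, customer_id, response) VALUES (?, ?, ?)",
            [tuple(r) for r in rows],
        )
```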