Ing. Anjum Monga
Employed, Data Architect | Senior Data Engineer, MediaMarktSaturn
Degree: Computer Science and Engineering, Punjabi University
Nuremberg, Germany
About me
- Working professional with almost 10 years of experience in data engineering and data architecture; started with an on-premises Hadoop cluster, moved to Azure and then to GCP.
- Experienced in mentoring junior data engineers, conducting peer-to-peer reviews, and creating standards and nomenclatures.
- Experienced in developing a multi-hop data lake architecture that provides historization, standardized retrieval and CDC to the entire organisation.
- Worked extensively with PII data: defined data classification levels and a framework to store, encrypt and decrypt data at rest and in transit.
- Conceptualised and implemented data quality rules for various domains.
- Extensive experience in Data Vault modeling; experienced in collaborating on and transforming multiple data sources into hubs, links and satellites.
- Experienced in ELT and data wrangling jobs for various data sources.
- Conducted gap analysis on the data model using data virtualisation via Denodo.
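As a rough illustration of the Data Vault structures mentioned above (hubs, links and satellites), the following minimal Python sketch derives the hash keys that typically identify those tables. All entity names and business keys here (`customer_id`, the `C-…`/`O-…` values) are hypothetical examples, not taken from any actual project.

```python
import hashlib

def hash_key(*business_keys: str) -> str:
    """Derive a deterministic surrogate key from one or more business
    keys (MD5 is a common, if not universal, Data Vault choice)."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Hub: one row per unique business key (hypothetical customer entity).
customer_hub = {"customer_hk": hash_key("C-1001"), "customer_id": "C-1001"}

# Link: relates two hubs via their combined business keys.
order_link = {
    "link_hk": hash_key("C-1001", "O-9001"),
    "customer_hk": hash_key("C-1001"),
    "order_hk": hash_key("O-9001"),
}

# Satellite: descriptive attributes attached to a hub, tracked over time.
customer_sat = {
    "customer_hk": customer_hub["customer_hk"],
    "load_date": "2024-01-01",
    "name": "Example GmbH",
}
```

The hub carries only identity, the link only relationships, and the satellite only descriptive history; this separation is what makes adding new sources non-destructive.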
Career
Professional experience of Anjum Monga
To date 2 years and 10 months, since Sep 2021
Data Architect | Senior Data Engineer
MediaMarktSaturn
• Developed a data warehouse following Kimball's warehousing concepts to build the Omni-Channel Spine, giving sales managers clear information to improve the customer experience and optimise overall sales.
• Designed and implemented a 3-tier architecture in Dataform and GitHub using Terraform, and conducted ELT on BigQuery for the data warehouse.
• Designed and implemented data quality dimensions and checks for various domains such as Sales, Product and Outlet.
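The data quality dimensions mentioned above could look roughly like the sketch below: completeness, uniqueness and validity checks applied to a hypothetical sales record set. In the described setup such checks would run against BigQuery; this is a plain-Python illustration only, and all column and outlet names are invented.

```python
def check_completeness(rows, column):
    """Share of rows with a non-null value in `column`."""
    return sum(r.get(column) is not None for r in rows) / len(rows)

def check_uniqueness(rows, column):
    """True if no duplicate values occur in `column`."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_validity(rows, column, allowed):
    """Share of rows whose value lies in the allowed domain."""
    return sum(r.get(column) in allowed for r in rows) / len(rows)

# Hypothetical sales rows; one has a missing amount.
sales = [
    {"sale_id": 1, "outlet": "NUE-01", "amount": 19.99},
    {"sale_id": 2, "outlet": "NUE-02", "amount": None},
    {"sale_id": 3, "outlet": "MUC-01", "amount": 5.00},
]

report = {
    "amount_completeness": check_completeness(sales, "amount"),
    "sale_id_unique": check_uniqueness(sales, "sale_id"),
    "outlet_validity": check_validity(
        sales, "outlet", {"NUE-01", "NUE-02", "MUC-01"}
    ),
}
```

Each check yields a score or flag per dimension, which is what makes thresholds and alerting per domain straightforward.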
• As data engineering chapter lead, conducted in-depth peer reviews and organised collaborations and workshops for exchange within and outside the domain.
• Designed and developed a data lake architecture for efficient storage and retrieval of large datasets, providing standardized consumption and historization, and conducted ELT on it.
• Participated in business analysis to translate non-functional requirements into functional requirements.
• Conducted gap analysis on the data model using data virtualisation.
• Implemented a Data Vault warehousing strategy for the trucks domain: built hubs, links, satellites and reference tables to integrate data sources such as sensor data and vehicle health data.
• Created ELT pipelines to onboard various structured and semi-structured data sources into the data lake.
• Implemented Hive external tables with data stored on HDFS, metadata managed in the Hive metastore, and queried via Presto.
• Implemented orchestration jobs on Oozie and scheduled them with coordinator workflows.
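Satellite loads for sensor and vehicle health data, as described above, commonly rely on a hash diff to detect changed attribute payloads (change data capture). A minimal Python sketch, with entirely hypothetical attribute names:

```python
import hashlib

def hash_diff(record: dict, attributes: list) -> str:
    """Hash of the descriptive payload only; a changed hash means a
    new satellite row must be written (change data capture)."""
    payload = "||".join(str(record.get(a, "")).strip() for a in attributes)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

# Hypothetical sensor attributes tracked in the satellite.
ATTRS = ["engine_temp", "mileage"]

current = {"vehicle_id": "V1", "engine_temp": 90, "mileage": 120000}
incoming = {"vehicle_id": "V1", "engine_temp": 95, "mileage": 120450}

# Only insert a new satellite row when the payload actually changed.
needs_new_row = hash_diff(current, ATTRS) != hash_diff(incoming, ATTRS)
```

Because the business key (`vehicle_id`) is excluded from the payload, unchanged readings produce identical hashes and are skipped, keeping satellite history compact.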
4 months, May 2016 - Aug 2016
Hadoop trainer
Ufaber
Education of Anjum Monga
4 years and 1 month, Mar 2010 - Mar 2014
Bachelor of Technology
Punjabi University
Languages
English
Native language