For our client from the telecommunications industry in Zurich, we are looking for a very experienced, motivated and open-minded

Data Engineer - CMDB

Your tasks:
- Design, develop, and maintain data pipelines and workflows using Apache Airflow for efficient data ingestion, transformation, and loading into the CMDB
- Develop and optimize PL/SQL queries and stored procedures for data manipulation and retrieval within the CMDB environment
- Utilize NoSQL databases for handling and processing large volumes of configuration data
- Integrate data from various sources into the CMDB using MuleSoft and other integration platforms
- Conduct data reconciliation activities to ensure data accuracy and consistency across multiple systems and sources
- Develop and implement inventory data models based on the Common Information Model (CIM) to accurately represent IT assets and their relationships
- Design and implement Extract, Transform, and Load (ETL) processes to populate and update the CMDB with accurate and up-to-date information
- Collaborate with cross-functional teams to understand data requirements and ensure the CMDB meets business needs
- Troubleshoot and resolve data-related issues, ensuring data integrity and availability
- Document data processes, data models, and configurations to maintain knowledge and facilitate collaboration

Your profile:
- Proven experience in data engineering and data modeling
- Scripting languages such as Python, Perl and others
- Strong understanding of Service Asset and Configuration Management (SACM) principles and best practices using systems such as Micro Focus Asset Manager, Peregrine AssetCenter or similar (not the ITSM part)
- In-depth knowledge of the Common Information Model (CIM) from DMTF.org
- Proficiency in Apache Airflow for workflow orchestration and automation
- Experience building web frontends as well as front- and backend loading mechanisms
- Knowledge of container solutions such as iKube 2.0 (preferred), Kubernetes or others
- Extensive experience with PL/SQL for database operations and data manipulation
- Experience working with NoSQL databases (e.g., MongoDB)
- Hands-on experience with MuleSoft or other integration platforms
- Strong data reconciliation and data quality management skills
- Expertise in inventory data modeling and implementation
- Solid understanding of Extract, Transform, and Load (ETL) processes using different tooling
- Basic Anchor Modeling skills
- Excellent problem-solving and analytical skills and strong collaboration abilities
- Fluency in English is mandatory; knowledge of German is an advantage