𝗔𝗯𝗼𝘂𝘁 𝘁𝗵𝗲 𝗖𝗼𝗺𝗽𝗮𝗻𝘆:
Our client is one of the world's leading independent commodity trading companies, operating worldwide. Known for innovation and integrity, the company is a market leader in the trade, transport, storage, and optimization of energy products. This is an opportunity to work in a global, dynamic environment at the heart of the energy sector.
𝗥𝗼𝗹𝗲 𝗢𝘃𝗲𝗿𝘃𝗶𝗲𝘄:
This position is a high-priority engagement focused on developing business-facing capabilities, particularly the foundational data needed by the research and trading teams. The Data Engineer will operate within the Hub and Spokes operational model, likely embedded within a Spoke Pod in Geneva to ensure proximity with the business and direct support for the critical trading desk project.
𝗞𝗲𝘆 𝗥𝗲𝘀𝗽𝗼𝗻𝘀𝗶𝗯𝗶𝗹𝗶𝘁𝗶𝗲𝘀:
• Time-Series & Curve Mastery: Design, implement, and maintain complex data structures for trading, covering multi-feed validation, unit and calendar normalization, versioned curve stores, and scenario analysis/scalarisation to support quant strategies.
• Streaming Data Pipelines: Architect and implement scalable data ingestion and processing systems, specializing in streaming at scale with technologies such as Kafka/Flink, ensuring robust handling of stateful operations, exactly-once semantics, and late/out-of-order data.
• Industrialization and Discipline: Enforce rigorous engineering practices, including strong testing discipline (unit/contract/property-based), packaging, CI/CD, and GitOps.
• Data Quality and Orchestration: Utilize orchestration tools (e.g., Airflow, Data Factory, Dagster) and enforce data quality contracts using dbt/Spark/SQL. Pipelines must aim for Data Quality Compliance > 95%.
• Platform Reusability: Develop reusable ETL/ELT templates, contributing to the goal of achieving a Template Reuse Rate > 70%.
𝗖𝗼𝗿𝗲 𝗤𝘂𝗮𝗹𝗶𝗳𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀:
• Bachelor's or Master's degree in Computer Science, Engineering, or a quantitative discipline.
• At least 5 years of experience as a Data Engineer, ideally in a cloud-native environment.
• Prior experience in a trading, commodity, energy, or financial services environment.
• Strong expertise in Azure Data Factory, Azure Synapse, SQL Server, and Azure Data Lake.
• Solid experience with SQL and ETL/ELT development.
• Proven experience with data modeling (fact/dimension tables), data warehouse design, and BI support.
• Comfortable working with big data sets and optimizing data delivery and pipeline performance.
• Fluency in English (written and spoken) is mandatory.
• Excellent analytical, problem-solving, and communication skills.
𝗗𝗲𝘀𝗶𝗿𝗮𝗯𝗹𝗲 𝗦𝗸𝗶𝗹𝗹𝘀:
• Domain Fluency: Trades/orders, ETRM/CTRM integration patterns (e.g., using CDC/event adapters), positions/exposures maths, near-real-time/telemetry.
• Knowledge of data governance, data quality, and metadata management.
• Azure certification (e.g., Azure Data Engineer Associate) is a plus.
• Experience in commodity trading or energy industries is a strong advantage.
𝗬𝗼𝘂𝗿 𝗗𝗮𝘁𝗮:
By submitting your resume, you agree to the retention and use of your personal data by TSG for recruitment purposes, including sharing with our clients in the context of your application.