Job Description
We are seeking a skilled Data Engineer to design and implement modern data architectures, drawing on expertise in ETL/ELT pipelines and stream-processing systems.
Responsibilities
* Data Pipeline Development
  - Design high-performance data pipelines using ETL/ELT tools.
  - Develop and maintain efficient data processing workflows.
* Stream-Processing Expertise
  - Leverage knowledge of stream-processing frameworks and distributed databases.
  - Advise on optimal solutions for real-time data processing.
* Leadership and Training
  - Lead workshops on Kafka, Flink, and MongoDB for team members.
  - Tune the performance of change data capture (CDC) pipelines.
Qualifications
1. Technical Skills
   - Strong programming skills in languages such as Java, Python, or Scala.
   - Experience with ETL/ELT tools like Apache Beam, AWS Glue, or Azure Data Factory.
   - Familiarity with OLTP, OLAP, and hybrid databases like MySQL, PostgreSQL, or Oracle.
   - Knowledge of stream-processing systems like Apache Kafka, Apache Flink, or Amazon Kinesis.
2. Soft Skills
   - Good command of English or German.
   - Curious personality with a passion for learning and problem-solving.
Environment
1. Remote Work Options
   - Fully or partially remote work from Switzerland is possible.
2. Flexible Work Schedule
   - No fixed working hours; work at the rhythm that makes you most productive.
3. Collaborative Team
   - Work alongside seasoned IT professionals with over 15 years of experience.
4. Relocation Support
   - Relocation assistance is offered to facilitate a smooth transition.
Our Values
1. Trust and Autonomy
   - We trust our employees 100% and empower them to make decisions.
2. Democratic Decision-Making
   - Decisions are made democratically, regardless of job title or position.
3. Teamwork and Collaboration
   - We work as a team, valuing the collective over individual interests.
4. Loyalty and Trust
   - Loyalty and trust are earned daily through hard work and dedication.
If you are interested, we encourage you to apply now.