GoldenPeaks Capital is a leading renewable energy company specializing in the development, construction, and operation of solar power systems. With over 15 years of experience in structuring energy projects worldwide, we are among the largest owners of photovoltaic systems in Poland and Hungary. As a pioneer in Eastern Europe, we are introducing Battery Energy Storage Systems (BESS) combined with solar, including multiple pilot projects in Poland and Hungary.
Our end-to-end model spans development, engineering, financing/structuring, supply chain, construction/commissioning, asset operations, and energy sales, ensuring efficiency and the highest standards across the value chain. GoldenPeaks Capital’s commitment to sustainability has been recognized with the highest sustainability quality score (SQS1) from Moody’s for our green bond framework.
The Role
We are seeking a Data Engineer – Industrial IoT & Predictive Analytics to play a key role in the development and scaling of our enterprise Asset Performance Management (APM) platform.
You will be responsible for building and operating high-availability, high-frequency industrial data systems that collect, process, and analyse real-time telemetry from hundreds of solar power plants. Your work will directly support operational excellence, revenue optimisation, and the deployment of predictive maintenance capabilities across the portfolio.
This role sits at the intersection of cloud data engineering, advanced analytics, and industrial IoT, with direct exposure to executive-level decision support and asset management.
Contract duration: 6-9 months (with possible extension)
Key Responsibilities
1. Real-Time Data & Telemetry
* Design and operate high-frequency data ingestion pipelines handling millions of data points daily
* Build robust data processing workflows that normalize and enrich heterogeneous telemetry into analytics- and ML-ready datasets
* Implement automated data quality checks, anomaly detection, and alerting mechanisms
2. Data Flow Architecture & Orchestration
* Own end-to-end data flow design, from edge ingestion through cloud processing to downstream analytics and reporting
* Define and implement clear data contracts, schemas, and lineage across systems to ensure consistency and traceability
* Orchestrate complex multi-stage data pipelines supporting near-real-time monitoring, historical analysis, and machine learning workloads
3. Cloud-Native Data Platform
* Architect hybrid data platforms combining time-series databases (e.g. Azure Data Explorer) with relational data stores
* Develop scalable APIs to integrate operational data with enterprise systems
* Use Infrastructure as Code (Terraform) to deliver reproducible and automated cloud environments
4. Analytics & Predictive Insights
* Enable scalable dashboarding and reporting across hundreds of installations
* Support real-time performance analysis against production targets and financial forecasts
* Collaborate on predictive maintenance and forecasting models using machine learning techniques
Requirements
* Data Engineering: 3+ years designing and operating production data pipelines (streaming and batch)
* Cloud Platforms: Strong experience with Azure (or AWS/GCP with willingness to learn Azure)
* Programming: Proficiency in Python, Go, or similar languages
* Databases: Experience with SQL and NoSQL systems, including time-series data
* Infrastructure as Code: Hands-on experience with Terraform or equivalent tools
Highly Valued Skills
* Time-series analytics (Azure Data Explorer or similar)
* Applied machine learning for anomaly detection, forecasting, or predictive maintenance
* Experience with IoT or high-frequency telemetry data
* Knowledge of renewable energy, power systems, or asset operations
* Security-aware engineering mindset (identity, encryption, network controls)