Job Summary
The Senior Security Data Engineer will lead the design and implementation of scalable, secure, and reusable SIEM data ingestion and transformation pipelines across a complex cybersecurity telemetry environment.
This role focuses on building platform-agnostic ingestion frameworks, schema normalization, and high-quality data delivery to downstream analytics and security platforms.
Responsibilities
* Lead architecture, design, and implementation of scalable and reusable data pipelines using Cribl, Apache NiFi, Vector, and other open-source platforms.
* Develop platform-agnostic ingestion frameworks and template-driven architectures for reusable ingestion patterns.
* Support multiple input types including syslog, Kafka, HTTP, Event Hubs, and Blob Storage.
* Support multiple output destinations including Snowflake, Client, ADX, Log Analytics, and Anvilogic.
* Design and implement schema normalization using the Open Cybersecurity Schema Framework (OCSF).
* Build field mappings, transformation templates, and schema validation logic that are portable across ingestion platforms.
* Develop custom data transformations and enrichments using Groovy, Python, or JavaScript.
* Enforce governance and security controls including SSL/TLS, client authentication, input validation, and logging.
* Ensure end-to-end data traceability and lineage using metadata tagging, correlation IDs, and change tracking.
* Integrate pipeline health monitoring, transformation failure logging, and anomaly detection with observability teams.
* Validate data integrations to ensure high-quality delivery with minimal data loss or duplication.
* Lead technical working sessions to evaluate tools and best practices for large-scale security telemetry data.
* Implement filtering, enrichment, dynamic routing, and format conversions such as JSON, CSV, XML, and Logfmt.
* Maintain centralized documentation covering ingestion patterns, transformation libraries, schemas, and governance standards.
* Collaborate with security, analytics, and platform teams to support threat detection, compliance, and analytics use cases.
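As a rough illustration of the template-driven, portable field mapping described above, the sketch below normalizes a raw event onto OCSF-style dotted attribute paths in Python. The mapping table and field names are hypothetical examples for this posting, not an authoritative OCSF implementation.

```python
# Hypothetical template-driven mapping: raw source fields -> OCSF-style
# dotted target paths. Paths shown (time, actor.user.name, src_endpoint.ip)
# are illustrative; a real pipeline would follow the OCSF class schemas.
OCSF_AUTH_MAPPING = {
    "src_ip": "src_endpoint.ip",
    "user": "actor.user.name",
    "event_time": "time",
}

def normalize(event: dict, mapping: dict) -> dict:
    """Map raw event fields onto dotted target paths, building nested dicts."""
    out = {}
    for src_key, target_path in mapping.items():
        if src_key not in event:
            continue  # skip missing fields; a real pipeline would log this
        node = out
        *parents, leaf = target_path.split(".")
        for part in parents:
            node = node.setdefault(part, {})
        node[leaf] = event[src_key]
    return out

raw = {"src_ip": "10.0.0.5", "user": "alice", "event_time": 1712345678}
normalized = normalize(raw, OCSF_AUTH_MAPPING)
```

Because the mapping is plain data rather than platform-specific code, the same template could be rendered into Cribl, NiFi, or Vector transforms, which is the portability the role calls for.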
Requirements
* Ten or more years of experience working in cybersecurity.
* Five or more years of experience with data pipeline platforms such as Cribl, Vector, Datadog, Client, or similar tools.
* Five or more years of experience with scripting languages such as JavaScript or Python.
Skills
* Security telemetry ingestion and transformation.
* SIEM data pipeline architecture.
* Schema normalization and OCSF.
* Data governance, lineage, and audit readiness.
* Scripting and automation for data processing.
Experience
* Extensive experience designing and managing large-scale security data pipelines supporting over one hundred data sources.
Qualifications and Education
* Bachelor’s degree in Computer Science, Information Security, or a related field, or equivalent professional experience.
Pay Range: $55/hr - $60/hr