Job offer
DataOps Engineer
The DataOps Engineer develops and operates our two key data platforms, data integration and the data lakehouse, to ensure reliable, scalable, and efficient data pipeline operations and product delivery. The role works closely with data engineers, analysts, and business partners to deliver data products that enable business insights and advanced analytics.
Job description
We are seeking a motivated and proactive DataOps Engineer to join our team. In this role, you will contribute to the development and operation of our two key data platforms to ensure reliable, scalable, and efficient data pipeline operations and product delivery. You will work closely with data engineers, analysts, and business partners to operate and support our data integration and data lakehouse platforms, delivering trusted data products to the organization in a timely manner.
Main tasks
- Creation, operation, and optimization of data pipelines and workflows in our lakehouse ecosystem.
- Ensuring reliability, scalability, and performance of data operations in production environments.
- Collaboration with cross-functional teams to deliver data products that enable business insights and advanced analytics.
- Implementation and maintenance of CI/CD pipelines for data systems and workflows.
- Proactively solving problems and proposing new ideas to improve data architecture and operations.
- Contribution to automation, documentation, and the implementation of DataOps (DevOps) best practices.
- Monitoring, troubleshooting, and continuous improvement of data systems and processes.
- Participation in on-call duties.
Requirements
Mindset & Soft Skills
- Strong motivation, curiosity, and willingness to learn.
- Proactive, autonomous, and solution-oriented.
- Ability to take initiative and contribute new ideas.
Technical skills
- Strong knowledge of SQL (ideally Microsoft SQL Server).
- Good experience with data streaming patterns.
- Good experience with Python and scripting languages; PySpark is a strong plus.
- Good experience with containerization and orchestration (Docker, Kubernetes) and with cloud platforms.
- Proven experience with CI/CD pipelines.
- Familiarity with lakehouse architectures (Delta Lake, Iceberg, etc.).
- Understanding of data product concepts and modern data management practices.
- Ability to read and understand Java, Apache Camel, Talend ESB, and PowerShell.
Nice to Have
- Experience with MS Fabric or Databricks.
- Experience with observability practices and tooling (metrics, logging, tracing).
- Knowledge of data governance and security best practices.
- Familiarity with BI or analytics tools.
We offer
- An opportunity to work with modern data technologies in a collaborative and future-oriented environment.
- The chance to shape and influence our DataOps practices and architecture.
Job details