Job offer

Big Data Operations Engineer

The Big Data Operations Engineer is responsible for implementing methods and solutions that strengthen data reliability and quality, working closely with data scientists and data analysts. The position also involves daily support and troubleshooting of data pipelines, as well as responsibility for data availability and quality.

Job description

Tasks

  • Providing daily support and troubleshooting for data pipelines, from ingestion to consumption
  • Ensuring that the defined thresholds for data availability are met
  • Actively supporting release cycles and advocating changes that avoid production disruptions
  • Maintaining and modeling JSON-based schemas and metadata for enterprise-wide reuse (using central tools)
  • Acting as a data engineer to ensure corrective actions (table historization, execution of dependencies, detection of quality issues, etc.)
  • Taking operational responsibility for Common Data Model tables in a separate access zone
  • Participating in the agile setup and supporting the development team
  • Managing the operating model to ensure support levels and the division of operational tasks (e.g., monitoring service availability and performance) with first- and second-level support
  • Providing third-level support (event management such as incident and problem management) for IT services and, together with the IT service owner, for application-specific data pipelines
  • Continuously improving service availability, performance, capacity, and knowledge management

Regulatory responsibilities and/or risk management

  • Ensure appropriate, ethical, and compliant behavior within your area of responsibility by clearly demonstrating appropriate values and behaviors, including but not limited to: standards of honesty and integrity, care and diligence, fair treatment of clients, management of conflicts of interest, competence and continuous development, appropriate risk management, and compliance with applicable laws and regulations

Requirements

Professional requirements

  • Degree in computer science (university, university of applied sciences, or equivalent)
  • Very good communication and planning/coordination skills
  • Strong team player
  • Proactive, cooperative, and customer-focused approach to solving problems and promoting adoption
  • Expert knowledge of DataOps processes and topics
  • Experience with monitoring tools for platform health, daily data-collection pipelines, and consumer applications
  • Experience with Linux-based infrastructure, including advanced scripting
  • Expert knowledge of SQL, preferably in mixed environments (i.e., classic data warehouse and distributed systems)
  • Experience with the daily maintenance of databases, loading processes, CI/CD pipelines, etc.
  • Several years of professional experience in a similar function in the financial sector
  • Strong experience in operating and working with complex data environments
  • Good knowledge of Dataiku and Tableau (a plus)

Personal and social requirements

  • Good language skills, both written and oral
  • Highly self-motivated, with a willingness to contribute new ideas, promote initiatives, and drive change
  • Excellent communication skills to interact with both technical teams and business partners
  • Detail-oriented mindset with the ability to manage and effectively prioritize multiple tasks

