Job offer

Big Data Operations Engineer

The Big Data Operations Engineer is responsible for implementing methods and solutions to strengthen data reliability and quality, working closely with data scientists and data analysts. The position involves providing daily support and troubleshooting for data pipelines, as well as ensuring data availability and quality.

Job description

Overview

  • Position: Big Data Operations Engineer
  • Location: Singapore
  • Working hours: Full-time

Tasks

  • Daily support and troubleshooting of data pipelines from ingestion to consumption
  • Ensuring that defined thresholds for data availability are met
  • Actively supporting release cycles and advocating for changes that avoid production disruptions
  • Maintaining and modeling JSON-based schemas and metadata for reuse across the enterprise (using central tools)
  • Acting as a data engineer to implement corrective measures (table historization, dependency execution, recording of quality issues, etc.)
  • Operational responsibility for Common Data Model tables in a separate access zone
  • Participating in the agile setup and supporting the development team
  • Managing the operating model to ensure support levels and the distribution of operational tasks (e.g., monitoring service availability and performance) together with Level 1 and Level 2 support
  • Providing Level 3 support (event management, including incident and problem management) for the IT service and, together with the IT service owner, for the data-related pipelines
  • Continuously improving service availability, performance, capacity, and knowledge management

Regulatory responsibilities and/or risk management

  • Ensuring appropriate ethical and compliant behavior within the area of responsibility by clearly demonstrating values and behaviors, including but not limited to: standards of honesty and integrity; care and diligence; fair treatment of customers; management of conflicts of interest; competence and continuous development; appropriate risk management; and compliance with applicable laws and regulations.

Requirements

Technical requirements

  • Degree in computer science (university, university of applied sciences, or equivalent)
  • Very good communication and planning/coordination skills
  • Strong team player
  • Proactive, collaborative, and customer-oriented approach to problem solving and solution adoption
  • Expert knowledge of DataOps processes and related topics
  • Experience with monitoring tools for platform health, daily data-ingestion pipelines, and consumer applications
  • Experience with Linux-based infrastructure, including advanced scripting
  • Expert knowledge of SQL, preferably in mixed environments (e.g., classic data warehouse and distributed systems)
  • Experience with the daily maintenance of databases, loading processes, CI/CD pipelines, etc.
  • Several years of professional experience in a similar function in the financial sector
  • Strong experience in operating and working with complex data environments
  • Good knowledge of Dataiku and Tableau (a plus)

Personal and social requirements

  • Good written and spoken language skills
  • Very high self-motivation to drive initiatives, contribute new ideas, and initiate change
  • Excellent communication skills to interact with both technical teams and business partners
  • Detail-oriented mindset with the ability to manage and effectively prioritize multiple tasks

We offer

No specific benefits or offers mentioned.

© 2025 House of Skills by skillaware. All rights reserved.