Job offer
Big Data Operations Engineer
The Big Data Operations Engineer is responsible for implementing methods and solutions that strengthen data reliability and quality, working closely with Data Scientists and Data Analysts. The position involves day-to-day support and troubleshooting of data pipelines, as well as ensuring data availability and modeling data structures.
Job description
Tasks
- Daily support and troubleshooting of data pipelines, from ingestion to consumption
- Ensuring that the agreed thresholds for data availability are met
- Actively supporting release cycles and advocating changes that avoid production disruptions
- Maintenance and modeling of JSON-based schemas and metadata for reuse across the enterprise (using central tools)
- Acting as a data engineer to ensure corrective actions (historization of tables, execution of dependencies, detection of quality issues, etc.)
- Operational responsibility for Common Data Model tables in a separate access zone
- Participation in the agile setup and support of the development team
- Management of the operating model to ensure the agreed support level and the sharing of operational tasks (e.g. monitoring of service availability and performance) with 1st- and 2nd-level support
- Responsibility for providing 3rd-level support (event management, such as incident and problem management) for the IT service and, together with the IT service owner, for the data-related pipelines
- Continuous improvement of service availability, performance, capacity and knowledge management
 
Regulatory responsibilities and/or risk management
- Ensure appropriate ethical and compliant behavior within the area of responsibility by clearly demonstrating appropriate values and behaviors. These include, but are not limited to, standards of honesty and integrity, care and diligence, fair treatment of clients, management of conflicts of interest, competence and continuous development, appropriate risk management, and compliance with applicable laws and regulations.
 
Requirements
Professional requirements
- Degree in computer science (university, university of applied sciences or equivalent)
- Very good communication and planning/coordination skills
- Strong team player
- Proactive, collaborative and customer-oriented approach to problem solving and solution adoption
- Expertise in DataOps processes and topics
- Experience with monitoring tools for platform health, daily data-collection pipelines and consumer applications
- Experience with Linux-based infrastructure, including advanced scripting
- Expert knowledge of SQL, preferably in mixed environments (i.e. classic DWH and distributed systems)
- Experience with the daily maintenance of databases, loading processes, CI/CD pipelines, etc.
- Several years of professional experience in a similar role in the financial sector
- Strong experience in operating and working with complex data environments
- Good knowledge of Dataiku and Tableau (optional)
 
Personal and social requirements
- Good language skills (written and oral)
- Very high self-motivation to drive initiatives and initiate change
- Willingness to contribute new ideas and initiate changes
- Excellent communication skills for interacting with both technical teams and business partners
- Detail-oriented mindset with the ability to manage and effectively prioritize multiple tasks
 
We offer
No information available.