Job offer
Big Data Operations Engineer
The Big Data Operations Engineer is responsible for implementing methods and solutions to strengthen data reliability and quality, working closely with data scientists and data analysts. The position involves providing daily support and troubleshooting for data pipelines, as well as ensuring data availability and quality.
Job description
Tasks
- Daily support and troubleshooting of data pipelines from ingestion to consumption
- Ensuring that the correct thresholds for data availability are met
- Actively supporting release cycles and advocating for changes to avoid production disruptions
- Maintenance and modeling of JSON-based schemas and metadata for reuse across the enterprise (via central tools)
- Acting as a data engineer to implement corrective measures (table historization, execution of dependencies, recording of quality issues, etc.)
- Operational responsibility for Common Data Model tables in a separate access zone
- Participating in the agile setup and supporting the development team
- Managing the operating model to ensure the agreed support level and the sharing of operational tasks (e.g. monitoring of service availability and performance) with 1st- and 2nd-level support
- Responsibility for providing 3rd-level support (event management such as incident and problem management) for IT services and, together with the IT service owner, for application-specific data pipelines
- Continuous improvement of service availability, performance, capacity and knowledge management
Regulatory responsibilities and/or risk management
- Ensuring appropriate ethical and compliant behavior within the area of responsibility by clearly demonstrating appropriate values and behaviors, including but not limited to: standards of honesty and integrity; care and diligence; fair treatment of customers; management of conflicts of interest; competence and continuous development; appropriate risk management; and compliance with applicable laws and regulations.
Requirements
Technical requirements
- Degree in computer science (university, university of applied sciences or equivalent)
- Very good communication and planning/coordination skills
- Strong team player
- Proactive, cooperative, and customer-focused approach to solving problems and promoting adoption
- Expert in DataOps processes and related topics
- Experience with monitoring tools for platform health, daily data collection pipelines, and consumer applications
- Experience with Linux-based infrastructure, including advanced scripting
- Expert knowledge of SQL, preferably in mixed environments (e.g. classic DWH and distributed systems)
- Experience with the daily maintenance of databases, loading processes, CI/CD pipelines, etc.
- Several years of professional experience in a similar function in the financial sector
- Strong experience in operating and working with complex data environments
- Good knowledge of Dataiku and Tableau (optional)
Personal and social requirements
- Good language skills (written and oral)
- Highly self-motivated, with the willingness to contribute new ideas, promote initiatives, and initiate change
- Excellent communication skills to interact with both technical teams and business partners
- Detail-oriented mindset with the ability to manage and effectively prioritize multiple tasks
We offer
No information available.