Job offer

Senior Data Engineer 100% (f/m/d)

We are looking for a Senior Data Engineer to help develop a Python-based enterprise data hub and to advance our MLOps infrastructure. The role combines DevOps excellence with hands-on machine learning engineering to deliver scalable, reliable, and auditable ML solutions.

Tasks

  • Supporting the development of a Python-based enterprise data hub (integrated with Oracle) and the further development of the MLOps infrastructure
  • Automating CI/CD pipelines for data and ML workloads
  • Accelerating model deployment
  • Ensuring system stability
  • Implementing infrastructure as code for data and ML workloads
  • Developing, training, and evaluating machine learning models (e.g., with scikit-learn, xgboost, PyTorch) in close collaboration with data scientists
  • Orchestrating end-to-end ML workflows, including preprocessing, training, hyperparameter tuning, and model validation
  • Deploying and running models in production as containerized microservices (Docker/Kubernetes) exposed via REST/gRPC APIs
  • Managing the MLOps lifecycle with tools such as MLflow (experiment tracking, model registry) and implementing monitoring for drift, degradation, and performance
  • Refactoring exploratory code (e.g., Jupyter notebooks) into robust, testable, and versioned production pipelines
  • Collaborating with data engineers to deploy and optimize the data hub, ensuring reliable data flows for training and inference
  • Resolving operational issues across the infrastructure, data, and model layers; participating in incident response and root cause analysis

Requirements

  • Technical expertise: Strong skills in Python, Linux, CI/CD, Docker, Kubernetes, and MLOps tools (e.g., MLflow)
  • Practical experience with Oracle databases, SQL, and ML frameworks
  • ML engineering capability: Ability to own the entire ML lifecycle, from training and evaluation to deployment and monitoring, with a focus on reproducibility and compliance
  • Automation and reliability: Commitment to building stable, self-healing systems with proactive monitoring and automatic recovery
  • Collaboration and communication: Effective team player in agile, cross-functional environments; able to communicate clearly to technical and non-technical audiences

Education and skills

  • Education: Bachelor of Science (BS) in computer science, engineering, data science, or a related field
  • Certifications such as CKA, AWS/Azure DevOps Engineer, or Google Cloud Professional DevOps Engineer are advantageous
  • Technical skills: Proficient in Python, Git, and shell scripting
  • Experience with CI/CD pipelines (GitLab, Jenkins), Docker, and Kubernetes

