Python Engineer – ML/Data Pipelines


Location

Remote

Headquarters

Karlsruhe, Germany

Deadline

29 November 2025, 08:00

Salary

€3,000

Job Type

Full-time

About the project

We are currently looking for a Python Engineer – ML/Data Pipelines (Kubeflow, GCP) for a key client, an innovative company specializing in data-driven solutions. Focused on leveraging technology, people, and processes, the company develops and optimizes software, data infrastructure, and AI-powered tools for businesses across industries. Its collaborative, agile environment emphasizes continuous learning, career growth, and remote-work flexibility.
We are looking for a skilled Software Engineer to build and maintain Kubeflow Pipelines on Google Cloud, ensuring seamless integration of machine learning models into production. The role involves operationalizing Proof of Concept (PoC) solutions, optimizing performance, and transforming Python-based functionality into scalable ML pipeline components.
A team of Data Scientists creates PoC projects around online marketing segmentation and attribute enrichment; the engineering team then operationalizes these by converting them into unified Kubeflow pipelines. The work also includes testing and optimizing pipelines for performance and memory usage.
Team size: 5, consisting of a project lead, three data/software engineers, and one QA engineer.


Your duties

As a Python Engineer – ML/Data Pipelines (Kubeflow, GCP), you will be responsible for:

- Developing and maintaining Kubeflow Pipelines on Google Cloud for efficient machine learning model deployment
- Collaborating with data scientists to understand and implement their requirements into production-grade pipeline components
- Analyzing and adapting Python-based data science scripts and functionalities to align with pipeline architecture
- Optimizing pipelines for scalability, performance, and reliability
- Debugging, monitoring, and troubleshooting pipeline issues in production
- Contributing to continuous improvement efforts for pipeline development processes and tools

Requirements

- 4+ years of experience with Python
- Strong knowledge and familiarity with data analysis processes
- Experience with Kubeflow Pipelines and/or similar workflow orchestration tools
- Strong understanding of machine learning concepts and workflows
- Proficiency in Google Cloud Platform (GCP) and cloud-based deployment
- Strong debugging and problem-solving skills
- Strong communication and teamwork skills for effective collaboration with cross-functional teams
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field
- English language upper intermediate (B2) is a must

Nice to have:

- Familiarity with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices
- Experience working with databases, particularly Snowflake, and integrating them into machine learning pipelines
- Exposure to SQL and database optimization techniques for efficient data retrieval and processing

About team up

After the first Covid outbreak, rethinking the future of work became increasingly important. At the beginning of 2020, Georgian and German entrepreneurs met in Berlin and created a solution to the new world's challenges. That is how Team Up was established.

Benefits and perks of remote career with Team Up

Everything you need to level up professionally and feel respected, cared for, and valued:

  • Work from everywhere

  • Equipment you’ll love

  • Private health insurance 

  • Self-development programs

  • Top talent community

MANY LOCATIONS · FULLY REMOTE

Leave your CV
