About the project
Currently, we are looking for a Python Engineer – ML/Data Pipelines (Kubeflow, GCP) for a key client, an innovative company specializing in data-driven solutions. Focused on leveraging technology, people, and processes, the company develops and optimizes software, data infrastructure, and AI-powered tools for businesses across industries. With a collaborative, agile environment, it emphasizes continuous learning, career growth, and remote work flexibility.
The ideal candidate is a skilled Software Engineer who will build and maintain Kubeflow Pipelines on Google Cloud, ensuring seamless integration of machine learning models into production. The role involves operationalizing Proof of Concept (PoC) solutions, optimizing performance, and transforming Python-based functionality into scalable ML pipeline components.
A team of Data Scientists creates PoC projects for online marketing segmentation and attribute enrichment, which the engineering team then operationalizes by converting them into unified Kubeflow pipelines. The team also adds testing and optimizes pipelines for performance and memory usage.
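For illustration, here is a minimal sketch of what that conversion can look like with the Kubeflow Pipelines (kfp) v2 SDK. The component and pipeline names, the parameters, and the enrichment/segmentation logic are all hypothetical, not taken from the client's codebase:

```python
# Minimal sketch: turning a data scientist's Python function into a
# Kubeflow Pipelines component and compiling a pipeline for deployment.
# Assumes the kfp v2 SDK (pip install kfp); all names are illustrative.
from kfp import compiler, dsl


@dsl.component(base_image="python:3.11")
def enrich_attributes(input_uri: str) -> str:
    """PoC-style attribute enrichment step, packaged as a component."""
    # In a real pipeline this would read the data, enrich the records,
    # and write the result back; here we only pass a path along.
    return input_uri + ".enriched"


@dsl.component(base_image="python:3.11")
def segment_users(enriched_uri: str) -> str:
    """Hypothetical segmentation step consuming the enriched data."""
    return enriched_uri + ".segments"


@dsl.pipeline(name="marketing-segmentation-poc")
def segmentation_pipeline(input_uri: str):
    # Component outputs wire together as inputs, forming the DAG that
    # Kubeflow executes on Google Cloud.
    enriched = enrich_attributes(input_uri=input_uri)
    segment_users(enriched_uri=enriched.output)


if __name__ == "__main__":
    # Compile to a pipeline spec that can be submitted to a Kubeflow
    # Pipelines endpoint.
    compiler.Compiler().compile(segmentation_pipeline, "pipeline.yaml")
```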
Team size: five, consisting of a project lead, three data/software engineers, and one QA engineer.
Your duties
As a Python Engineer – ML/Data Pipelines (Kubeflow, GCP), you will be responsible for:
- Developing and maintaining Kubeflow Pipelines on Google Cloud for efficient machine learning model deployment
- Collaborating with data scientists to understand and implement their requirements into production-grade pipeline components
- Analyzing and adapting Python-based data science scripts and functionalities to align with pipeline architecture
- Optimizing pipelines for scalability, performance, and reliability
- Debugging, monitoring, and troubleshooting pipeline issues in production
- Contributing to continuous improvement efforts for pipeline development processes and tools
Requirements
- 4+ years of experience with Python
- Strong familiarity with data analysis processes
- Experience with Kubeflow Pipelines or similar workflow orchestration tools
- Strong understanding of machine learning concepts and data science workflows
- Proficiency in Google Cloud Platform (GCP) and cloud-based deployment
- Strong debugging and problem-solving skills
- Strong communication and teamwork skills for effective collaboration with cross-functional teams
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field
- Upper-intermediate English (B2) is a must
Nice to have:
- Familiarity with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices
- Experience working with databases, particularly Snowflake, and integrating them into machine learning pipelines
- Exposure to SQL and database optimization techniques for efficient data retrieval and processing
About Team Up
At Team Up, we empower top professionals to build remote careers with international companies, all while working from their homelands. Since 2020, we've connected over 500 talents with global companies, creating opportunities that bridge borders and fuel local growth. What began as a partnership between Georgia and Germany has now expanded to 7 countries, driven by a shared vision of connection, growth, and a better future for work.

Benefits and perks of a remote career with Team Up
Everything you need to level up professionally and feel respected, cared for, and valued
MULTIPLE LOCATIONS · FULLY REMOTE