Project description
Currently, we are looking for a Python Engineer – ML/Data Pipelines (Kubeflow, GCP) for a key client, an innovative company specializing in data-driven solutions. Focused on leveraging technology, people, and processes, the company develops and optimizes software, data infrastructure, and AI-powered tools for businesses across industries. With a collaborative, agile environment, it emphasizes continuous learning, career growth, and remote work flexibility.
We are looking for a skilled Software Engineer to build and maintain Kubeflow Pipelines on Google Cloud, ensuring seamless machine learning model integration into production. This role involves operationalizing Proof of Concept (PoC) solutions, optimizing performance, and transforming Python-based functionalities into scalable ML pipeline components.
A team of Data Scientists creates PoC projects around online marketing segmentation and attribute enrichment, which this team then operationalizes by converting them into unified Kubeflow pipelines; testing and optimization of performance and memory usage are also part of the work (a minimal component sketch follows below).
Team size: 5, consisting of a project lead, 3 data/software engineers, and 1 QA.
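To give a feel for the operationalization work described above, here is a minimal, hypothetical sketch of how a data scientist's pandas function could be wrapped as a Kubeflow Pipelines (KFP v2 SDK) component and compiled; the enrich_attributes step, column names, and segmentation rule are illustrative assumptions, not details from the client's codebase.

```python
# Illustrative only: wrapping a hypothetical pandas-based PoC step
# as a Kubeflow Pipelines (KFP v2 SDK) component and compiling it.
from kfp import dsl, compiler


@dsl.component(base_image="python:3.11", packages_to_install=["pandas"])
def enrich_attributes(source_uri: str, enriched: dsl.Output[dsl.Dataset]):
    """Hypothetical attribute-enrichment step lifted from a PoC notebook."""
    import pandas as pd

    df = pd.read_csv(source_uri)
    # Toy segmentation rule; the real logic would come from the data scientists.
    df["segment"] = df["monthly_spend"].gt(100).map({True: "high", False: "low"})
    df.to_csv(enriched.path, index=False)


@dsl.pipeline(name="poc-operationalization-demo")
def demo_pipeline(source_uri: str):
    enrich_attributes(source_uri=source_uri)


if __name__ == "__main__":
    # Produces a pipeline spec that can run on Kubeflow or Vertex AI Pipelines.
    compiler.Compiler().compile(demo_pipeline, "demo_pipeline.yaml")
```

The compiled demo_pipeline.yaml can then be submitted to a Kubeflow Pipelines instance or to Vertex AI Pipelines on Google Cloud.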
Your responsibilities
As a Python Engineer – ML/Data Pipelines (Kubeflow, GCP), you will be responsible for:
- Developing and maintaining Kubeflow Pipelines on Google Cloud for efficient machine learning model deployment (see the submission sketch after this list)
- Collaborating with data scientists to understand and implement their requirements into production-grade pipeline components
- Analyzing and adapting Python-based data science scripts and functionalities to align with pipeline architecture
- Optimizing pipelines for scalability, performance, and reliability
- Debugging, monitoring, and troubleshooting pipeline issues in production
- Contributing to continuous improvement efforts for pipeline development processes and tools
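For context on the "on Google Cloud" part of these responsibilities, the sketch below shows one common way a compiled pipeline spec could be submitted to Vertex AI Pipelines via the google-cloud-aiplatform SDK; the project ID, region, bucket paths, and file names are placeholders, not the client's actual environment.

```python
# Illustrative only: submitting a compiled pipeline spec to
# Vertex AI Pipelines with the google-cloud-aiplatform SDK.
from google.cloud import aiplatform

# Placeholder project, region, and bucket values.
aiplatform.init(
    project="my-gcp-project",
    location="europe-west1",
    staging_bucket="gs://my-staging-bucket",
)

job = aiplatform.PipelineJob(
    display_name="poc-operationalization-demo",
    template_path="demo_pipeline.yaml",  # spec produced by the KFP compiler
    parameter_values={"source_uri": "gs://my-bucket/users.csv"},
)
job.run()  # blocks until the run finishes; use job.submit() to fire and forget
```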
Requirements
- 4+ years of experience with Python
- Strong familiarity with data analysis processes
- Experience with Kubeflow Pipelines and/or similar workflow orchestration tools
- Strong understanding of machine learning concepts and data science workflows
- Proficiency in Google Cloud Platform (GCP) and cloud-based deployment
- Strong debugging and problem-solving skills
- Strong communication and teamwork skills for effective collaboration with cross-functional teams
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field
- Upper-Intermediate English (B2) is a must
Nice to have:
- Familiarity with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices
- Experience working with databases, particularly Snowflake, and integrating them into machine learning pipelines
- Exposure to SQL and database optimization techniques for efficient data retrieval and processing
About Team Up
At Team Up, we help top professionals build remote careers with global companies while staying in their home countries. Since 2020, we have connected more than 500 specialists with global companies, creating opportunities across borders and driving local growth. What began as a partnership between Georgia and Germany has since expanded to seven countries, built on a shared vision of connection, growth, and a better future.

Benefits and perks of remote work with Team Up
Everything you need to grow professionally and to feel respected, cared for, and valued
Multiple locations · Fully remote