
【AI Team】Data Engineer

iKala

【Responsibilities】
  1. Create and maintain an optimal data pipeline architecture, and assemble large, complex data sets that meet functional and non-functional business requirements.
  2. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  3. Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.
  4. Work with data and analytics experts to strive for greater functionality in our data systems.
  5. Support cross-functional, cross-BU data integration tasks.
【Requirements】
  1. Relevant computer science knowledge or a degree in Computer Science, IT, or a similar field; a Master's degree is a plus.
  2. At least 2 years of experience with Python 3.8+, covering 1–2 complete project life cycles.
  3. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP 'big data' technologies, e.g. Airflow, Dagster.
  4. Experience with backend API development and deployment, e.g. FastAPI, Django, Flask.
  5. Experience with Kubernetes and Docker, e.g. building Docker images.
  6. Experience with GNU/Linux systems, including deployment and debugging in that environment.
  7. Experience with Google Cloud Platform services, e.g. GCS, BigQuery, Cloud SQL, Dataproc.
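For candidates unfamiliar with the extract-transform-load pattern that requirement 3 refers to, a minimal sketch follows. All function and field names here are hypothetical illustrations, not iKala's actual pipeline; in production this flow would be orchestrated with a tool such as Airflow or Dagster and would load into BigQuery rather than an in-memory list.

```python
import json


def extract(raw: str) -> list[dict]:
    """Parse newline-delimited JSON, skipping malformed rows."""
    rows = []
    for line in raw.splitlines():
        try:
            rows.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in production: route bad rows to a dead-letter table
    return rows


def transform(rows: list[dict]) -> list[dict]:
    """Normalize field names and types; drop rows missing a user id."""
    out = []
    for r in rows:
        if "user_id" not in r:
            continue
        out.append({"user_id": r["user_id"], "amount": float(r.get("amount", 0))})
    return out


def load(rows: list[dict], sink: list) -> int:
    """Append to an in-memory sink standing in for a warehouse table."""
    sink.extend(rows)
    return len(rows)


# Usage: one pipeline run over a small sample (one valid row, one
# malformed row, one row that transform() filters out).
raw = '{"user_id": 1, "amount": "9.5"}\nnot json\n{"amount": "3"}'
sink: list = []
loaded = load(transform(extract(raw)), sink)
```

Keeping the three stages as separate pure functions is the part that matters here: each stage can be unit-tested in isolation and then wired into whichever orchestrator the team uses.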
【Pluses】
  1. Basic knowledge of machine learning algorithms.
  2. Experience with a complete project life cycle.
  3. Experience with first-line or second-line system operations and maintenance.


To apply for this job, email your details to amy.chen@ikala.tv