Data Engineer (Core Data Platform team)
Your main responsibilities:
- Design, build, and improve our Data Platform on GCP (including Data Observability, Data Discovery, and Data Quality)
- Ensure and improve the reliability and observability of the platform's core infrastructure (Kubernetes clusters)
- Define and implement unified Data Governance and infrastructure that align with the Data Mesh concept
- Work with our data team members and stakeholders to assist with data-related technical issues and support their data infrastructure needs
Preferred qualifications:
- Good understanding of Data Architecture
- Experience with Data Warehouse, Data Observability, and Data Discovery implementations
- Experience in MLOps
Data Engineer (Data Pipeline – Data Integration team)
Your main responsibilities:
- Design, build, and optimize fault-tolerant, scalable batch/streaming data pipelines and ETL/ELT processes
- Ensure and improve the performance, observability, and quality of every component in the data pipelines
- Design and implement data models to support a wide range of data needs and use cases
Preferred qualifications:
- Experience working with NoSQL databases such as MongoDB
- Experience in Data Modeling
- Good knowledge of streaming data pipelines
Data Engineer (Data Product - Scrum team)
Your main responsibilities:
- Design, build, or productionize meaningful data products and any required data integrations (e.g., APIs, ML solutions, Data Marts)
- Maintain and improve the reliability, observability, and quality of data products
- Share technical and development best practices, and work with our data team members and stakeholders to bring data solutions to production
Preferred qualifications:
- Demonstrated hands-on experience building, productionizing, and deploying scalable data products (e.g., APIs, ML solutions)
- Experience in MLOps
- Experience in the retail industry
Key Requirements:
- Bachelor’s degree or higher in related fields.
- 2+ years of experience in relevant fields (fresh graduates are also welcome)
- Demonstrated hands-on experience building Data Platforms, Pipelines, or Products on any major cloud provider
- Demonstrated hands-on experience with Big Data technologies such as Airflow, Spark, and the Kafka ecosystem
- Proficiency in Python and SQL
- Experience with DevOps technologies such as containers, Kubernetes, Terraform, and CI/CD tools
- Solid background in software engineering principles
- Strong sense of ownership
- Growth mindset
- Curiosity to keep up with and research new technologies to continually improve the data platform