Responsibilities:
Develop and maintain the existing integration framework to handle routine data pipelines in the Data Platform.
Define and implement the data pipelines (ingestion and transformation) for real-time and batch processing across layers in the Data Platform.
Maintain a good understanding of Logical/Physical data models and the AIA business glossary.
Participate in performance tuning to continually improve the quality of the data platform.
Work collaboratively with the data governance team to maintain data integrity and data availability across data domains.
Required Skills:
Bachelor’s degree or higher in Computer Engineering, Computer Science, Information Technology, or related fields.
Strong programming skills in Python, PySpark, and SQL, plus hands-on knowledge of the SDLC or Agile methodology.
At least 10 years of working experience with Big Data technologies and cloud platforms such as Azure, AWS, or GCP; hands-on experience is advantageous.
At least 10 years of experience building and managing real-time and batch data pipelines for large-scale data on a distributed data platform (Data Warehouse or Data Lake).
Hands-on experience with Databricks, Kubernetes, Kafka, and Jenkins is preferred.
Strong analytical thinking, problem-solving, and interpersonal skills.
Good communication skills in Thai and English.
Skills
Information Technology
AWS
Python
Azure
Functions
Job Overview
Job Type: Hybrid
Company
AIA Thailand
Industry: Banking & Finance
Ready to Apply?
Submit your application now and take the next step in your career journey.