Key Roles and Responsibilities:
- Architect Data Solutions: Lead the design and implementation of robust and scalable data pipelines and architectures that are optimized for Business Intelligence and Data Science projects.
- Data Management: Oversee the development and optimization of data management solutions to ensure data quality, integrity, and security across the organization.
- Cloud Data Platforms: Lead the deployment and management of data infrastructure on cloud platforms (GCP or AWS). Utilize cloud services for efficient data processing, storage, and analytics.
- Cost Optimization: Implement best practices for cloud cost optimization. Continuously monitor and manage costs associated with cloud services.
- Team Leadership: Provide technical leadership and mentorship to junior data engineers. Foster a culture of collaboration, innovation, and continuous improvement.
- Strategic Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data needs and deliver solutions that meet business requirements.
- Performance Optimization: Monitor, troubleshoot, and optimize data pipelines and databases for performance, reliability, and cost efficiency.
- Data Governance and Compliance: Establish and maintain data governance frameworks and ensure compliance with the Personal Data Protection Act (PDPA) and other relevant regulations.
- Documentation: Maintain comprehensive documentation of data pipelines, architectures, and processes to ensure knowledge sharing and continuity.
- Innovation: Stay abreast of the latest industry trends, tools, and technologies in data engineering and cloud platforms, and implement innovative solutions.
Essential Experience, Skills and Knowledge:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 8+ years of hands-on experience in data engineering, with a strong focus on data pipeline development and data management.
- Cloud Platforms: Extensive experience with cloud data platforms such as Google Cloud Platform (GCP) or Amazon Web Services (AWS).
- Data Lake Expertise: In-depth experience in designing and managing data lakes.
- Programming Skills: Proficiency in programming languages such as Python, Java, or Scala. Strong experience with SQL for data manipulation and querying.
- Data Tools: Deep knowledge of data processing frameworks such as Apache Spark, Hadoop, or similar technologies.
- Database Systems: Expertise in relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- ETL Processes: Proven experience in designing and implementing ETL (Extract, Transform, Load) processes.
- MLOps: Experience with Machine Learning Operations (MLOps) to streamline and scale machine learning workflows.
- Analytical Skills: Exceptional analytical and problem-solving skills with attention to detail.
- Data Governance and Compliance: Strong knowledge of data governance frameworks and experience in ensuring compliance with PDPA and other relevant regulations.
- Communication: Excellent communication skills in both English and Thai.