Role and Responsibilities / หน้าที่ความรับผิดชอบ
- Design, build, and manage scalable data infrastructure to process and analyze large volumes of data from multiple business units.
- Collaborate with stakeholders to build trusted, reliable, and innovative data products and insights that meet business needs.
- Develop an enterprise-grade data platform to minimize the time from question to insight by implementing efficient processes and enabling self-service analytics.
- Consolidate and secure our data as a uniform, trusted asset through data protection and privacy practices, iterating on processes, people, and platforms.
- Improve data literacy across the KKP organization and build a strong data-driven culture.
Qualifications / คุณสมบัติ
- You have proven experience independently delivering data solutions from start to finish.
- You have a strong background in at least one of the following: large-scale data processing, software engineering for data infrastructure, or data modeling.
- You are resilient and have the grit to solve problems regardless of obstacles.
- You enjoy collaborating with teams to push the boundaries of analytical insights and data products, and you have good data intuition for building the right solutions.
- Your work exemplifies good engineering fundamentals, including testable code, efficient development cycles, and excellent documentation.
Specific knowledge and skills / ความรู้เฉพาะตำแหน่ง
- You are proficient in at least one major programming language (e.g., Python, Scala, Java) and comfortable working with SQL.
- You are familiar with big data technologies like Spark or Flink. Bonus points if you have experience working with popular data processing platforms such as Databricks or Snowflake.
- You are familiar with job orchestration tools like Airflow, Azure Data Factory, Luigi, or Mage.