Extract, import, standardize, transform, convert, warehouse, archive, and validate large, complex data sets
Transform data into the required formats and develop data models for analytics
Combine information from different data sources for storage in the Data Lake / Data Platform for analysis, and feed it into advanced statistical and predictive analysis algorithms
Ensure data quality, data governance standards, security, and privacy requirements are met
Develop comprehensive guidelines to facilitate data access, extraction, and usage
Required Qualifications and Experience:
Experienced in designing and building ETL/data pipelines, data ingestion and transformation
Bachelor’s degree in Engineering, Data Science, Computer Science, Statistics, Economics or related field
Fluent in writing effective and high-quality code in Python or R
Experienced with SQL language and writing complex queries
Possess good knowledge of cloud infrastructure and deployment tools (AWS, GCP, etc.)
Deep knowledge of both SQL (MySQL, MariaDB, PostgreSQL) and NoSQL databases (Couchbase, MongoDB, etc.)
Skills: MongoDB, NoSQL, Data Science, PostgreSQL
Functions: Engineering
Job Overview
Job Type: Hybrid
Company: Arise by INFINITAS
Industry: Technology
Ready to Apply?
Submit your application now and take the next step in your career journey.