Start Date
Immediate
Expiry Date
05 May, 25
Salary
0.0
Posted On
05 Feb, 25
Experience
2 year(s) or above
Remote Job
No
Telecommute
No
Sponsor Visa
No
Skills
Qlik, ETL, Unstructured Data, Scratch, Data Integration, Bitbucket, SQL, Cloud, Data Engineering, GitHub
Industry
Information Technology/IT
ESSENTIAL SKILLS:
10+ years of experience with Data Warehouses / Data Platforms
5+ years of experience creating ELT data pipelines from scratch, working with structured, semi-structured, and unstructured data and SQL.
2+ years of experience configuring and using data ingestion tools such as Fivetran, Qlik, Airbyte or others.
5+ years of experience with Cloud: GCP
5+ years of experience working as a data developer/data engineer on programming and ETL/ELT processes for data integration.
5+ years of experience with continuous integration and continuous deployment (CI/CD) pipelines, and working with source control systems such as GitHub and Bitbucket, as well as Terraform.
Experience creating ELT data pipelines from scratch
Experience configuring and using data ingestion tools such as Fivetran, Qlik, or Airbyte.
Experience in data modelling, manipulating large data sets, writing raw SQL, and applying other data cleaning techniques.
Experience working with structured, semi-structured, and unstructured data.
Experience building data pipelines, and composable cloud-based data platforms in AWS, Azure, or GCP.
Experience collaborating and working with DevOps and Scrum Teams
Prior experience as a data developer/data engineer, including programming and ETL/ELT processes for data integration.
Demonstrated team player with strong communication skills and a track record of successfully delivering product development.
Expert at problem solving. Good understanding of continuous integration and continuous deployment (CI/CD) pipelines.
Strong scripting skills.
Experience working with source control systems such as GitHub and Bitbucket.