AWS Data Engineer (Advanced) 2781 TT
at Mediro ICT
Pretoria, Gauteng, South Africa
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 04 Jul, 2024 | Not Specified | 05 Apr, 2024 | N/A | PostgreSQL, Confluence, Communication Skills, Specifications, Analytical Skills, Working Model, Collaboration Tools, Technical Documentation, Validation | No | No
Description:
Data Engineers are responsible for building and maintaining Big Data Pipelines using our client’s Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.
QUALIFICATIONS/EXPERIENCE:
Relevant IT / Business / Engineering Degree
Candidates with one or more of the following certifications are preferred:
AWS Certified Cloud Practitioner
AWS Certified SysOps Administrator - Associate
AWS Certified Developer - Associate
AWS Certified Solutions Architect - Associate
AWS Certified Solutions Architect - Professional
HashiCorp Certified: Terraform Associate
ESSENTIAL SKILLS REQUIREMENTS:
SQL - Oracle/PostgreSQL
AWS QuickSight
Business Intelligence (BI) Experience
ADVANTAGEOUS SKILLS REQUIREMENTS:
Demonstrated expertise in data modelling and Oracle SQL.
Exceptional analytical skills for analysing large and complex data sets.
Perform thorough testing and data validation to ensure the accuracy of data transformations.
Strong written and verbal communication skills, with precise documentation.
Self-driven team player with the ability to work independently and multi-task.
Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms.
Experience preparing specifications from which programs are designed, coded, tested, and debugged.
Strong organisational skills; experience working with enterprise collaboration tools such as Confluence and JIRA.
Experience developing technical documentation and artefacts.
Experience working with data quality tools such as Great Expectations.
Experience developing and working with REST APIs is a bonus.
Basic experience in networking and troubleshooting network issues.
Knowledge of the Agile working model.
Creating a network community with all needed data stewards within Group IT.
Self-driven data sourcing within the community.
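The testing and data-validation duties listed above can be sketched in plain Python. The schema, field names, and rules below are illustrative assumptions for this posting, not the client's actual pipeline or the Great Expectations API:

```python
# Minimal sketch of post-transformation data validation.
# The record schema and rules are hypothetical examples.
from datetime import date


def validate_row(row: dict) -> list[str]:
    """Return a list of rule violations for one transformed record."""
    errors = []
    if not row.get("customer_id"):
        errors.append("customer_id is missing")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        errors.append("amount must be a non-negative number")
    if not isinstance(row.get("posted_on"), date):
        errors.append("posted_on must be a date")
    return errors


def validate_batch(rows: list[dict]) -> dict:
    """Summarise validation results for a batch of transformed rows."""
    failures = {
        i: errs
        for i, errs in ((i, validate_row(r)) for i, r in enumerate(rows))
        if errs
    }
    return {"total": len(rows), "failed": len(failures), "failures": failures}
```

In practice a data quality tool such as Great Expectations would express these rules declaratively and report on them per pipeline run; the sketch only shows the underlying idea of rule-based checks over each record.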
ADVANTAGEOUS TECHNICAL SKILLS / TECHNOLOGY:
Terraform
Python 3.x
PySpark
Boto3
Responsibilities:
Please refer to the job description above for details.
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Business, Engineering, IT
Proficient
1
Pretoria, Gauteng, South Africa