AWS Data Engineer (Contract) - Gauteng/Hybrid
at Full Circle Resourcing
Midrand, Gauteng, South Africa
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 27 May, 2024 | Not Specified | 01 Mar, 2024 | 6 year(s) or above | Collaboration Tools, Validation, Working Experience, Avro, PostgreSQL, XML, Analytical Skills, JSON, Specifications, Glue, Confluence, Technical Documentation, Software Design Patterns, Communication Skills | No | No |
Description:
Our client requires the services of a Data Engineer/Scientist (Senior) - Midrand/Menlyn/Rosslyn/Home Office rotation.
- Amazing brand with cutting-edge technology
- Excellent global team collaboration
- Good work-life balance with flexible hours
- Agile working environment
POSITION: Contract until December 2026
EXPERIENCE: 6-8 YEARS RELATED WORKING EXPERIENCE.
COMMENCEMENT: As soon as possible
QUALIFICATIONS/EXPERIENCE
- South African citizens/residents are preferred.
- Relevant IT / Business / Engineering Degree
- Candidates with one or more of the following certifications are preferred:
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Administrator – Associate
- AWS Certified Developer – Associate
- AWS Certified Solutions Architect – Associate
- AWS Certified Solutions Architect – Professional
- HashiCorp Certified: Terraform Associate
ESSENTIAL SKILLS:
- Terraform
- Python 3.x
- SQL - Oracle/PostgreSQL
- PySpark
- Boto3
ADVANTAGEOUS TECHNICAL SKILLS:
- Demonstrated expertise in data modelling with Oracle SQL.
- Exceptional analytical skills for analysing large and complex data sets.
- Perform thorough testing and data validation to ensure the accuracy of data transformations.
- Strong written and verbal communication skills, with precise documentation.
- Self-driven team player with the ability to work independently and multi-task.
- Experience working with enterprise collaboration tools such as Confluence, JIRA, etc.
- GROUP Cloud Data Hub (CDH)
- GROUP CDEC Blueprint
- Experience developing technical documentation and artefacts.
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV etc.
- Experience working with Data Quality Tools such as Great Expectations.
- Experience developing and working with REST APIs is a bonus.
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
- Familiarity with data stores such as AWS S3, and AWS RDS or DynamoDB.
- Experience and solid understanding of various software design patterns.
- Experience preparing specifications from which programs will be designed, coded, tested, and debugged.
- Strong organizational skills.
Basic experience/understanding of AWS Components (in order of importance):
- Glue
- CloudWatch
Responsibilities:
ROLE:
- Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms.
- Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements on a need-to-know basis.
REQUIREMENT SUMMARY
- Experience: 6.0–8.0 year(s)
- Industry: Information Technology/IT
- Category: IT Software - DBA / Datawarehousing
- Discipline: Software Engineering
- Qualification: Graduate (Business, Engineering, IT)
- Proficiency: Proficient
- Vacancies: 1
- Location: Midrand, Gauteng, South Africa