Data Engineer
at Gathern
Riyadh, Riyadh Province, Saudi Arabia
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
---|---|---|---|---|---|---|---|
Immediate | 16 Dec, 2024 | Not Specified | 18 Sep, 2024 | 3 year(s) or above | Relational Databases, AWS, Information Technology, Programming Languages, Airflow, Talend, Data Modeling, Kubernetes, Azure, Communication Skills, Statistics, Google Cloud Platform, SQL, Data Engineering, Docker, Snowflake, ETL Tools, Computer Science, Kafka | No | No |
Description:
- Develop and maintain scalable ETL (Extract, Transform, Load) pipelines that ingest, clean, and structure data from various sources (internal and external).
- Data Infrastructure: Design, implement, and maintain cloud-based data storage solutions (e.g., GCP, AWS, Azure, Snowflake), ensuring data availability, reliability, and scalability.
- Collaborate with data governance teams to ensure that all data adheres to quality standards, security protocols, and compliance with relevant regulations (e.g., GDPR, CCPA).
- Continuously monitor and optimize data pipelines, database performance, and storage solutions to ensure efficient data processing and access.
- Work closely with data scientists, analysts, and business intelligence teams to understand data requirements and provide reliable data solutions.
- Automate recurring data tasks, such as data cleaning, transformation, and loading, so the system operates smoothly with minimal manual intervention.
- Design and implement database schemas, dimensional models, and data lakes to support analytics and business intelligence efforts.
- Create and maintain detailed documentation for data pipelines, infrastructure, and workflows to ensure transparency and ease of collaboration.
- Implement and maintain data security measures such as role-based access control (RBAC), encryption, and backups to protect sensitive data.
- Set up monitoring tools and proactively identify and resolve data-related issues (performance bottlenecks, failures, etc.).
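The ETL duties above can be sketched as a minimal extract-transform-load pipeline. This is an illustrative sketch only, not Gathern's actual stack: the field names, sample records, and SQLite sink are all hypothetical.

```python
import sqlite3

# Hypothetical raw records standing in for an upstream source system.
RAW_SOURCE = [
    {"booking_id": "1", "city": "Riyadh ", "nights": "3"},
    {"booking_id": "2", "city": None, "nights": "2"},      # missing city: dropped
    {"booking_id": "3", "city": "Jeddah", "nights": "x"},  # bad nights value: dropped
]

def extract(source):
    """Ingest raw records from a source system."""
    return list(source)

def transform(rows):
    """Clean and structure: drop incomplete rows, normalize types."""
    clean = []
    for row in rows:
        if not row.get("city"):
            continue
        try:
            nights = int(row["nights"])
        except (TypeError, ValueError):
            continue
        clean.append((row["booking_id"], row["city"].strip(), nights))
    return clean

def load(rows, conn):
    """Load structured rows into the sink table and report the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS bookings (id TEXT, city TEXT, nights INTEGER)")
    conn.executemany("INSERT INTO bookings VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM bookings").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_SOURCE)), conn)
print(loaded)  # only the one clean record survives
```

In a production setting each step would typically be an orchestrated task (e.g., in Airflow or a similar scheduler) rather than a direct function call, with the sink being a warehouse such as BigQuery or Snowflake.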
REQUIREMENTS
- Bachelor’s degree in a field such as computer science, information technology, statistics, or mathematics.
- A minimum of 3 years of experience in data engineering or a related field.
- Strong experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Understanding of data modeling, data lakes, and data warehouses.
- Knowledge of data warehousing solutions such as BigQuery, Snowflake, or Redshift.
- Proficiency with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), with GCP preferred.
- Proficiency in programming languages such as Python.
- Experience with big data technologies such as Apache Hadoop, Spark, and Kafka.
- Familiarity with ETL tools like Airflow, Talend, or Apache NiFi.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Ability to analyze and optimize large, complex datasets, with business acumen and an objectives-oriented mindset.
- Strong problem-solving skills and ability to troubleshoot complex data and infrastructure issues.
- Excellent communication skills to work effectively with cross-functional teams (e.g., analytics, product, marketing, and growth).
- Familiarity with CI/CD practices for data pipelines.
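The SQL and data-modeling requirements above are the kind of work a small star schema illustrates. The sketch below is a hypothetical example, not a schema from the posting: the table names (`dim_city`, `fact_stay`), columns, and figures are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A dimension table and a fact table referencing it: the core star-schema shape.
conn.executescript("""
    CREATE TABLE dim_city (
        city_id INTEGER PRIMARY KEY,
        name    TEXT NOT NULL
    );
    CREATE TABLE fact_stay (
        stay_id INTEGER PRIMARY KEY,
        city_id INTEGER NOT NULL REFERENCES dim_city(city_id),
        nights  INTEGER NOT NULL,
        revenue REAL NOT NULL
    );
""")
conn.executemany("INSERT INTO dim_city VALUES (?, ?)",
                 [(1, "Riyadh"), (2, "Jeddah")])
conn.executemany("INSERT INTO fact_stay VALUES (?, ?, ?, ?)",
                 [(10, 1, 3, 900.0), (11, 1, 2, 650.0), (12, 2, 4, 1200.0)])

# Typical BI rollup: revenue per city via a dimension join.
rows = conn.execute("""
    SELECT d.name, SUM(f.revenue)
    FROM fact_stay f JOIN dim_city d USING (city_id)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('Jeddah', 1200.0), ('Riyadh', 1550.0)]
```

The same shape carries over to BigQuery, Snowflake, or Redshift; only the DDL dialect and load mechanics change.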
How To Apply:
In case you would like to apply to this job directly from the source, please click here
Responsibilities:
Please refer to the job description for details
REQUIREMENT SUMMARY
Min: 3.0 | Max: 8.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
A field such as computer science, information technology, statistics, or mathematics
Proficient
1
Riyadh, Saudi Arabia