Start Date: Immediate
Expiry Date: 06 May, 25
Salary: 0.0
Posted On: 06 Feb, 25
Experience: 0 year(s) or above
Remote Job: No
Telecommute: No
Sponsor Visa: No
Skills: Good communication skills
Industry: Information Technology/IT
WE ARE LOOKING FOR YOU, IF YOU HAVE:
hands-on experience building complex data pipelines,
experience with Apache Airflow, Spark, Python,
experience in setting up and optimizing both SQL and NoSQL data stores (MS-SQL, Hive), as well as familiarity with object storage services (e.g., S3),
experience with deployment and provisioning automation tools (e.g., Docker, Kubernetes, CI/CD).
work with the latest concepts and technologies in the field of Data Engineering,
as an engineer, you will be responsible for designing and building data pipelines for one of many use cases, catering to various stakeholder requirements,
your day-to-day activities can look like this: implementing new features in Kedro/PySpark, testing them in Apache Airflow, deploying them on our in-house platform running on Kubernetes, and optimizing the data in the permanent data store.
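To make the Kedro-style workflow above concrete, here is a minimal, stdlib-only sketch of the core idea behind such pipelines: each node is a pure function with declared inputs and an output name, so a runner can execute them in order against a data catalog. All names (`Node`, `run_pipeline`, the sample functions) are illustrative assumptions, not the actual Kedro API or this employer's codebase.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Node:
    """One pipeline step: a function plus the catalog keys it reads and writes."""
    func: Callable
    inputs: list
    output: str

def run_pipeline(nodes, catalog):
    """Run nodes in order, reading inputs from and writing outputs to the catalog."""
    for node in nodes:
        args = [catalog[name] for name in node.inputs]
        catalog[node.output] = node.func(*args)
    return catalog

# Hypothetical example: drop invalid records, then aggregate.
def clean(raw):
    return [r for r in raw if r.get("salary", 0) > 0]

def total_salary(rows):
    return sum(r["salary"] for r in rows)

catalog = {"raw": [{"salary": 100}, {"salary": 0}, {"salary": 50}]}
pipeline = [
    Node(clean, ["raw"], "clean"),
    Node(total_salary, ["clean"], "total"),
]
run_pipeline(pipeline, catalog)
print(catalog["total"])  # 150
```

In a real Kedro project the runner also resolves node ordering from the input/output declarations (a DAG), and the catalog persists datasets to stores like S3 rather than an in-memory dict; an orchestrator such as Apache Airflow would then schedule the pipeline runs.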