Start Date
Immediate
Expiry Date
11 Sep, 25
Salary
170,000
Posted On
12 Jun, 25
Experience
4 year(s) or above
Remote Job
Yes
Telecommute
Yes
Sponsor Visa
No
Skills
Airflow, Deployment Strategies, Continuous Integration, Oozie, Hubs, Talend, Modeling
Industry
Information Technology/IT
Cargill’s size and scale allow us to make a positive impact in the world. Our purpose is to nourish the world in a safe, responsible and sustainable way.
Cargill is a family company providing food, ingredients, agricultural solutions and industrial products that are vital for living. We connect farmers with markets so they can prosper. We connect customers with ingredients so they can make meals people love. And we connect families with daily essentials — from eggs to edible oils, salt to skincare, feed to alternative fuel. Our 160,000 colleagues, operating in 70 countries, make essential products that touch billions of lives each day. Join us and reach your higher purpose at Cargill.
This position is in Cargill’s food and bioindustrial business, where manufacturers, retailers, and foodservice companies rely on us to consistently deliver the products and services they need, and use our technical expertise and market knowledge to develop innovative products.
Job Summary
The Senior Professional, Data Engineering job designs, builds and maintains complex data systems that enable data analysis and reporting. With minimal supervision, this job ensures that large sets of data are efficiently processed and made accessible for decision making.
Essential Functions
DATA INFRASTRUCTURE: Prepares data infrastructure to support the efficient storage and retrieval of data.
DATA FORMATS: Examines and determines appropriate data formats to improve data usability and accessibility across the organization.
DATA & ANALYTICAL SOLUTIONS: Develops complex data products and solutions using advanced engineering and cloud-based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
DATA PIPELINES: Develops and maintains streaming and batch data pipelines that ingest data from various sources, transform it into usable information, and move it into data stores such as data lakes and data warehouses (an illustrative sketch follows this list).
DATA SYSTEMS: Reviews existing data systems and architectures to identify areas for improvement and optimization.
STAKEHOLDER MANAGEMENT: Collaborates with multi-functional data and advanced analytics teams to gather requirements and ensure that data solutions meet the functional and non-functional needs of various partners.
DATA FRAMEWORKS: Builds complex prototypes to test new concepts and implements data engineering frameworks and architectures that improve data processing capabilities and support advanced analytics initiatives.
AUTOMATED DEPLOYMENT PIPELINES: Develops automated deployment pipelines that improve the efficiency of code deployments with fit-for-purpose governance.
DATA MODELING: Performs complex data modeling in accordance with the datastore technology to ensure sustainable performance and accessibility.
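As a rough illustration of the pipeline and modeling duties above, the sketch below outlines a minimal daily batch pipeline as an Airflow DAG (Airflow being one of the orchestration tools named in this posting). It is illustrative only and not a Cargill implementation: the DAG id, task names, and the orders example are hypothetical placeholders, and the extract, transform, and load steps are stubbed out.

# Illustrative sketch only: a minimal Airflow 2.x DAG for a daily batch pipeline
# that extracts raw records, conforms them to a data model, and loads them into
# a warehouse table. All names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the previous day's order extracts from a source system
    # into a staging area (e.g. a landing bucket).
    print("extracting raw order files for", context["ds"])


def transform_orders(**context):
    # Placeholder: clean, deduplicate and conform the raw records to the target
    # data model (e.g. an orders fact table keyed by order_id).
    print("transforming order records")


def load_orders(**context):
    # Placeholder: load the conformed data into the target data store
    # (data lake table or warehouse schema).
    print("loading orders into the warehouse")


with DAG(
    dag_id="orders_daily_batch",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)

    extract >> transform >> load

The same structure extends to streaming sources by swapping the batch extract for a consumer task and adjusting the schedule accordingly.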
Minimum and Typical Years of Experience
Minimum requirement of 4 years of relevant work experience. Typically reflects 5 years or more of relevant experience.
PREFERRED EXPERIENCE:
Experience developing modern data architectures, such as data warehouses, data lakes, data meshes, hubs and associated capabilities including ingestion, governance, modeling, observability and more
Experience with transformation and modeling tools, including SQL based transformation frameworks, orchestration and quality frameworks such as dbt, Apache Nifi, Talend, AWS Glue, Airflow, Dagster, Great Expectations, Oozie and others
Strong experience working in DevOps models, with a demonstrable understanding of associated best practices for code management, continuous integration, and deployment strategies (see the sketch below)
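As a rough illustration of the continuous integration and deployment practices mentioned above, the sketch below shows a small pytest-style unit test for a transformation step, the kind of automated check typically run in a deployment pipeline before code is promoted. It is illustrative only: the conform_orders function, its columns, and the use of pandas are assumptions made for the example, not part of this role's actual codebase.

# Illustrative sketch only: a unit test for a hypothetical transformation,
# suitable for running in a CI job (e.g. with pytest) prior to deployment.
import pandas as pd


def conform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical transformation: drop duplicate orders and standardise
    # the currency column before loading.
    out = raw.drop_duplicates(subset="order_id").copy()
    out["currency"] = out["currency"].str.upper()
    return out


def test_conform_orders_removes_duplicates_and_normalises_currency():
    raw = pd.DataFrame(
        {
            "order_id": [1, 1, 2],
            "currency": ["usd", "usd", "eur"],
        }
    )
    result = conform_orders(raw)

    assert list(result["order_id"]) == [1, 2]
    assert list(result["currency"]) == ["USD", "EUR"]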
Please refer to the job description for details.