Senior Data Engineer - Sales Ops at Kimberly-Clark
Buenos Aires, Buenos Aires, Argentina - Full Time


Start Date

Immediate

Expiry Date

10 May 2025

Salary

Not specified

Posted On

11 Feb 2025

Experience

3+ years

Remote Job

No

Telecommute

No

Sponsor Visa

No

Skills

Relational Databases, Data Exchange, Data Warehouse, MySQL, Databases, Programming Languages, Performance Tuning, Graph Databases, Scala, IT, Reporting, Data Privacy, MongoDB, Scripting, Data Warehousing, Kafka, Integration, Python, Data Systems, Analytics, SQL

Industry

Information Technology/IT

Description

YOUR JOB

You’re not the person who will settle for just any role. Neither are we. Because we’re out to create Better Care for a Better World, and that takes a certain kind of person and teams who care about making a difference. Here, you’ll bring your professional expertise, talent, and drive to building and managing our portfolio of iconic, ground-breaking brands. In your role, you’ll help us deliver better care for billions of people around the world. It starts with YOU.

ABOUT US

Huggies®. Kleenex®. Cottonelle®. Scott®. Kotex®. Poise®. Depend®. Kimberly-Clark Professional®. You already know our legendary brands—and so does the rest of the world. In fact, millions of people use Kimberly-Clark products every day. We know these amazing Kimberly-Clark products wouldn’t exist without talented professionals, like you.
At Kimberly-Clark, you’ll be part of the best team committed to driving innovation, growth and impact. We’re founded on 150 years of market leadership, and we’re always looking for new and better ways to perform – so there’s your open door of opportunity. It’s all here for you at Kimberly-Clark; you just need to log on!
Led by Purpose. Driven by You.

KEY QUALIFICATIONS AND EXPERIENCES:

  • 5+ years of experience tailoring, configuring, and crafting solutions within the Snowflake environment, including a deep grasp of Snowflake’s data warehousing capabilities, data architecture, SQL optimization for Snowflake, and Snowflake-specific features such as Snowpipe, Streams, and Tasks for real-time data processing and analytics (see the first sketch after this list). A strong foundation in data migration strategies, performance tuning, and securing data within the Snowflake ecosystem is essential.
  • 3+ years demonstrated expertise in architecting solutions within the Snowflake ecosystem, adhering to best practices in data architecture and designs.
  • 10+ years of data engineering or design experience designing, developing, and deploying scalable enterprise data analytics solutions, from source systems through ingestion and reporting.
  • Expertise in data modeling principles/methods, including Conceptual, Logical & Physical Data Models for data warehouses, data lakes, and/or database management systems.
  • 5+ years of hands-on experience designing, building, and operationalizing data solutions and applications using cloud data and analytics services in combination with 3rd parties.
  • 10+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols).
  • 10+ years of experience with database development and scripting.
  • Deep understanding of data architecture, data engineering, data warehousing, data analysis, reporting, and data science techniques and workflows: You should have a comprehensive knowledge of designing and implementing data systems that support various analytic and operational use cases, including data storage, processing, and retrieval.
  • Skilled in creating data products that support analytic solutions: Proficiency in developing data products that enable stakeholders to derive meaningful insights and make data-driven decisions. This involves creating datasets, data models, and data services tailored to specific business needs.
  • Proficiency in working with APIs and understanding data structures to serve them: Experience in designing, developing, and consuming APIs for data access and integration. This includes understanding various data structures and formats used in API communication.
  • Experience with Object-Relational Mapping (ORM) frameworks: Familiarity with ORM frameworks, such as Hibernate or Entity Framework, to efficiently map data between relational databases and application code (a brief SQLAlchemy illustration follows this list).
  • Knowledge of managing sensitive data, ensuring data privacy and security: Expertise in handling sensitive data with strict adherence to data privacy regulations and security best practices to protect against unauthorized access and breaches.
  • Expertise in data visualization tools, specifically Power BI: Proficiency in using data visualization tools like Power BI to create interactive and insightful dashboards and reports that effectively communicate complex data insights.
  • Strong problem-solving skills and ability to work as part of a technical, cross-functional analytics team: Excellent analytical and troubleshooting abilities, with the capability to collaborate effectively with team members from various technical and business domains.
  • Experience with relational and non-relational databases (NoSQL, graph databases, etc.): Solid experience in working with different types of databases, including traditional relational databases (e.g., SQL Server, MySQL) and non-relational databases (e.g., MongoDB, graph databases).
  • Agile learner with a passion for solving complex data problems and delivering insights: A proactive and continuous learner with enthusiasm for addressing challenging data issues and providing valuable insights through innovative solutions.
  • Proficiency in programming and query languages such as SQL, Python, Java, R, and Scala, plus NoSQL query dialects: Strong coding skills across the languages used for data manipulation, analysis, and pipeline development.
  • Familiarity with relational and non-relational databases and related data-access technologies, including GraphQL and MongoDB: In-depth understanding of both relational (SQL-based) and non-relational (NoSQL) databases, with specific experience in MongoDB and in GraphQL (an API query language rather than a database itself).
  • Experience with ETL (extract, transform, and load) systems and API integrations: Expertise in building and maintaining ETL processes to consolidate data from various sources into centralized repositories, and integrating APIs for seamless data exchange.
  • Understanding of data storage solutions, knowing when to use a data lake versus a data warehouse: Knowledge of different data storage architectures and the ability to choose the appropriate solution (data lake or data warehouse) based on specific use cases and data characteristics.
  • Ability to write scripts for automation and repetitive task management: Proficiency in scripting languages (e.g., Python, Bash) to automate data processing tasks and reduce manual efforts.
  • Basic understanding of machine learning concepts to support data scientists on the team: Familiarity with key machine learning principles and techniques to better collaborate with data scientists and support their analytical models.
  • Proficiency with big data tools such as Hadoop, MongoDB, and Kafka: Experience in using big data technologies to manage, process, and analyze large datasets efficiently.
  • Knowledge of cloud computing, including cloud storage, product portfolios, and pricing models (Azure): Understanding of cloud platforms and services, particularly Azure, including storage options, available tools, and cost considerations.
  • Experience in data security, ensuring data is securely managed and stored to protect it from loss or theft while maintaining compliance: Strong background in implementing security measures to safeguard data and comply with regulatory requirements.
  • Bachelor’s degree in Management Information Systems/Technology, Computer Science, Engineering, or a related discipline. MBA or equivalent is preferred.
  • 10+ years of experience designing large-scale data solutions, performing design assessments, crafting and analyzing design options, and finalizing the preferred solution in partnership with IT and business stakeholders.
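
As a hedged illustration of the Streams and Tasks pattern named in the first qualification above, here is a minimal Python sketch that issues Snowflake SQL through the snowflake-connector-python package. Every name below (account, warehouse, RAW_ORDERS, ORDERS_STREAM, LOAD_ORDERS_TASK) is a hypothetical placeholder, not a detail from this posting.

    # Minimal sketch of change-data-capture in Snowflake with a Stream and a
    # Task, issued from Python via snowflake-connector-python. All connection
    # parameters and object names are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",        # hypothetical account identifier
        user="etl_user",
        password="...",              # use a secrets manager in practice
        warehouse="ANALYTICS_WH",
        database="SALES_OPS",
        schema="STAGING",
    )
    cur = conn.cursor()

    # A stream records row-level changes on the source table so downstream
    # consumers read only what changed since the stream was last queried.
    cur.execute("CREATE STREAM IF NOT EXISTS ORDERS_STREAM ON TABLE RAW_ORDERS")

    # A task polls on a schedule and runs only when the stream has data,
    # loading new rows into the curated table.
    cur.execute("""
        CREATE TASK IF NOT EXISTS LOAD_ORDERS_TASK
          WAREHOUSE = ANALYTICS_WH
          SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
        AS
          INSERT INTO CURATED_ORDERS
          SELECT order_id, customer_id, amount, updated_at
          FROM ORDERS_STREAM
          WHERE METADATA$ACTION = 'INSERT'
    """)

    # Tasks are created suspended; resuming starts the schedule.
    cur.execute("ALTER TASK LOAD_ORDERS_TASK RESUME")

The WHEN clause keeps the warehouse from spinning up when there are no new rows, which is the cost-control half of the real-time pattern.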
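
The ORM qualification names Hibernate and Entity Framework; as a rough Python analogue, here is a minimal sketch using SQLAlchemy. The Customer model and the in-memory SQLite URL are illustrative assumptions.

    # Minimal object-relational mapping sketch with SQLAlchemy, a Python
    # analogue of Hibernate / Entity Framework.
    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Customer(Base):
        __tablename__ = "customers"
        id = Column(Integer, primary_key=True)
        name = Column(String, nullable=False)
        region = Column(String)

    engine = create_engine("sqlite:///:memory:")  # swap in a real DB URL
    Base.metadata.create_all(engine)              # emits CREATE TABLE

    with Session(engine) as session:
        session.add(Customer(name="Acme Distributors", region="LATAM"))
        session.commit()
        # The ORM generates the SQL and maps result rows back to objects.
        latam = session.query(Customer).filter_by(region="LATAM").all()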
Responsibilities
  • Design and operationalize enterprise data solutions on Cloud Platforms: Develop and implement scalable and secure data solutions on cloud platforms, ensuring they meet enterprise standards and requirements. This includes designing data architecture, selecting appropriate cloud services, and optimizing performance for data processing and storage.
  • Integrate Azure services, Snowflake technology, and other third-party data technologies: Seamlessly integrate various data technologies, including Azure services, Snowflake, and other third-party tools, to create a cohesive data ecosystem. This involves configuring data connectors, ensuring data flow consistency, and managing dependencies between different systems.
  • Build and maintain high-quality data pipelines for analytic solutions: Develop robust data pipelines that automate the extraction, transformation, and loading (ETL) of data from various sources into a centralized data warehouse or lake (see the pipeline sketch after this list). Ensure these pipelines are efficient, reliable, and capable of handling large volumes of data.
  • Collaborate with a multidisciplinary agile team to generate insights from connected data: Work closely with data scientists, analysts, and other team members in an agile environment to translate business requirements into technical solutions. Participate in sprint planning, stand-ups, and retrospectives to ensure timely delivery of data products.
  • Manage and create data inventories for analytics and APIs to be consumed: Develop and maintain comprehensive data inventories that catalog available data assets and their metadata. Ensure these inventories are accessible and usable by various stakeholders, including through APIs that facilitate data consumption.
  • Design data integrations with internal and external products: Architect and implement data integration solutions that enable seamless data exchange between internal systems and external partners or products. This includes ensuring data integrity, security, and compliance with relevant standards.
  • Build data visualizations to support analytic insights: Create intuitive and insightful data visualizations using tools like Power BI to help stakeholders understand complex data sets and derive actionable insights. This involves designing dashboards, reports, and interactive visualizations that effectively communicate key metrics and trends.
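
As a hedged sketch of the pipeline responsibility above: a minimal extract-transform-load flow in plain Python with pandas and requests. The source endpoint and output path are hypothetical; a production pipeline would add retries, incremental loads, and monitoring, run under an orchestrator, and land data in a warehouse rather than a local file.

    # Minimal extract-transform-load sketch in plain Python. The endpoint
    # and output path are hypothetical placeholders.
    import pandas as pd
    import requests

    def extract(url: str) -> pd.DataFrame:
        """Pull raw records from a (hypothetical) source API."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return pd.DataFrame(response.json())

    def transform(raw: pd.DataFrame) -> pd.DataFrame:
        """Standardize column names and drop rows failing basic checks."""
        clean = raw.rename(columns=str.lower)
        clean = clean.dropna(subset=["order_id"])
        clean["amount"] = pd.to_numeric(clean["amount"], errors="coerce")
        return clean

    def load(df: pd.DataFrame, target: str) -> None:
        """Land the curated data; in practice, a warehouse write instead."""
        df.to_parquet(target, index=False)  # requires pyarrow or fastparquet

    if __name__ == "__main__":
        frame = extract("https://example.com/api/orders")  # hypothetical
        load(transform(frame), "curated_orders.parquet")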