Data Engineer Sr (AWS) Hybrid GDL at NTT DATA
Guadalajara, Jalisco, Mexico
Full Time


Start Date

Immediate

Expiry Date

14 Apr, 26

Salary

Not specified

Posted On

14 Jan, 26

Experience

5 years or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

ETL, Data Pipelines, SQL, Relational Databases, Data Governance, Cloud Platforms, AWS, Data Modeling, Data Warehousing, Programming, Python, Java, Scala, Data Integration, Machine Learning, Agile

Industry

IT Services and IT Consulting

Description
Responsibilities:
- Design and implement ETL (Extract, Transform, Load) processes to clean and prepare data for analysis.
- Build scalable data pipelines to ensure smooth data flow and accessibility.
- Work with data scientists, analysts, and other stakeholders to understand data needs and provide solutions.
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond, within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate strong coding skills to move solutions into production efficiently, prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks.
- Develop and deliver detailed presentations to communicate complex technical concepts effectively.
- Produce comprehensive solution documentation, including sequence diagrams, class hierarchies, and logical system views.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
- Ensure data quality, consistency, and governance across multiple sources.

Requirements:
- 6-8 years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team on data-related projects, developing end-to-end technical solutions.
- Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Experience with data pipeline tools (e.g., Apache Airflow, Luigi, Prefect).
- Proficiency in at least one programming language (Python, Java, or Scala).
- Experience with cloud platforms (AWS) and their data services (e.g., Redshift, BigQuery, Snowflake, Databricks).
- Familiarity with data modeling, data warehousing, and schema design.
- Understanding of data governance, privacy, and security best practices.
- Strong understanding of data integration technologies.
- Professional written and verbal communication skills, with the ability to convey complex technical concepts.
- Exposure to machine learning data pipelines.
Responsibilities
Design and implement ETL processes and scalable data pipelines for data analysis. Collaborate with stakeholders to provide tailored data solutions and ensure data quality and governance.