Data Engineer (Hybrid-Gdl) at NTT DATA
Guadalajara, Jalisco, Mexico
Full Time


Start Date

Immediate

Expiry Date

14 Jan, 26

Salary

Not specified

Posted On

16 Oct, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Data Analytics, SQL, Relational Databases, Data Pipeline Tools, Python, Java, Scala, Cloud Platforms, Data Services, Data Modeling, Data Warehousing, Data Governance, Data Integration, Machine Learning

Industry

IT Services and IT Consulting

Description
Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond, within a dynamically evolving technical stack.

Responsibilities

- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, and logical system views.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.
- Ensure data quality, consistency, and governance across multiple sources.

Requirements

- 6-8 years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 2+ years of experience leading a team on data-related projects to develop end-to-end technical solutions.
- Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Experience with data pipeline tools (e.g., Apache Airflow, Luigi, Prefect).
- Proficiency in at least one programming language (Python, Java, or Scala).
- Experience with cloud platforms (AWS, Azure, or GCP) and their data services (e.g., Redshift, BigQuery, Snowflake, Databricks).
- Familiarity with data modeling, data warehousing, and schema design.
- Understanding of data governance, privacy, and security best practices.
- Strong understanding of data integration technologies.
- Professional written and verbal communication skills to effectively convey complex technical concepts.
Exposure to machine learning data pipelines is a plus.
Responsibilities
Design and implement tailored data solutions to meet customer needs, covering a wide range of data technologies. Collaborate across diverse technical stacks and ensure data quality and governance.