Data Engineer Advisor at NTT DATA
Montreal, Quebec, Canada - Full Time


Start Date

Immediate

Expiry Date

15 Apr, 26

Salary

0.0

Posted On

15 Jan, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Python, SQL, Data Pipelines, Data Warehousing, Performance Tuning, Agile Practices, Data Validation, Cloud Platforms, Data Lakes, Analytics, Data Integration, Data Solutions, Collaboration, Problem Solving, E-R Data Models, Testing Frameworks

Industry

IT Services and IT Consulting

Description
- Design and implement tailored data solutions to meet customer needs and use cases, spanning streaming, data lakes, analytics, and beyond, within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding, using Python and SQL to move solutions into production efficiently while prioritizing performance, security, scalability, and robust data integration.
- Collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, Azure, AWS, etc.
- Develop various components of our unified data pipeline framework in Python.
- Contribute to establishing best practices for the optimal and efficient use of data across on-prem and cloud platforms.
- Assist with the testing and deployment of our data pipeline framework using standard testing frameworks and CI/CD tooling.
- Monitor the performance of queries and data loads and perform tuning as necessary.
- Provide assistance and guidance during the QA and UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.
- Develop SQL-based data validation, reporting, and analysis as required.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, and logical system views.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

- 4+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects.
- 4+ years of experience in data development and solutions in highly complex data environments with large data volumes.
- 3+ years of SQL/PL-SQL experience, with the ability to write ad-hoc and complex queries for data analysis.
- 3+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark.
- 2+ years of experience developing solutions in a hybrid data environment (on-prem and cloud).
- Hands-on experience developing data pipelines for structured, semi-structured, and unstructured data, and integrating with their supporting stores (e.g., RDBMS, NoSQL databases, document databases, log files).
- Experience with performance tuning of SQL queries, Spark jobs, and stored procedures.
- An understanding of E-R data models (conceptual, logical, and physical).
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
- Able to collaborate effectively across a variety of IT and business groups, regions, and roles, and to interact effectively with all levels.
- Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision; can manage a complex, ever-changing priority list and resolve conflicts among competing priorities.
- Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.
- Knowledge of regulatory requirements in the financial industry.

Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).

Minimum Skills Required: 5+ years of experience.
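For context on the kind of work described above (Python pipeline components with PySpark and SQL-based data validation), the following is a minimal illustrative sketch. It is not part of the posting or NTT DATA's actual framework; the table, paths, and column names are hypothetical placeholders.

# Minimal sketch of a PySpark pipeline step with a SQL-based validation check.
# All paths and column names are hypothetical, for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline_sketch").getOrCreate()

# Extract: load raw data (placeholder path).
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: basic typing and de-duplication.
orders_clean = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
)

# SQL-based validation: fail the load if required fields are missing or invalid.
orders_clean.createOrReplaceTempView("orders_clean")
bad_rows = spark.sql("""
    SELECT COUNT(*) AS n
    FROM orders_clean
    WHERE order_id IS NULL OR amount IS NULL OR amount < 0
""").collect()[0]["n"]

if bad_rows > 0:
    raise ValueError(f"Validation failed: {bad_rows} invalid rows")

# Load: write the validated data to a curated zone (placeholder path).
orders_clean.write.mode("overwrite").parquet("/data/curated/orders")

In practice, a check like this would typically be exercised through the standard testing frameworks and CI/CD tooling mentioned above; the specific tools are left to the team.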
Responsibilities
Design and implement data solutions tailored to customer needs across a variety of technical stacks. Collaborate across teams to establish best practices and to monitor and tune data pipeline performance.