Cloud Data Engineer at NTT DATA
Nashville, Tennessee, United States
Full Time


Start Date

Immediate

Expiry Date

20 Jan 2026

Salary

0.0

Posted On

22 Oct 2025

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, ETL, Databricks, Snowflake, SQL, Python, SDLC, Data Analysis, Data Pipelines, Cloud Services, Data Warehousing, Data Integration, Data Security, GitHub, Learning Agility, Adaptability

Industry

IT Services and IT Consulting

Description
Data Engineering: Utilize your ETL/data engineering expertise in Databricks, Snowflake, and cloud data services to build and maintain robust data solutions. Lead the evaluation of candidate catalogs and query engines for the Data Lakehouse platform, documenting findings and reviewing them with the architecture teams. SQL, Python, and strong knowledge of the SDLC are required. Build and manage dozens of data pipelines that source and transform data based on business requirements.

Financial Data Analysis: Apply your knowledge of financial data analysis, risk, and compliance data management to support our financial services customers.

Data Analysis and Discovery: Leverage Databricks and Snowflake for data analysis and discovery, ensuring data is accessible and actionable. Draw on 10+ data sources to derive insights.

Innovation and Learning: Quickly learn new technologies by applying your current skills, staying ahead of industry trends and advancements. Self-identify skills to develop and fold new technologies into your skill set within a month.

Client Collaboration: Work closely with financial services clients to build modern data solutions that transform how they leverage data for key business decisions, investment portfolio performance analysis, and risk and compliance management. Manage multiple stakeholder groups and their requirements.

Team Collaboration: Collaborate within a pod of 4+ data engineers, working toward common objectives in a consultative fashion with clients.

Data Movement and Transformation: Use cloud-native ETL services (e.g., AWS Glue or SnapLogic) along with Python and SQL for data movement, streaming, and transformation, ensuring efficient and reliable data workflows (an illustrative sketch follows this section).

Industry Leadership: Work with a client that leads the industry in using data to drive business decision optimization and investment management strategies.

Experience: At least 5 years of experience in data engineering.

Technical Skills: Proficiency in Snowflake and cloud data services; 5+ years of experience working with cloud services; 2+ years working with Databricks or Snowflake; 3+ years working with PySpark. Proficiency in the following areas:
Data Warehousing: Strong knowledge of data warehousing concepts.
Python or Java: Advanced Python or Java programming skills for data engineering and data pipelines.
Data Integration: Proficiency in integrating data from various sources.
Data Platforms: Strong knowledge of data modeling, storage, and design within Snowflake.
Data Security: Experience securing data based on role and policy definitions.
Data Pipelines: Experience building and managing data pipelines.
Cloud Experience: Experience with AWS, Azure, or GCP data services.
GitHub: Proficiency in using GitHub for version control.

Preferred Technical Skills: Apache Iceberg, Databricks, Dremio.

Domain Expertise: Experience in financial data analysis, risk, and compliance data management is nice to have.

Learning Agility: Ability to quickly learn new technologies by applying current skills. Operate within a sprint model: stories are assigned, developed, validated, and peer reviewed. Each story typically involves:
Writing and executing scripts.
Validating results and capturing observations.
Documenting outcomes in Confluence.
Conducting peer reviews and merging code into Git branches.

Architect Reviews: Present deliverables and findings to the architecture team for feedback and alignment.

Adaptability: Ability to adapt to new technologies quickly and efficiently.
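For illustration only, here is a minimal PySpark sketch of the kind of pipeline work described above: read raw data from cloud storage, cleanse and aggregate it, and write a curated dataset back out. All identifiers are hypothetical placeholders (the bucket paths, the trades dataset, and columns such as portfolio_id, trade_ts, and trade_amount are not details from this posting).

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("portfolio-metrics-sketch").getOrCreate()

    # Source: raw trade records landed in cloud object storage (placeholder path).
    trades = spark.read.parquet("s3://example-bucket/raw/trades/")

    # Transform: basic cleansing plus a per-portfolio daily aggregate.
    daily_metrics = (
        trades
        .filter(F.col("trade_amount").isNotNull())
        .withColumn("trade_date", F.to_date("trade_ts"))
        .groupBy("portfolio_id", "trade_date")
        .agg(
            F.count("*").alias("trade_count"),
            F.sum("trade_amount").alias("gross_amount"),
        )
    )

    # Sink: write a curated, partitioned dataset for downstream analysis
    # (in practice the target might be Snowflake, Delta, or Iceberg instead).
    (daily_metrics.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("s3://example-bucket/curated/portfolio_metrics/"))

    spark.stop()

The sink here writes Parquet files for simplicity; in a role like this one, the same read-cleanse-aggregate-write shape would typically land in a Snowflake table or an Iceberg/Delta table via the appropriate connector.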
Responsibilities
Utilize ETL/Data Engineering expertise to build and maintain robust data solutions while collaborating with clients to transform data for key business decisions. Lead the evaluation of data catalogs and query engines, ensuring data is accessible and actionable for financial services customers.