SR. GCP DATA ENGINEER, SMAI at Micron Technology
Hyderabad, Telangana, India - Full Time


Start Date

Immediate

Expiry Date

12 Feb, 26

Salary

Not disclosed

Posted On

14 Nov, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, ETL Processes, GCP, BigQuery, Snowflake, SQL, NoSQL, Apache NiFi, Data Management Systems, Data Quality, Data Integrity, Data Structures, Data Analysis, Problem Solving, Communication Skills, Software Development

Industry

Semiconductor Manufacturing

Description
Understand the Business Problem and the Relevant Data
- Maintain an understanding of company and department strategy
- Translate analysis requirements into data requirements
- Identify and understand the data sources that are relevant to the business problem
- Develop conceptual models that capture the relationships within the data
- Define the data-quality objectives for the solution
- Be a subject matter expert in data sources and reporting options

Architect Data Management Systems
- Design and implement optimum data structures in the appropriate data management system (GCP BigQuery, Snowflake, SQL Server, etc.) to satisfy the data requirements

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data
- Develop processes to efficiently load the transformed data into the data management system
- Work with the data scientist to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.); a minimal sketch of such a prepare-and-load step follows this description
- Develop and code data extracts
- Follow standard methodologies to ensure data quality and data integrity
- Ensure that the data is fit to use for data science applications

Qualifications
- 5-8 years of experience developing, delivering, and/or supporting data engineering, advanced analytics, or business intelligence solutions
- Ability to work with multiple operating systems and generic tools
- Experienced in developing ETL/ELT processes using Apache NiFi and cloud solutions such as GCP, BigQuery, and Snowflake, or equivalents
- Significant experience with big data processing and/or developing applications and data sources using different cloud services
- Experienced in integrating with ingestion, scheduling, logging, alerting, and monitoring cloud services
- Understanding of how distributed systems work
- Familiarity with software architecture (data structures, data schemas, etc.)
- Strong working knowledge of databases, including SQL and NoSQL (cloud databases such as BigQuery, Snowflake, AlloyDB, or equivalents)
- Strong mathematics background and analytical, problem-solving, and organizational skills
- Strong communication skills (written, verbal, and presentation)
- Experience working in a global, multi-functional environment
- Minimum of 2 years' experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl); at least one web programming language (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.) with any ETL tool experience (SSIS, Informatica, etc.); software development
- Ability to travel as needed
- B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics, or a related field of study. M.S.
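As a rough illustration of the prepare-and-load work described above, the following is a minimal Python sketch using pandas and the google-cloud-bigquery client; the file, project, dataset, table, and column names (daily_extract.csv, my-project.analytics.daily_yield, lot_id, yield_pct) are placeholders, not actual Micron systems.

```python
# Minimal sketch: clean a transformed extract and load it into BigQuery.
# All dataset, table, and column names below are placeholders.
import pandas as pd
from google.cloud import bigquery


def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Basic preparation: drop rows missing the key, cap numeric outliers."""
    df = df.dropna(subset=["lot_id"])                       # handle missing data
    q_low, q_high = df["yield_pct"].quantile([0.01, 0.99])
    df["yield_pct"] = df["yield_pct"].clip(q_low, q_high)   # handle outliers
    return df


def load_to_bq(df: pd.DataFrame, table_id: str) -> None:
    """Load the prepared DataFrame, replacing the target table."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(write_disposition="WRITE_TRUNCATE")
    client.load_table_from_dataframe(df, table_id, job_config=job_config).result()


if __name__ == "__main__":
    frame = clean(pd.read_csv("daily_extract.csv"))
    load_to_bq(frame, "my-project.analytics.daily_yield")
```

In practice a step like this would typically run inside an orchestrated, scheduled pipeline (e.g., Apache NiFi or another workflow tool) rather than as a standalone script.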
Responsibilities
The role involves understanding business problems and relevant data, translating analysis requirements into data requirements, and developing data management systems. The engineer will also automate ETL processes and ensure data quality for data science applications.
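As a rough illustration of the data-quality assurance mentioned above, the following is a minimal post-load check in Python against BigQuery; the table and column names are placeholders, not actual Micron systems.

```python
# Minimal sketch of a post-load data-quality gate; the project, dataset,
# table, and column names are placeholders.
from google.cloud import bigquery


def check_table(table_id: str, key_column: str) -> None:
    """Fail fast if the table is empty or the key column contains NULLs."""
    client = bigquery.Client()
    query = f"""
        SELECT
            COUNT(*) AS row_count,
            COUNTIF({key_column} IS NULL) AS null_keys
        FROM `{table_id}`
    """
    row = next(iter(client.query(query).result()))
    if row.row_count == 0:
        raise ValueError(f"{table_id} is empty")
    if row.null_keys > 0:
        raise ValueError(f"{table_id} has {row.null_keys} NULL values in {key_column}")


if __name__ == "__main__":
    check_table("my-project.analytics.daily_yield", "lot_id")
```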