Data Engineer II at Microsoft
Redmond, Washington, United States
Full Time


Start Date

Immediate

Expiry Date

20 Feb, 26

Salary

0.0

Posted On

22 Nov, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, APIs, DevOps, DataOps, Security, Networking, Automation, Collaboration, Data Governance, Data Compliance, Data Security, Data Platforms, Backend Services, Containerization, Lakehouse, Warehouse Paradigms, BI Enablement

Industry

Software Development

Description
- Develop services and APIs that expose curated datasets and features to downstream applications and ML/analytics scenarios; contribute to backend components, testing, and performance tuning.
- Harden security and networking for data workloads (e.g., private endpoints, VNet/VPN integration, role-based access, key management), partnering with security and networking teams.
- Automate and productize: apply DevOps/DataOps practices (GitHub/Azure DevOps, CI/CD, IaC such as Bicep/Terraform) to deliver reliable releases, repeatable environments, and cost-efficient operations.
- Collaborate and document: write design docs, review code, and work cross-functionally with PMs, data scientists, and stakeholders.
- Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 1+ year(s) of experience in business analytics, data science, software development, data modeling, or data engineering,
- OR Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 2+ years of experience in business analytics, data science, software development, data modeling, or data engineering,
- OR equivalent experience.
- 1+ year(s) of experience with data governance, data compliance, and/or data security.
- 3-5+ years building and operating data platforms/services at scale (batch + streaming), including cost management and observability (e.g., Azure Monitor/Log Analytics/Azure Data Explorer).
- Experience developing backend services/APIs (REST/gRPC), event-driven patterns, and containerization (Docker/Kubernetes).
- Hands-on experience with lakehouse (e.g., Delta/Parquet) and warehouse paradigms; dimensional/semantic modeling; BI enablement (e.g., Power BI/Fabric DW).
- IaC (Bicep/Terraform), pipeline orchestration (Fabric Data Factory/Synapse/ADF), and testing frameworks (unit/integration/data quality).
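For candidates wondering what the "data quality" testing mentioned above can look like in practice, here is a minimal, self-contained sketch of a row-level validation check. The schema (`SensorReading`) and the validation thresholds are purely hypothetical illustrations, not part of the role or Microsoft's stack.

```python
# Hypothetical example of a row-level data-quality check; the schema and
# range limits below are illustrative assumptions, not from the posting.
from dataclasses import dataclass


@dataclass
class SensorReading:
    device_id: str
    temperature_c: float


def validate(reading: SensorReading) -> list[str]:
    """Return a list of data-quality violations (empty if the row is clean)."""
    errors = []
    if not reading.device_id:
        errors.append("device_id must be non-empty")
    if not (-50.0 <= reading.temperature_c <= 150.0):
        errors.append("temperature_c out of plausible range")
    return errors
```

In a real pipeline, checks like this would typically run as a gated step in CI/CD before curated data is published downstream.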
Responsibilities
Develop services and APIs that expose curated datasets and features to downstream applications. Harden security and networking for data workloads while automating and productizing processes.