Data Engineer at Ford Global Career Site
Allen Park, Michigan, United States - Full Time


Start Date

Immediate

Expiry Date

06 Feb, 26

Salary

0.0

Posted On

08 Nov, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, BigQuery, Google Dataflow, ETL, Data Pipelines, Cloud Data Platforms, CI/CD, Docker, Data APIs, Data Quality, Hadoop, SQL Server, Pub/Sub, Data Integration, Data Transformation

Industry

Motor Vehicle Manufacturing

Description
Design, develop, and maintain EL/ELT/ETL pipelines that load data into BigQuery from batch and streaming sources.
Migrate and replace legacy integration services (e.g., IICS) with cloud-native templates (Google Dataflow) and reusable integration patterns.
Work with on-prem (Hadoop, SQL Server) and cloud data sources to understand schemas and business rules, and implement reliable transformations.
Build cloud-native services and data APIs to expose data products to internal and external consumers.
Partner with data scientists, BI engineers, and product owners to deliver the right data on the right cadence.
Implement monitoring, alerting, and optimization for pipelines and services; manage operational SLAs.
Define and enforce data quality checks, lineage, and metadata practices; produce runbooks and operational documentation.
Contribute to architecture decisions, CI/CD pipelines, and reusable templates for the Pro360 platform.
Provide technical guidance to stakeholders and participate in cross-team integration, testing, and deployment activities.
Established and active employee resource groups.

Qualifications
Bachelor's degree (or equivalent experience).
3+ years of experience with SQL and Python.
2+ years working with cloud data platforms (GCP or AWS); strong candidates with 5+ years in traditional DW/ETL environments will be considered.
3+ years building scalable, distributed, fault-tolerant data pipelines from scratch.
Experience with relational and non-relational databases, and familiarity with BigQuery and Cloud Dataflow/streaming concepts.
Proven track record of delivering production data applications and APIs, and of collaborating with analytics and product teams.
Strong communication skills and the ability to translate business requirements into technical solutions.
Hands-on experience with Google Dataflow, Pub/Sub, BigQuery, and associated GCP services.
Experience with CI/CD for data pipelines, containerization (Docker), and infrastructure as code.
Familiarity with Salesforce integrations, data mastering/identity resolution, or data quality frameworks.
GCP certification or other cloud/data engineering certification.
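To illustrate the kind of batch pipeline step described above (validate records, reject bad rows to a dead-letter path, and normalize the rest for loading into a warehouse table such as BigQuery), here is a minimal, hypothetical Python sketch. The field names, checks, and functions are illustrative assumptions, not Ford's actual Pro360 code:

```python
from datetime import datetime, timezone


def passes_quality_checks(raw: dict) -> bool:
    """Reject records missing required fields -- a simple stand-in
    for the data-quality checks this role defines and enforces."""
    return all(bool(raw.get(k)) for k in ("vin", "event_type", "event_ts"))


def transform(raw: dict) -> dict:
    """Normalize a raw record into the shape a warehouse load job
    (or a streaming sink) would expect: trimmed keys, UTC timestamps."""
    return {
        "vin": raw["vin"].strip().upper(),
        "event_type": raw["event_type"].lower(),
        "event_ts": datetime.fromisoformat(raw["event_ts"])
        .astimezone(timezone.utc)
        .isoformat(),
    }


def run_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into loadable rows and rejects (dead-letter rows)."""
    good, bad = [], []
    for raw in records:
        (good if passes_quality_checks(raw) else bad).append(raw)
    return [transform(r) for r in good], bad
```

In a real deployment, `run_batch` would be replaced by a Dataflow/Beam pipeline reading from Pub/Sub or GCS and writing to BigQuery, but the validate-then-transform split shown here is the same pattern.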
Responsibilities
Design, develop, and maintain EL/ELT/ETL pipelines that load data into BigQuery. Collaborate with data scientists and BI engineers to ensure reliable data delivery, and implement monitoring and optimization for data services.