BigQuery Developer / GCP Data Engineer at Tryton TC LLC
Woonsocket, Rhode Island, United States - Full Time


Start Date

Immediate

Expiry Date

19 May 2026

Salary

0.0

Posted On

18 Feb 2026

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

BigQuery, SQL, Python, Dataflow, Cloud Composer, GitHub, CI/CD, Data Warehousing, Data Modeling, ETL/ELT, Apache Beam, Apache Airflow, GitHub Actions, Tidal Job Scheduler, Analytical Skills, Problem-Solving

Industry

Insurance

Description
Role Type: Contract position. Engagement options available on a W2 or C2C basis.

Location: Primarily remote. Occasional onsite presence may be required based on project needs. Preference will be given to candidates within driving distance of Woonsocket, RI.

Job Title: BigQuery Developer / GCP Data Engineer

Job Summary: We are looking for a motivated and detail-oriented BigQuery Developer with hands-on Google Cloud Platform experience to support and enhance our enterprise data warehouse and analytics solutions. The ideal candidate will have strong SQL, BigQuery, and Python development experience, along with working knowledge of Dataflow, Cloud Composer, GitHub, and CI/CD practices. This role requires strong analytical skills, problem-solving ability, and effective communication to work with cross-functional teams.

Key Responsibilities

BigQuery Development (Primary Focus)
· Develop, maintain, and optimize BigQuery datasets, tables, views, procedures, and queries.
· Write efficient, scalable SQL for reporting and analytics.
· Implement partitioning and clustering to improve query performance (a table sketch follows this description).
· Support data warehouse design and data modeling activities.
· Monitor query performance and optimize cost usage.
· Troubleshoot and resolve data-related issues in BigQuery.
· Support data validation and quality checks.

Data Pipeline Development
· Develop and maintain batch pipelines using Python and Google Cloud Dataflow (Apache Beam); a pipeline sketch follows below.
· Load, transform, and integrate data from various sources into BigQuery.
· Work on ETL/ELT processes and ensure reliable data processing.
· Assist in debugging and performance tuning of pipelines.

Workflow Orchestration
· Develop and maintain workflows using Cloud Composer (Apache Airflow); a DAG sketch follows below.
· Integrate workflows with the Tidal Job Scheduler for enterprise scheduling.
· Monitor production jobs and support issue resolution.

Version Control & CI/CD
· Use GitHub for source control and collaboration.
· Contribute to CI/CD pipelines using GitHub Actions.
· Follow best practices for code versioning and peer reviews.

Collaboration & Communication
· Work closely with data analysts, business users, and technical teams.
· Translate business requirements into efficient BigQuery solutions.
· Document data flows, technical designs, and operational processes.
· Provide production support as needed.

Requirements
· 3–5 years of experience in data engineering or data development.
· Strong hands-on experience with BigQuery.
· Strong SQL skills (joins, aggregations, window functions, performance tuning); window functions are illustrated below.
· Experience with Google Cloud Platform (GCP).
· Experience building batch pipelines using Python and Dataflow.
· Experience with Cloud Composer (Airflow).
· Working knowledge of GitHub and GitHub Actions.
· Experience with enterprise job schedulers such as Tidal.
· Understanding of data warehousing concepts.
· Strong analytical and problem-solving skills.
· Good verbal and written communication skills.
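To make the partitioning and clustering responsibility concrete, here is a minimal sketch using the google-cloud-bigquery client: a date-partitioned, clustered table plus a partition-pruned query. The project, dataset, table, and column names are hypothetical, not from the posting.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# DDL: a table partitioned by event date and clustered by customer_id,
# so date-filtered queries scan only the relevant partitions.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.events`
(
  event_ts    TIMESTAMP,
  customer_id STRING,
  amount      NUMERIC
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
"""
client.query(ddl).result()

# A date-bounded filter prunes partitions, keeping scanned bytes (and cost) down.
sql = """
SELECT customer_id, SUM(amount) AS total
FROM `my-project.analytics.events`
WHERE DATE(event_ts) BETWEEN '2026-01-01' AND '2026-01-31'
GROUP BY customer_id
"""
for row in client.query(sql).result():
    print(row.customer_id, row.total)
```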
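The requirements also call out window functions; continuing the previous sketch against the same hypothetical table, a per-customer running total and recency rank look like this:

```python
# Window functions: SUM(...) OVER builds a running total per customer;
# ROW_NUMBER() OVER ranks each customer's events from newest to oldest.
sql = """
SELECT
  customer_id,
  event_ts,
  amount,
  SUM(amount) OVER (PARTITION BY customer_id ORDER BY event_ts) AS running_total,
  ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY event_ts DESC) AS recency_rank
FROM `my-project.analytics.events`
"""
for row in client.query(sql).result():
    print(row.customer_id, row.running_total, row.recency_rank)
```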
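For the Dataflow responsibility, a minimal Apache Beam batch pipeline sketch: read CSV lines from Cloud Storage, parse them, and append into BigQuery. The bucket, table, and schema are placeholders; swap the runner to "DirectRunner" for local testing.

```python
import csv
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Parse one CSV line into a BigQuery row dict.
    customer_id, amount = next(csv.reader([line]))
    return {"customer_id": customer_id, "amount": float(amount)}

def run():
    opts = PipelineOptions(
        runner="DataflowRunner",            # "DirectRunner" for local runs
        project="my-project",               # hypothetical project
        region="us-east1",
        temp_location="gs://my-bucket/tmp", # hypothetical bucket
    )
    with beam.Pipeline(options=opts) as p:
        (p
         | "Read"  >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
         | "Parse" >> beam.Map(parse_line)
         | "Load"  >> beam.io.WriteToBigQuery(
               "my-project:analytics.events_staging",
               schema="customer_id:STRING,amount:FLOAT",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))

if __name__ == "__main__":
    run()
```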
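And for the orchestration responsibility, a minimal Cloud Composer (Airflow) DAG sketch. The DAG id, schedule, and SQL are hypothetical; BigQueryInsertJobOperator comes from the Google provider package.

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",   # hypothetical DAG name
    schedule="0 6 * * *",           # daily at 06:00 (Airflow 2.4+; older versions use schedule_interval)
    start_date=datetime(2026, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.analytics.daily_totals` AS
                    SELECT DATE(event_ts) AS day, SUM(amount) AS total
                    FROM `my-project.analytics.events`
                    GROUP BY day
                """,
                "useLegacySql": False,
            }
        },
    )
```

In practice such a DAG would be triggered or monitored alongside the enterprise Tidal schedule mentioned in the posting.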
Responsibilities
The primary focus is BigQuery development: creating, maintaining, and optimizing datasets, tables, and views, and writing scalable SQL for reporting and analytics. The role also includes developing batch data pipelines using Python and Dataflow, orchestrating workflows with Cloud Composer, and managing version control via GitHub.