Software Engineer II, Data
at GITHUB INC
Remote, Scotland, United Kingdom
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 29 Nov, 2024 | GBP 44,800 Annual | 29 Aug, 2024 | N/A | SQL, Git, Collaboration, Agile Environment, Computer Science, Business Requirements, Software Coding, Physics, Python, Team Culture, Data Models, Languages, GitHub, Airflow, Computer Engineering | No | No
Description:
About GitHub: As the global home for all developers, GitHub is the complete AI-powered developer platform to build, scale, and deliver secure software. Over 100 million people, including developers from 90 of the Fortune 100 companies, use GitHub to build amazing things together across 330+ million repositories. With all the collaborative features of GitHub, it has never been easier for individuals and teams to write faster, better code.
Locations: In this role, you can work remotely from the United Kingdom.
Overview:
As a Data Engineer on the Data Science team, you will be responsible for designing, developing, and maintaining efficient and reliable data pipelines that power digital experiences for our customers across different surfaces. You will work closely with stakeholders across the company (Marketing, Revenue, Product) to gather business requirements, instrument telemetry, build data models, and ensure data quality and accessibility. Your expertise in Python, SQL, KQL, and Airflow will be crucial to evolving our data infrastructure and integrating different systems.
We are the Data Science team at GitHub, helping all decision makers across the company, whether it’s collaborating with Product to understand the performance of a newly launched offering, helping Marketing with data-driven lifecycle campaigns, or assisting Revenue in finding growth opportunities. We build and ship data products, and we make an impact. If you’re the kind of person who gets a thrill from seeing your work go live and making a difference, you’ll fit right in. We believe in learning and growing together. When one of us wins, we all win. So, expect a lot of sharing, teaching, and, of course, plenty of high-fives. ✨
Responsibilities:
- Design, build, and maintain scalable data pipelines using Python, SQL, KQL, and Airflow.
- Work with business stakeholders and other team members (data analysts and data scientists) to understand business requirements and translate them into technical specifications.
- Develop and implement data models that support automation, analytics and reporting needs.
- Ensure data accuracy, consistency, and reliability by implementing robust data validation and quality checks.
- Build and implement monitoring and observability metrics to detect anomalies in data pipelines.
- Own and advocate for the health and quality of the data pipelines that the team builds, including participating in on-call and first responder rotations.
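To illustrate the kind of data validation and quality checks described above, here is a minimal sketch in plain Python, as such a check might run inside a pipeline task. All field names (`user_id`, `event_count`) are hypothetical examples, not GitHub's actual schema or tooling:

```python
# Minimal sketch of row-level data quality checks for a pipeline task.
# Field names are illustrative only.

def validate_rows(rows):
    """Split rows into valid rows and (index, reason) error records."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        if row.get("user_id") is None:
            errors.append((i, "missing user_id"))
        elif row.get("event_count", 0) < 0:
            errors.append((i, "negative event_count"))
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"user_id": 1, "event_count": 5},
    {"user_id": None, "event_count": 2},
    {"user_id": 3, "event_count": -1},
]
valid, errors = validate_rows(rows)
```

In a real pipeline, checks like these would typically run as a dedicated validation step, with the error records feeding the monitoring and anomaly-detection metrics mentioned above.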
Qualifications:
REQUIRED QUALIFICATIONS:
- A couple of years of experience in Software Engineering, Computer Science, or a related technical discipline, with proven experience maintaining production code in languages including, but not limited to, C, C++, C#, Java, JavaScript, Go, Ruby, Rust, or Python
- OR Associate’s Degree in Computer Science, Electrical Engineering, Electronics Engineering, Math, Physics, Computer Engineering, or related field AND equivalent experience
- OR Bachelor’s Degree in Computer Science or related field
- OR equivalent experience
- Some experience in a data engineering or analytics engineering role
PREFERRED QUALIFICATIONS:
- Ability to gather business requirements and translate them into long lasting data models
- Proficiency in Python, SQL and Airflow
- Experience using Azure technologies (Kusto/KQL) is a bonus
- Strong written and verbal communication skills
- Passionate about healthy team culture and collaboration
- Comfortable working transparently in an agile environment and soliciting feedback from peers
- Experience with Git, GitHub and remote working is a plus
GitHub Leadership Principles:
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - System Programming
Software Engineering
Graduate
Computer Science, Electrical Engineering, Engineering, Math
Proficient
1
Remote, United Kingdom