Contractor: Senior-level Backend Software Engineering Services (Reporting) at Newsela
Mexico - Full Time


Start Date

Immediate

Expiry Date

04 May, 26

Salary

Not specified

Posted On

03 Feb, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, Data Modeling, Data Transformation, Data Testing, Data Quality, Collaboration, Communication, Automation, Data Infrastructure, API Development, DBT, GCP, AWS, Terraform, OLAP Datastores

Industry

E-Learning Providers

Description
Seeking to hire a contractor based out of Argentina or Mexico for Senior-Level Back End Engineering Services (Reporting).

Scope of Services: We are looking for an Analytics Reporting Engineer to join our data team. Reporting to the Hiring Manager, you will be responsible for applying best practices in data modeling, data transformation, and data testing to serve quality data to customer reporting interfaces. You will build and maintain composable data models, optimize SQL query performance, and maintain Python APIs that serve data to customers. You will transform raw data into business insights, working closely with stakeholders to serve data to critical customer reporting interfaces.

Why you'll love this role:

Data Modeling and Transformation
- Build new models and optimize existing ones using SQL and Python.
- Apply software engineering principles such as version control and continuous integration to the analytics codebase.
- Expand our data warehouse with clean data ready for analysis.

Data Quality and Testing
- Apply advanced data testing strategies to ensure resulting datastores align with expected business logic.
- Implement validation checks and automated testing procedures to manage data quality in the data transformation and API layers (see the sketch after this description).

Collaboration and Communication
- Work with stakeholders to define business logic and data expectations.
- Help drive a change in how data is used by actively surfacing insights to stakeholders.
- Lead initiatives through problem definition, scoping, design, and planning.

Infrastructure and Automation
- Build tools and automation to run data infrastructure.
- Manage large-scale data migrations in relational datastores.

Why you're a great fit:
- 5+ years of experience working with data in a software environment.

Required Skills:
- Expert proficiency in SQL and Python, with advanced experience managing business semantic layer tooling, data catalog tooling, and data integrity testing frameworks.
- Experience with DBT implementation and best practices.
- Experience with Python API development is a plus, but not required.
- Experience with GCP services is preferred but not required.
- A track record of working autonomously and developing deep domain knowledge of data systems.

Required Tech Stack:
- SQL, Python, OLAP datastores, DAG tooling (such as Dagster or Airflow), and DBT.
- Experience with cloud-based infrastructure (AWS, GCP, Terraform) and document, graph, or schema-less datastores.

Please note that given the nature of the contract, this role will not be eligible to participate in company-sponsored benefits. #LI-Remote
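For illustration only (not part of the posting): a minimal Python sketch of the kind of data-quality validation check described under "Data Quality and Testing". The row data, field names, and the specific check are hypothetical assumptions, not details of the role.

from collections import Counter


def check_primary_key(rows: list[dict], key: str) -> list[str]:
    """Return human-readable data-quality failures for a candidate primary-key column."""
    failures = []
    values = [row.get(key) for row in rows]
    if any(v is None for v in values):
        failures.append(f"null values found in '{key}'")
    duplicates = [v for v, count in Counter(values).items() if count > 1]
    if duplicates:
        failures.append(f"duplicate values found in '{key}': {duplicates}")
    return failures


if __name__ == "__main__":
    # Hypothetical reporting rows; a real pipeline would read these from the
    # warehouse (e.g. an OLAP datastore) after the transformation step.
    sample = [
        {"student_id": 1, "articles_read": 4},
        {"student_id": 2, "articles_read": 7},
        {"student_id": 2, "articles_read": 7},  # duplicate key should be flagged
    ]
    for failure in check_primary_key(sample, "student_id"):
        print("FAILED:", failure)

A check of this kind would typically run as an automated test in the transformation layer, for example as a DBT test or a task in a Dagster/Airflow DAG, before data is exposed through the reporting API.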
Responsibilities
The role involves applying best practices in data modeling, transformation, and testing to ensure quality data for customer reporting interfaces. You will also build and maintain data models, optimize SQL queries, and collaborate with stakeholders to derive business insights.