Senior Data Analyst at EPAM Systems Inc
Work from home, Mexico
Full Time


Start Date

Immediate

Expiry Date

29 Sep, 25

Salary

Not specified

Posted On

30 Jun, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

B2 English, Data Engineering, Scheduling Tools, Data Science, Tableau, Airflow, Snowflake, LTV, Optimization Strategies, Metrics, Communication Skills, Design

Industry

Information Technology/IT

Description

We are seeking a Senior Data Analyst to join our team and collaborate closely with stakeholders to design, develop, deploy, and maintain robust data pipelines using Apache Airflow.
This role requires a strong communicator with the ability to bridge the gap between business needs and technical solutions. If you are self-sufficient, responsible, and passionate about turning data into actionable insights, we encourage you to apply.

REQUIREMENTS

  • 3+ years of experience as a Data Analyst or similar role in data engineering or data science
  • Strong verbal and written communication skills, with English proficiency at B2 or higher
  • Proficiency in PySpark and Databricks, with familiarity or transferable experience in BigQuery or Snowflake
  • Knowledge of product business metrics such as LTV, ARPPU, and retention
  • Expertise in Tableau or equivalent reporting tools, with the capability to generate impactful visualizations
  • Experience with web-scale databases and database optimization strategies
  • Background in delivering end-to-end data solutions, from design to delivery
  • Familiarity with scheduling tools like Airflow or similar platforms
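For candidates unfamiliar with the stack above, a minimal Airflow pipeline might look like the following sketch. All names, the schedule, and the toy transform logic are illustrative assumptions, not details from this posting:

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def daily_metrics_pipeline():
    """Illustrative DAG: extract raw events, transform them, load for reporting."""

    @task
    def extract() -> list[dict]:
        # Placeholder for reading raw events (e.g. from BigQuery or Snowflake)
        return [{"user_id": 1, "revenue": 9.99}, {"user_id": 2, "revenue": 4.99}]

    @task
    def transform(events: list[dict]) -> float:
        # Placeholder aggregation; at scale this would run in PySpark/Databricks
        return sum(e["revenue"] for e in events)

    @task
    def load(total_revenue: float) -> None:
        # Placeholder for writing results to a reporting table consumed by Tableau
        print(f"daily revenue: {total_revenue:.2f}")

    load(transform(extract()))

daily_metrics_pipeline()
```

The TaskFlow decorators (`@dag`/`@task`) wire task dependencies from ordinary function calls; Airflow handles scheduling and retries around them.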
RESPONSIBILITIES
  • Translate business needs into data-driven solutions while maintaining a focus on engineering tasks
  • Develop and deploy data pipelines in Apache Airflow to process and transform data efficiently
  • Work closely with stakeholders to provide meaningful insights on product business metrics such as LTV, ARPPU, and Retention
  • Utilize Databricks on AWS to handle web-scale databases and optimize query performance
  • Leverage PySpark for complex data transformations and analytics
  • Create and maintain dashboards and reports using Tableau or similar reporting tools
  • Collaborate with team members via Slack/Jira to resolve issues and streamline workflows
  • Employ GitHub for version control and ensure the integrity of developed solutions
  • Analyze and interpret data trends to make informed recommendations that drive business decisions
  • Ensure data accuracy and maintain a high standard of reporting and documentation
  • Adapt and prioritize tasks to meet deadlines in a fast-paced environment
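Assuming simplified textbook definitions (the posting itself does not define these metrics), the LTV, ARPPU, and retention figures mentioned above can be sketched in plain Python:

```python
def arppu(total_revenue: float, paying_users: int) -> float:
    """Average revenue per paying user over a period."""
    return total_revenue / paying_users if paying_users else 0.0

def retention(active_on_day_n: int, cohort_size: int) -> float:
    """Share of a signup cohort still active on day N."""
    return active_on_day_n / cohort_size if cohort_size else 0.0

def ltv(arpu_per_period: float, avg_lifetime_periods: float) -> float:
    """Naive lifetime value: average revenue per user times expected lifetime."""
    return arpu_per_period * avg_lifetime_periods

# Example: $500 revenue from 40 paying users; 120 of 400 users active on day 7
print(round(arppu(500.0, 40), 2))    # 12.5
print(round(retention(120, 400), 2)) # 0.3
```

In practice these aggregations would be computed in PySpark over event tables and surfaced in Tableau dashboards.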