Analytics Data Engineer
at Profitero
Windsor SL4 1LP, United Kingdom
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 21 Dec, 2024 | Not Specified | 22 Sep, 2024 | 1 year(s) or above | Looker, Tableau, Cost Management, Performance Tuning, SQL, Python, Data Processing, Cost Efficiency, dbt, Snowflake, Time Management, Data Models, Soft Skills | No | No
Description:
ABOUT PROFITERO
Profitero is a leading global SaaS commerce platform that uses predictive intelligence to help brands anticipate, activate and automate their next best action to fuel profitable growth. Our technology monitors 70 million products daily across 1200+ retailers and 50 countries, helping brands optimise search placement, product content, pricing, stock availability, reviews and more. News outlets, including Good Morning America, The Wall Street Journal and Ad Age, frequently cite and trust Profitero as a source of data for their stories. Now’s an exciting time to join our fast-growth business.
Profitero has recently joined Publicis Groupe (a $13 billion global marketing services and technology company) as a standalone commerce division, infusing our business with significant product development resources and investment, while giving our employees an incredible launchpad for their careers. Profitero’s tech and data, combined with Publicis’ tech, data and activation services, position us to be a true end-to-end partner for helping brands maximise eCommerce market share and profits.
Come be a part of our fast-paced, entrepreneurial culture and next stage of growth.
TECHNICAL SKILLS:
- Strong knowledge of Python and SQL, with at least one year of practical experience in data automation. Ability to write efficient and scalable code for data processing (see the illustrative sketch after this list).
- Hands-on experience with Snowflake, including performance tuning (cloud cost efficiency, handling large data volumes), cost management, and data sharing features. Knowledge of Google BigQuery (GBQ) is a plus.
- Solid understanding of designing and optimising data pipelines and data models in a cloud environment. Experience setting up ETL processes; knowledge of dbt is a plus.
- Expertise in working with scalable data architectures, including data warehouses, data lakes, and data pipelines.
- Experience with BI tools like Looker, Tableau, Sigma or similar is a plus.
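For candidates wondering what this kind of Python/SQL data automation against Snowflake looks like in practice, here is a minimal illustrative sketch (not Profitero's actual code). It assumes the snowflake-connector-python package; every connection parameter, table, and column name is a hypothetical placeholder.

```python
# Minimal sketch of Python/SQL data automation against Snowflake.
# Assumes snowflake-connector-python; all connection parameters and
# object names below are hypothetical placeholders.
import os

import snowflake.connector


def refresh_daily_metrics() -> int:
    """Rebuild a derived table and return its row count."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # placeholder warehouse
        database="ANALYTICS",      # placeholder database
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # CREATE OR REPLACE TABLE ... AS SELECT is a common way to
        # materialise a derived data model in Snowflake.
        cur.execute(
            """
            CREATE OR REPLACE TABLE daily_product_metrics AS
            SELECT product_id,
                   DATE_TRUNC('day', observed_at) AS day,
                   AVG(price) AS avg_price,
                   COUNT(*)   AS observations
            FROM raw_product_observations
            GROUP BY product_id, DATE_TRUNC('day', observed_at)
            """
        )
        cur.execute("SELECT COUNT(*) FROM daily_product_metrics")
        return cur.fetchone()[0]
    finally:
        conn.close()
```

Materialising derived tables this way is one common Snowflake pattern for the kind of reporting data models the role describes; dbt formalises the same idea.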
SOFT SKILLS:
- Strong problem-solving skills with the ability to troubleshoot and resolve complex data issues.
- Excellent communication and collaboration skills
- Diligence and time management
Responsibilities:
ABOUT THE ROLE:
We are developing a new portfolio of analytical services that let our customers analyse eCommerce data. This work involves collecting, processing, and presenting large volumes of data to the customer, and is built with Snowflake, Sigma, and Python. The data is currently hosted in GBQ, but we are migrating to Snowflake.
Profitero provides customers with data on how products perform online from various angles: prices, availability, placement, ratings and reviews, product content, etc. Customers can use dashboards, our web application, or an API connection to GBQ to get information about product performance online.
Migrating the existing services to Snowflake will require us to rewrite certain data pipelines and adapt our client dashboards to the new technology stack. The data engineer on this team will be responsible for implementing those pipelines, working with the BI and Architecture teams.
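To give a sense of the shape of that pipeline work, here is a hypothetical sketch of a single GBQ-to-Snowflake backfill step, assuming the google-cloud-bigquery and snowflake-connector-python (with its pandas extra) packages; the table names and the connection object are invented for illustration.

```python
# Hypothetical sketch of one GBQ -> Snowflake backfill step.
# Assumes google-cloud-bigquery and snowflake-connector-python with
# the pandas extra; every table name is an invented placeholder.
from google.cloud import bigquery
from snowflake.connector.pandas_tools import write_pandas


def backfill_table(sf_conn, bq_table: str, sf_table: str) -> int:
    """Copy one BigQuery table into an existing Snowflake table."""
    bq = bigquery.Client()
    # Pull the source rows into a DataFrame (fine for modest volumes;
    # a real pipeline would export to stage files for large tables).
    df = bq.query(f"SELECT * FROM `{bq_table}`").to_dataframe()
    # write_pandas bulk-loads the DataFrame via an internal stage.
    success, *_ = write_pandas(sf_conn, df, sf_table)
    if not success:
        raise RuntimeError(f"load into {sf_table} failed")
    return len(df)
```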
RESPONSIBILITIES:
- Design and build scalable and resilient Data & Analytics solutions
- Automate data workflows and optimize data processing for performance and cost
- Design and develop new data pipelines, and improve existing ones using data engineering best practices
- Design and develop efficient, scalable data models that enable fast and accurate reporting while minimizing cost and query complexity
- Monitor and optimize data warehouse costs, leveraging Snowflake’s cost management tools to ensure efficient data processing and storage usage (see the sketch after this list)
- Engage in proof of concepts and experiments
- Participate in overall testing and production maintenance
- Work closely with the BI Team and Architecture team to ensure smooth transition from GBQ to Snowflake
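To make the cost-management bullet concrete: Snowflake exposes per-warehouse credit consumption through the documented SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view. A monitoring job in the spirit of this role might look like the sketch below; the weekly budget value and the connection object are assumptions, not Profitero specifics.

```python
# Sketch: flag Snowflake warehouses whose 7-day credit burn exceeds a
# budget. Queries the documented ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
# view; the budget and `conn` (a snowflake.connector connection) are
# assumptions for illustration.
WEEKLY_CREDIT_BUDGET = 100.0  # hypothetical per-warehouse budget


def warehouses_over_budget(conn) -> list[tuple[str, float]]:
    """Return (warehouse_name, credits_used) pairs over budget."""
    cur = conn.cursor()
    cur.execute(
        """
        SELECT warehouse_name, SUM(credits_used) AS credits
        FROM snowflake.account_usage.warehouse_metering_history
        WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
        GROUP BY warehouse_name
        HAVING SUM(credits_used) > %(budget)s
        ORDER BY credits DESC
        """,
        {"budget": WEEKLY_CREDIT_BUDGET},
    )
    return [(name, float(credits)) for name, credits in cur.fetchall()]
```

A job like this, scheduled daily, is one straightforward way to keep warehouse spend visible while pipelines are being rebuilt.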
REQUIREMENT SUMMARY
Min: 1.0 year(s), Max: 6.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
Windsor SL4 1LP, United Kingdom