Senior Data Engineer

at  Saras Analytics

Bayern, Germany

Start Date: Immediate
Expiry Date: 24 Nov, 2024
Salary: Not Specified
Posted On: 31 Aug, 2024
Experience: N/A
Skills: Data Processing, Communication Skills, Soft Skills
Telecommute: No
Sponsor Visa: No
Required Visa Status:
Citizen, GC (Green Card), US Citizen, Student Visa, H1B, CPT, OPT, H4 (Spouse of H1B)

Employment Type:
Full Time, Part Time, Permanent, Independent - 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp

Description:

JOB DESCRIPTION

About Saras Analytics:
We are an ecommerce-focused, end-to-end data analytics firm assisting enterprises & brands in data-driven decision-making to maximize business value. Our work spans extraction, transformation, visualization & analysis of data, delivered via industry-leading products, solutions & services. Our flagship product is Daton, an ETL tool. We have now ventured into building easy-to-use data visualization solutions on top of Daton. Finally, we have a world-class data team that understands the story the numbers are telling and articulates it to CXOs, thereby creating value.
Where we are Today:
We are a bootstrapped, profitable & fast-growing (2x y-o-y) startup with old-school value systems. We play in a very exciting space at the intersection of data analytics & ecommerce, both of which are game changers. Today, the global economy faces headwinds forcing companies to downsize, outsource & offshore, creating strong tailwinds for us. We are an employee-first company that values and encourages talent, and we live by those values at all stages of our work without compromising on the value we create for our customers. We strive to make Saras a career, not just a job, for the talented folks who have chosen to work with us.
The Role:
We are seeking a seasoned and proficient Senior Python Data Engineer with substantial experience in cloud technologies. As a pivotal member of our data engineering team, you will play a crucial role in designing, implementing, and optimizing data pipelines, ensuring seamless integration with cloud platforms. The ideal candidate will possess a strong command of Python, data engineering principles, and a proven track record of successful implementation of scalable solutions in cloud environments.
Responsibilities:

  1. Data Pipeline Development:
  • Design, develop, and maintain scalable and efficient data pipelines using Python and cloud-based technologies.
  • Implement Extract, Transform, Load (ETL) processes to move data from diverse sources into our cloud-based data warehouse (a minimal pipeline sketch follows this list).
  2. Cloud Integration:
  • Utilize cloud platforms (e.g., Google Cloud, AWS, Azure) to deploy, manage, and optimize data engineering solutions.
  • Leverage cloud-native services for storage, processing, and analysis of large datasets.
  3. Data Modeling and Architecture:
  • Collaborate with data scientists, analysts, and other stakeholders to design effective data models that align with business requirements.
  • Ensure the scalability, reliability, and performance of the overall data infrastructure on cloud platforms.
  4. Optimization and Performance:
  • Continuously optimize data processes for improved performance, scalability, and cost-effectiveness in a cloud environment.
  • Monitor and troubleshoot issues, ensuring timely resolution and minimal impact on data availability.
  5. Quality Assurance:
  • Implement data quality checks and validation processes to ensure the accuracy and completeness of data in the cloud-based data warehouse.
  • Collaborate with cross-functional teams to identify and address data quality issues.
  6. Collaboration and Communication:
  • Work closely with data scientists, analysts, and other teams to understand data requirements and provide technical support.
  • Collaborate with other engineering teams to seamlessly integrate data engineering solutions into larger cloud-based systems.
  7. Documentation:
  • Create and maintain comprehensive documentation for data engineering processes, cloud architecture, and pipelines.
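
As a rough illustration of item 1, here is a minimal sketch of an ETL pipeline expressed as an Apache Airflow DAG (Airflow 2.4+ assumed). The sample record, the dag_id orders_etl, and the target table analytics.orders are hypothetical placeholders for illustration only, not part of this role's actual stack.

# Minimal ETL pipeline sketch as an Apache Airflow DAG (Airflow 2.4+ assumed).
# The sample record and the target table "analytics.orders" are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (API, file export, database dump).
    # A real pipeline would call the source's client library here.
    return [{"order_id": 1, "amount": "19.99"}]


def transform(ti, **context):
    # Clean and type-cast the records produced by the extract task.
    rows = ti.xcom_pull(task_ids="extract")
    return [{"order_id": r["order_id"], "amount": float(r["amount"])} for r in rows]


def load(ti, **context):
    # Write the transformed rows to the cloud data warehouse
    # (e.g., via a BigQuery, Redshift, or Snowflake client).
    rows = ti.xcom_pull(task_ids="transform")
    print(f"loading {len(rows)} rows into analytics.orders")


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three stages in order: extract -> transform -> load.
    extract_task >> transform_task >> load_task

In practice, each task would be replaced by the relevant source connector and warehouse loader for the pipeline in question.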

Technical Skills:

  1. Programming Languages: Proficiency in Python for data engineering tasks, scripting, and automation.
  2. Data Engineering Technologies:
  • Extensive experience with data engineering frameworks for distributed data processing.
  • Understanding and hands-on experience with workflow management tools such as Apache Airflow.
  3. Cloud Platforms:
  • In-depth knowledge and hands-on experience with at least one major cloud platform: AWS, Azure, or Google Cloud.
  • Familiarity with cloud-native services for data processing, storage, and analytics.
  4. ETL Processes: Proven expertise in designing and implementing Extract, Transform, Load (ETL) processes.
  5. SQL and Databases: Proficient in SQL, with experience working with relational databases (e.g., PostgreSQL, MySQL) and cloud-based database services.
  6. Data Modeling: Strong understanding of data modeling principles and experience in designing effective data models.
  7. Version Control: Familiarity with version control systems, such as Git, for tracking changes in code and configurations.
  8. Collaboration Tools: Experience using collaboration and project management tools for effective communication and project tracking.
  9. Containerization and Orchestration: Familiarity with containerization technologies (e.g., Docker) and orchestration tools (e.g., Kubernetes).
  10. Monitoring and Troubleshooting: Ability to implement monitoring solutions and troubleshoot issues in data pipelines.
  11. Data Quality Assurance: Experience in implementing data quality checks and validation processes (a minimal sketch follows this list).
  12. Agile Methodologies: Familiarity with agile development methodologies and practices.
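
For item 11, a minimal sketch of post-load data quality checks, using only the Python standard library. The key field "order_id", the row-count threshold, and the sample rows are hypothetical placeholders chosen for illustration.

# Basic completeness, uniqueness, and validity checks over rows loaded into
# a warehouse table. The key field and thresholds are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def run_quality_checks(rows, key_field="order_id", min_rows=1):
    """Run basic completeness, uniqueness, and validity checks on loaded rows."""
    results = []

    # Completeness: the load should not be empty.
    results.append(CheckResult(
        "row_count", len(rows) >= min_rows, f"{len(rows)} rows loaded"))

    # Uniqueness: the primary-key field must not repeat.
    keys = [r.get(key_field) for r in rows]
    dupes = len(keys) - len(set(keys))
    results.append(CheckResult(
        "unique_keys", dupes == 0, f"{dupes} duplicate {key_field} values"))

    # Validity: the key field must not be null.
    nulls = sum(1 for k in keys if k is None)
    results.append(CheckResult(
        "no_null_keys", nulls == 0, f"{nulls} null {key_field} values"))

    return results


if __name__ == "__main__":
    sample = [{"order_id": 1, "amount": 19.99}, {"order_id": 2, "amount": 5.00}]
    for check in run_quality_checks(sample):
        print(f"{'PASS' if check.passed else 'FAIL'} {check.name}: {check.detail}")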

Soft Skills:

  • Strong problem-solving and critical-thinking abilities.
  • Excellent communication skills, both written and verbal.
  • Ability to work collaboratively in a cross-functional team environment.
  • Attention to detail and commitment to delivering high-quality solutions.

If you possess the required technical skills and are passionate about leveraging cloud technologies for data engineering, we encourage you to apply. Please submit your resume and a cover letter highlighting your technical expertise and relevant experience.

Responsibilities:

Please refer to the job description above for details.


REQUIREMENT SUMMARY

Min: N/A | Max: 5.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Software Engineering

Graduate

Proficient

1

Bayern, Germany