Senior ETRM Data Scientist (all genders) at Wipro Limited
45127 Essen, Nordrhein-Westfalen, Germany - Full Time


Start Date

Immediate

Expiry Date

28 Jul, 25

Salary

Not specified

Posted On

29 Apr, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Reliability, Data Processing, Data Science, Pandas, Statistics, Mathematics, Linear Regression, Prophet, Machine Learning, Time Series Analysis, Deep Learning, Analytics, Numpy, Business Requirements, Darts

Industry

Information Technology/IT

Description

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

JOB DESCRIPTION

We are looking for candidates with advanced expertise in time-series forecasting, predictive modeling, and deep learning, complemented by hands-on experience with scalable MLOps frameworks. A master’s degree in mathematics, statistics, or data science is required, with a preference for candidates holding a Ph.D. Proficiency in Azure Machine Learning, Databricks, and PySpark, as well as in optimizing the performance of operational machine learning models, is essential. Ideal candidates will have a proven ability to optimize models, automate workflows, and drive innovation while delivering efficient, scalable solutions in forecasting projects.

MANDATORY SKILLS

  • A Master’s degree in Mathematics, Statistics, Data Science, or related fields is mandatory
  • A Ph.D. in Mathematics, Statistics, Data Science, or similar areas is preferred but not mandatory
  • Data Science:
  • Extensive experience in time-series forecasting, predictive modeling, and deep learning
  • Proficient in designing reusable and scalable machine learning systems
  • Proficiency in statistical and machine learning approaches for time-series analysis, including ARIMA, LSTM, Prophet, Linear Regression, and Random Forest
  • Strong command of machine learning libraries, including scikit-learn, XGBoost, Darts, TensorFlow, and PyTorch, along with data manipulation tools like Pandas and NumPy
  • Proven track record of analyzing and optimizing the performance of operational machine learning models to ensure long-term efficiency and reliability
  • Expertise in retraining and fine-tuning models based on evolving data trends and business requirements
  • Proven expertise in designing and implementing explicit ensemble techniques such as stacking, boosting, and bagging to improve model accuracy and robustness (see the stacking sketch after this list)
  • MLOps Implementation:
  • Proficiency in leveraging Python-based MLOps frameworks for automating machine learning pipelines, including model deployment, monitoring, and periodic retraining
  • Advanced experience in using the Azure Machine Learning Python SDK to design and implement parallel model training workflows, incorporating distributed computing, parallel job execution, and efficient handling of large-scale datasets in managed cloud environments (a job-submission sketch follows this list)
  • PySpark Proficiency:
  • Strong experience in PySpark for scalable data processing and analytics
  • Azure Expertise:
  • Azure Machine Learning: Managing parallel model training, deployment, and operationalization using the Python SDK
  • Azure Databricks: Collaborating on data engineering and analytics tasks using PySpark/Python
  • Azure Data Lake: Implementing scalable storage and processing solutions for large datasets
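
For illustration, here is a minimal sketch of an explicit stacking ensemble for a forecasting-style regression task, using scikit-learn as named above. The synthetic series, lag features, and model choices are assumptions for demonstration only, not details from the role description.

```python
# Minimal sketch (assumption: scikit-learn, as listed above) of a stacking ensemble
# for a forecasting-style regression task. Data and feature names are illustrative.
import numpy as np
from sklearn.ensemble import (
    StackingRegressor,
    RandomForestRegressor,       # bagging-style base learner
    GradientBoostingRegressor,   # boosting-style base learner
)
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)

# Synthetic series: seasonal signal plus noise; build three lag features per row.
series = np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.1, 500)
X = np.column_stack([series[i : i + 497] for i in range(3)])
y = series[3:]

# Explicit stacking: tree-based base learners blended by a linear meta-model.
stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
        ("gbr", GradientBoostingRegressor(random_state=0)),
    ],
    final_estimator=LinearRegression(),
)

# Time-ordered cross-validation avoids leaking future observations into training.
scores = cross_val_score(
    stack, X, y, cv=TimeSeriesSplit(n_splits=5), scoring="neg_mean_absolute_error"
)
print("MAE per fold:", -scores)
```

The base learners here cover bagging (random forest) and boosting (gradient boosting), with a linear meta-model on top; any of the other libraries named above (XGBoost, Darts, TensorFlow, PyTorch) could supply the base learners instead.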
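Similarly, the following is a minimal job-submission sketch for the Azure Machine Learning Python SDK, assuming the v2 SDK (azure-ai-ml); the workspace identifiers, environment, compute target, and training script are placeholders, not details from the posting.

```python
# Minimal sketch (assumption: Azure ML Python SDK v2, azure-ai-ml) of submitting a
# remote training job; all identifiers below are placeholders.
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Command job: runs a hypothetical train.py on a managed compute cluster.
job = command(
    code="./src",
    command="python train.py --horizon ${{inputs.horizon}}",
    inputs={"horizon": 24},
    environment="azureml:forecast-env:1",   # placeholder environment name
    compute="cpu-cluster",                  # placeholder compute target
    display_name="timeseries-forecast-train",
)

returned_job = ml_client.jobs.create_or_update(job)
print("Submitted:", returned_job.name)
```

Parallel and sweep jobs for training many models concurrently follow the same submission pattern against a managed compute cluster.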
RESPONSIBILITIES
  • Develop and implement reusable and scalable machine learning systems for time-series forecasting.
  • Analyze, retrain, and fine-tune machine learning models based on evolving data trends and business requirements to ensure long-term efficiency and reliability.
  • Automate machine learning workflows, including deployment, monitoring, and retraining, using Azure Machine Learning as an MLOps technology.
  • Utilize PySpark for efficient data processing and analytics in large-scale environments (see the PySpark sketch after this list).
  • Collaborate on data engineering tasks in Azure Databricks and implement scalable storage solutions using Azure Data Lake.
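
As a small illustration of the PySpark responsibility above, here is a minimal sketch of a data-preparation step; the Databricks-mounted data-lake paths and the column names (commodity, delivery_hour, volume_mwh) are illustrative assumptions, not taken from the posting.

```python
# Minimal sketch (assumptions: Databricks-mounted paths and illustrative column
# names) of a PySpark aggregation step preparing data for forecasting models.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("forecast-data-prep").getOrCreate()

# Read raw hourly delivery data from the data lake (placeholder path).
df = spark.read.parquet("/mnt/datalake/raw/deliveries")

# Aggregate to daily volumes per commodity as model-ready features.
daily = (
    df.withColumn("delivery_date", F.to_date("delivery_hour"))
      .groupBy("commodity", "delivery_date")
      .agg(F.sum("volume_mwh").alias("daily_volume_mwh"))
      .orderBy("commodity", "delivery_date")
)

# Persist the curated table for downstream training pipelines (placeholder path).
daily.write.mode("overwrite").parquet("/mnt/datalake/curated/daily_volumes")
```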