Start Date: Immediate
Expiry Date: 20 Jun, 25
Salary: 0.0
Posted On: 20 Mar, 25
Experience: 4 year(s) or above
Remote Job: Yes
Telecommute: Yes
Sponsor Visa: No
Skills: Good communication skills
Industry: Information Technology/IT
EXPERIENCE:
· 6 - 15 Years
JOB DESCRIPTION:
· Hybrid role: onsite Monday, Tuesday & Thursday
· WFH: Wednesday & Friday
WHAT YOU BRING TO THE TABLE:
· Bachelor’s degree in Computer Science, Engineering, Mathematics, Sciences, or a related field of study from an accredited college or university; will consider a combination of experience and/or education.
· Ideally 5+ years of experience developing data and analytics solutions, including approximately 4+ years in data modeling and architecture.
· Highly skilled in Python, with strong hands-on coding experience.
· Highly skilled in SQL development and stored procedures on RDBMS (Oracle, SQL Server, MySQL) and NoSQL database systems.
· Highly skilled with cloud data platform tools and techniques, e.g. Google Cloud Platform: Pub/Sub, BigQuery, Dataform, Dataflow, Datastream, Google Cloud Storage, Cloud Composer/DAGs, Cloud Run, Cloud REST APIs, ADO Git repos, CI/CD pipelines, Terraform/YAML, etc.
· Strong understanding of data quality, compliance, governance, and security.
· Strong understanding of statistical techniques.
· Advanced technology skills with deep knowledge in data segmentation and analytics platform usage.
· Commitment to consistently adhere to policies and procedures and be a positive example for others by demonstrating the Company’s core values of Respect, Accountability, Innovation, Safety, and Excellence in completing work assignments.
· Self-motivated, with a phenomenal work ethic, and looking for the right company to support your growth.
· An ideal candidate is intellectually curious, has a solution-oriented attitude, and enjoys learning new tools and techniques.
· You will have the opportunity to design and execute vital projects such as re-platforming our data services on Cloud and on-prem, and delivering real-time streaming capabilities to our business applications.
· Partner with the functional areas to determine how to manage and enhance our data assets to drive more value for the organization, including identifying key data sets for governance, collecting of new data to create new business opportunities, and the design of an enterprise data layer.
· Brings a clear point of view on data processing optimization, data modeling, data pipeline architecture, and data SLA management.
· Holds accountability for the quality, usability, and performance of the solutions.
· Leads design sessions and code reviews to elevate the quality of engineering across the organization.
· Design and develop the data foundation on a cloud data platform using GCP tools and techniques, e.g.: Pub/Sub, BigQuery, Dataform, Dataflow, Datastream, Google Cloud Storage, Cloud Composer/DAGs, Cloud Run, Cloud REST APIs, ADO Git repos, CI/CD pipelines, Terraform/YAML, etc.
· Build scalable ETL pipelines using Python.
· Multi-level Data Curation and modeling.
· Data design and architecture.
· Hands-on experience building and maintaining complete CI/CD pipelines using Azure DevOps and Terraform/Terragrunt.
· Increase the efficiency and speed of complicated data processing systems.
· Collaborating with our Architecture group, recommend and ensure the optimal data architecture.
· Building MLOps pipelines using Python for AI/ML deployment and management at scale.
· Running tests to assess AI/ML performance.
· Analyzing data gathered during tests to identify strengths and weaknesses of ML models.
· Implementing changes to algorithms to improve AI/ML performance.
· Troubleshooting and addressing problems with deployed ML models to improve user experience.
· Documenting all steps in the development process.
· Manage the data collection process, providing interpretation and recommendations to management. Execute quantitative analyses that translate data into actionable insights. Provide analytical and data-driven decision-making support for key projects. Design, manage, and conduct quality control.
· Collaborate across all functional areas to translate complex business problems into optimal data modeling and analytical solutions that drive business value.
· Lead the improvements and advancement of reporting and data capabilities across the company, including analytics skills, data literacy, visualization, and storytelling.
· Develop a certified vs. self-service analytics framework for the organization.
· Highly skilled with RDBMS (Oracle, SQL Server), NoSQL databases, and messaging (publish/subscribe) systems.
· Extensive knowledge/coding skills of Python including understanding of data modeling and data engineering.
· In-depth knowledge of machine learning tools such as TensorFlow, scikit-learn, Jupyter, pandas, and NumPy, along with data structures and modeling, software architecture, libraries, and frameworks needed to create AI that accomplishes outlined goals.
· Good mathematics skills, especially in statistics, to understand algorithms.
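Several of the bullets above center on building Python ETL pipelines that feed a cloud warehouse such as BigQuery. As a rough sketch of the kind of transform step implied (the record fields and cleaning rules here are hypothetical, and the actual BigQuery load is left as a comment since it requires GCP credentials):

```python
import json
from datetime import datetime, timezone

def transform(raw_records):
    """Clean raw event records before loading to the warehouse.

    Hypothetical rules for illustration: drop records missing a
    user_id, normalize the epoch timestamp to UTC ISO-8601, and
    cast the amount field to float.
    """
    cleaned = []
    for rec in raw_records:
        if not rec.get("user_id"):
            continue  # skip unattributable events
        ts = datetime.fromtimestamp(rec["ts"], tz=timezone.utc)
        cleaned.append({
            "user_id": rec["user_id"],
            "event_time": ts.isoformat(),
            "amount": float(rec.get("amount", 0)),
        })
    return cleaned

# In a real pipeline the cleaned rows would then be loaded, e.g. with
# google.cloud.bigquery.Client().insert_rows_json(table, cleaned) --
# omitted here because it needs live GCP credentials and a dataset.
if __name__ == "__main__":
    raw = [
        {"user_id": "u1", "ts": 0, "amount": "12.5"},
        {"user_id": None, "ts": 0},  # dropped: no user_id
    ]
    print(json.dumps(transform(raw)))
```

In production this sort of transform would typically run inside a Dataflow job or a Cloud Composer DAG task, with the load step targeting a BigQuery table.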