AI/ML Developer, Airline Services
at DataArt
Monterrey, N. L., Mexico
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 13 Feb, 2025 | Not Specified | 17 Nov, 2024 | 2 year(s) or above | Civil Engineering, Packaging, Python, Testing, Data Extraction, Transformation, SQL, Scripting, Google Cloud Platform, Reliability, Shell Scripting, Modeling, Docker, UNIX, Scalability, Design, Computer Science, Data Manipulation | No | No |
Description:
Position overview: We are seeking an AI/ML Developer with a background in Industrial Civil Engineering, Computer Science, or a related field. Ideal candidates will have at least 2 years of experience in roles such as Software Engineer, Data Engineer, or Machine Learning Engineer. This position requires strong proficiency in Python, UNIX, SQL, CI/CD pipelines, Docker, and data processing workflows. Familiarity with Google Cloud Platform (GCP) is a plus.
Responsibilities:
- Design, develop, and deploy machine learning models that address business challenges or optimize industrial processes
- Implement data pipelines and ETL processes to prepare and preprocess data for model training and deployment
- Work closely with cross-functional teams, including data engineers, software developers, and product managers, to integrate ML solutions into production
- Set up CI/CD pipelines to streamline the development and deployment process, ensuring smooth integration of ML models into production environments
- Use Docker to containerize applications for reliable and efficient deployment
- Maintain and optimize existing ML systems, ensuring scalability, efficiency, and reliability
- Document processes, create reusable scripts, and follow best practices for coding, testing, and version control
Requirements:
- Advanced proficiency in Python for data manipulation, modeling, and scripting
- Strong experience with UNIX systems, including shell scripting and system commands
- Proficiency in SQL for querying, managing, and processing large datasets
- Familiarity with CI/CD pipelines for streamlined development, testing, and deployment
- Hands-on experience with Docker for developing, packaging, and deploying applications
- Ability to work with large-scale datasets, including data extraction, transformation, and loading (ETL) processes
- Basic knowledge of Google Cloud Platform (GCP), particularly for data storage, model deployment, or cloud-based ML workflows
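To give candidates a concrete sense of the day-to-day work, here is a toy sketch of the Python + SQL ETL skills the requirements describe. The table name, schema, and unit conversion are illustrative only, not taken from the posting; it uses an in-memory SQLite database so the snippet is self-contained.

```python
import sqlite3

def run_etl(source_rows):
    """Minimal ETL sketch: extract raw rows, transform them
    (convert Fahrenheit to Celsius), and load into SQLite."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (sensor TEXT, value_c REAL)")
    # Transform step: normalize temperature units before loading
    transformed = [(s, round((v - 32) * 5 / 9, 2)) for s, v in source_rows]
    conn.executemany("INSERT INTO readings VALUES (?, ?)", transformed)
    conn.commit()
    return conn

conn = run_etl([("a", 212.0), ("b", 32.0)])
rows = conn.execute("SELECT sensor, value_c FROM readings ORDER BY sensor").fetchall()
```

In a production pipeline the same extract/transform/load shape would target a managed warehouse (e.g., on GCP) and be packaged in a Docker image for deployment.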
REQUIREMENT SUMMARY
Min: 2.0, Max: 7.0 year(s)
Information Technology/IT
IT Software - Application Programming / Maintenance
Software Engineering
Graduate
Proficient
1
Monterrey, N. L., Mexico