Senior Data Engineer (SSRS) - GP
at Gorilla Logic
Work from home, Cauca, Colombia
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 26 Dec, 2024 | Not Specified | 28 Sep, 2024 | 5 year(s) or above | AWS, scikit-learn, Azure, Communication Skills, Snowflake, Languages, Java, Microsoft SQL Server, Computer Science, Python, Linux, Apache Spark, Predictive Analytics, Bash | No | No
Required Visa Status:
- US Citizen
- GC (Green Card)
- Student Visa
- H1B
- CPT
- OPT
- H4 (Spouse of H1B)
Employment Type:
- Full Time
- Part Time
- Permanent
- Independent - 1099
- Contract – W2
- C2H Independent
- C2H W2
- Contract – Corp 2 Corp
- Contract to Hire – Corp 2 Corp
Description:
SENIOR DATA ENGINEER
Gorilla Logic is looking for a Senior Data Engineer. This is a unique and highly technical role responsible for data analysis within the enterprise. The role works with multiple data sources, aggregates data into data lakes, and transforms data into business value. Our environment will require you to work effectively with your teammates, of course. But your real success will be measured by how well you couple critical thinking with self-motivation, enthusiasm, and determination.
TECHNICAL REQUIREMENTS
- Bachelor’s degree in Computer Science or related field (or equivalent experience)
- 5+ years of development and/or data engineering experience
- 5+ years of experience with Microsoft SQL Server and SSRS
- 3+ years of experience using languages such as Python, Java, and/or .NET/C#
- Extensive experience writing complex SQL queries
- Familiarity with data warehouse schemas
- Ability to work in a dynamic, fast-paced environment
- Strong communication skills to interact within the team and across the business
- Strong analytical thinking and problem-solving skills
BONUS SKILLS
- Data warehouse experience with technologies like Snowflake, AWS Redshift, and/or Google BigQuery
- Experience building and/or maintaining ETL solutions with technologies like Azure Data Factory, AWS Glue, Databricks, Apache Spark, etc.
- Experience with predictive analytics using technologies like Jupyter Notebook, scikit-learn, or TensorFlow
- Cloud experience with AWS, Azure, or GCP
- Experience with Linux and scripting with Bash
Responsibilities:
- Analysis, organization, and integration of raw data sources
- Build data systems, data models, and data pipelines
- Work with key client stakeholders to evaluate business needs and priorities
- Interpret data for trends, patterns, and value creation
- Conduct complex data analysis and develop reports
- Enhance data quality, reliability, efficiency, and value
REQUIREMENT SUMMARY
Min: 5.0, Max: 10.0 year(s)
Information Technology/IT
IT Software - DBA / Data Warehousing
Software Engineering
Graduate
Computer Science or related field (or equivalent experience)
Proficient
1
Work from home, Colombia