Senior Data Integration Engineer
at EPAM Systems Inc
Praha, Czech Republic
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 22 Apr, 2025 | Not Specified | 23 Jan, 2025 | N/A | Data Analytics, Visualization, HIPAA, Azure, Cloud Services, Security, Relational Databases, Programming Languages, ETL, Talend, Data Solutions, Coding Experience, Business Requirements, SQL Server, AWS, Testing, Scala, R, SQL, Production Experience, Data Migration, OLAP | No | No
Description:
We are currently looking for a Senior Data Integration Engineer to join our Prague office.
Requirements:
- At least 3 years of relevant development experience and practice with data management, data storage, data modeling, data analytics, data migration, and database design
- Practical hands-on experience in developing Data Solutions in at least one major public Cloud environment (AWS, Azure, GCP)
- Practical knowledge of leading cloud data warehousing solutions (e.g. Redshift, Azure Synapse Analytics, Google BigQuery, Snowflake, etc.)
- Production coding experience in one of the data-oriented programming languages
- Solid background in developing Data Analytics & Visualization, Data Integration or DBA & Cloud Migration Solutions
- Experienced and highly self-motivated professional with outstanding analytical and problem-solving skills
- Able to play the role of a Key Developer and Designer, or a Team Lead of 2-5 engineers, and ensure that delivered solutions meet business requirements and expectations
- Able to read and understand project and requirement documentation; able to create design and technical documentation including high-quality documentation of his/her code
- Experienced in working with modern Agile development methodologies and tools
- Able to work closely with customers and other stakeholders
- Advanced knowledge of Data Integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Pentaho, Apache NiFi, KNIME, SSIS, etc.)
- Advanced knowledge of Relational Databases (SQL optimization, Relations, Stored Procedures, Transactions, Isolation Levels, Security)
- Practical hands-on experience developing Data Solutions in Cloud environments (AWS, Azure, GCP): designing, implementing, deploying, and monitoring scalable and fault-tolerant data solutions
- Solid understanding of core cloud technologies and approaches. Awareness of niche and case-specific cloud services
- Ability to troubleshoot outages of average complexity and to identify and trace performance issues
- Pattern-driven solution design, choosing the best fit for the particular business requirements and technical constraints
- Advanced knowledge of Data Security (row-level data security, audit, etc.); a minimal row-level security sketch follows this list
- Production experience with at least one of the data-oriented programming languages: SQL, Python, SparkSQL, PySpark, R, Bash
- Production project experience in Data Management, Data Storage, Data Analytics, Data Visualization, Data Integration, MDM (for MDM profiles), Disaster Recovery, Availability, Operations, Security, etc.
- Experience with data modeling (OLAP, OLTP, ETL, and DWH / Data Lake / Delta Lake / Data Mesh methodologies; Inmon vs. Kimball, staging areas, SCD and other dimension types); see the SCD Type 2 sketch after this list
- Good understanding of online and streaming integrations and micro-batching; understanding of CDC methods and delta extracts (a minimal delta-extract sketch follows this list)
- General understanding of Housekeeping processes (archiving, purging, retention policies, hot/cold data, etc.)
- Good understanding of CI/CD principles and best practices; understanding of the canary release, blue-green, and red-black deployment models
- Data-oriented focus and compliance awareness (e.g. PII, GDPR, HIPAA)
- Experience in direct customer communications
- Experienced in different business domains
- English proficiency
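As a concrete illustration of the row-level data security item above, the sketch below shows how a per-tenant policy might be enabled on a PostgreSQL table via psycopg2. The `orders` table, the `tenant_id` column, the `app.current_tenant` session setting, and the connection string are hypothetical placeholders; the exact mechanism depends on the target database engine.

```python
import psycopg2

# Placeholder DSN; in practice this would come from configuration / a secret store.
conn = psycopg2.connect("dbname=demo user=app password=secret host=localhost")

DDL = """
ALTER TABLE orders ENABLE ROW LEVEL SECURITY;

-- Each session only sees rows of the tenant it declares via a session setting.
-- Note: policies are not applied to the table owner unless FORCE ROW LEVEL SECURITY is set.
CREATE POLICY tenant_isolation ON orders
    USING (tenant_id = current_setting('app.current_tenant')::int);
"""

with conn, conn.cursor() as cur:
    cur.execute(DDL)
    cur.execute("SET app.current_tenant = '42';")
    cur.execute("SELECT count(*) FROM orders;")   # counts only tenant 42's rows
    print(cur.fetchone()[0])
```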
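For the dimension-modeling item (SCD and other dimension types), here is a minimal PySpark sketch of a Slowly Changing Dimension Type 2 load: the open row of a changed business key is closed out and a new current row is appended. The `customer` dimension, its columns, and the inline sample data are illustrative assumptions; a production implementation would typically rely on a warehouse-native MERGE instead.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

# Hypothetical dimension with one open (is_current = true) row per business key.
dim = spark.createDataFrame(
    [(1, "Alice", "Praha", "2023-01-01", None, True)],
    "customer_id INT, name STRING, city STRING, valid_from STRING, valid_to STRING, is_current BOOLEAN",
)

# Incoming batch: customer 1 moved to Brno.
updates = spark.createDataFrame(
    [(1, "Alice", "Brno", "2024-06-01")],
    "customer_id INT, name STRING, city STRING, load_date STRING",
)

d, u = dim.alias("d"), updates.alias("u")

# 1) Close the open row of every key whose tracked attribute changed.
expired = (
    d.join(u, "customer_id")
     .where(F.col("d.is_current") & (F.col("d.city") != F.col("u.city")))
     .select(
         "customer_id",
         F.col("d.name").alias("name"),
         F.col("d.city").alias("city"),
         F.col("d.valid_from").alias("valid_from"),
         F.col("u.load_date").alias("valid_to"),   # the change date closes the old version
         F.lit(False).alias("is_current"),
     )
)

# 2) Append a new current row for each changed key.
fresh = (
    u.join(expired.select("customer_id"), "customer_id")
     .select(
         "customer_id", "name", "city",
         F.col("load_date").alias("valid_from"),
         F.lit(None).cast("string").alias("valid_to"),
         F.lit(True).alias("is_current"),
     )
)

# 3) Keep every dimension row that is not being expired (collect() is fine for a sketch;
#    a join or MERGE would be used at scale).
changed_ids = [r["customer_id"] for r in expired.select("customer_id").collect()]
untouched = dim.where(~(F.col("is_current") & F.col("customer_id").isin(changed_ids)))

new_dim = untouched.unionByName(expired).unionByName(fresh)
new_dim.orderBy("customer_id", "valid_from").show()
```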
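And for the CDC / delta-extract item, a watermark-based incremental extract can be sketched with nothing more than the Python standard library. SQLite stands in for the source system here, and the `orders` table with its `updated_at` column is a made-up example.

```python
import sqlite3

# In-memory SQLite as a stand-in source system.
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
    INSERT INTO orders VALUES
        (1, 10.0, '2024-06-01T08:00:00'),
        (2, 25.5, '2024-06-02T09:30:00');
""")

def extract_delta(conn, last_watermark: str):
    """Return only rows changed since the previous run, plus the new watermark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

# First run: everything after the initial watermark counts as new.
rows, watermark = extract_delta(src, "1970-01-01T00:00:00")
print(rows, watermark)

# Subsequent runs reuse the stored watermark and pick up only later changes.
rows, watermark = extract_delta(src, watermark)
print(rows)   # [] until something changes again
```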
Responsibilities:
- Design and implement Data Integration solutions, model databases, and contribute to building data platforms using classic Data technologies and tools (Databases, ETL/ELT technology & tools, MDM tools, etc.) as well as implementing modern Cloud or Hybrid data solutions (a minimal ETL sketch follows this list)
- Work with product and engineering teams to understand data product requirements, evaluate new features and architecture, and help drive decisions
- Build collaborative partnerships with architects and key individuals within other functional groups
- Perform detailed analysis of business problems and technical environments and use this in designing high-quality technical solutions
- Actively participate in code review and testing of solutions to ensure they meet specification and quality requirements
- Build and foster a high-performance engineering culture, supervise junior/middle team members, and provide them with technical leadership
- Write project documentation
- Be self-managing: implement functionality without supervision, test your own work thoroughly using test cases, and/or supervise less experienced colleagues
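As a minimal illustration of the first responsibility, the sketch below shows the shape of a simple batch ETL step in PySpark: read raw CSV files, standardize types, deduplicate, and write partitioned Parquet to a curated zone. The bucket paths and column names are placeholders, not part of the actual project.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: raw CSV drop zone (hypothetical path).
raw = (
    spark.read
    .option("header", True)
    .csv("s3a://raw-bucket/sales/*.csv")
)

# Transform: enforce types, drop duplicates, filter out invalid rows.
cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("sale_date", F.to_date("sale_date"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Load: partitioned Parquet in the curated zone (hypothetical path).
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("sale_date")
    .parquet("s3a://curated-bucket/sales/")
)
```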
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
Praha, Czech Republic