Senior Big Data Developer

at Brown Brothers Harriman

Kraków, małopolskie, Poland

Start Date: Immediate
Expiry Date: 07 Nov, 2024
Salary: Not Specified
Posted On: 09 Aug, 2024
Experience: N/A
Skills: Design, Kafka, Spark, Git, Performance Tuning, Design Patterns, Production Implementation, Java, Scala, Python, Snowflake, Communication Skills, Jira, Agile Methodologies, Computer Science, Testing, Programming Languages
Telecommute: No
Sponsor Visa: No

Description:

At BBH we value diverse backgrounds, so if your experience looks a little different from what we’ve outlined and you think you can bring value to the role, we will still welcome your application!
What You Can Expect At BBH:
If you join BBH you will find a collaborative environment that enables you to step outside your role to add value wherever you can. You will have direct access to clients, information and experts across all business areas around the world. BBH will provide you with opportunities to grow your expertise, take on new challenges, and reinvent yourself—without leaving the firm. We encourage a culture of inclusion that values each employee’s unique perspective. We provide a high-quality benefits program emphasizing good health, financial security, and peace of mind. Ultimately we want you to have rewarding work with the flexibility to enjoy personal and family experiences at every career stage. Our BBH Cares program offers volunteer opportunities to give back to your community and help transform the lives of others.
Brown Brothers Harriman is seeking a Senior Big Data Developer with working experience on Cloudera and Snowflake to help develop a new data platform, infoDataFabric. BBH’s data platform serves as the foundation for a key set of offerings running on Oracle Exadata and Cloudera’s distribution.

Key Responsibilities Include:

  • Facilitate the establishment of a secure data platform on BBH’s OnPrem Cloudera infrastructure
  • Document and develop ETL logic and data flows to make data assets, both batch and real-time streaming, easy to consume
  • Leverage components of the Cloudera distribution (including, but not limited to, Sqoop, Hive, Impala, and Spark) to achieve project objectives
  • Practice consistent coding and unit-testing standards
  • Work with distributed teams

What we offer:

  • 2 additional days added to your holiday calendar for Culture Celebration and Community Service
  • Private medical care for you and your family
  • Life Insurance
  • Hybrid Working Opportunities
  • Professional training and qualification support
  • Thrive Wellbeing Program
  • Online benefit platform
  • Contracts for an indefinite period of time with no probation period

Qualifications for your role would include:

  • Bachelor’s degree in Computer Science or related technical field, or equivalent experience
  • 8+ years of experience in IT, primarily in hands-on development
  • Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development
  • Strong hands-on experience with programming languages such as Java, Scala, or Python
  • 4+ years’ real project experience as a data wrangler/engineer across design, development, testing, and production implementation for Big Data projects, processing large volumes of structured/unstructured data
  • Strong hands-on experience with Snowflake, Spark and Kafka
  • Experience with the Oracle database engine, including PL/SQL and performance tuning of SQL queries
  • Experience in designing efficient and robust ETL/ELT workflows and schedulers
  • Strong written and verbal communication skills; strong analytical and problem-solving skills
  • Experience working with Git, Jira, and Agile methodologies

Nice To Have:

  • End-to-end development life-cycle support and SDLC processes
  • Working experience with Data Virtualization tools such as Dremio/Denodo
  • Knowledge of Machine Learning libraries and exposure to Data Mining
  • Working experience with AWS/Azure/GCP
  • Working experience in a Financial industry is a plus

REQUIREMENT SUMMARY

  • Experience: Min: N/A; Max: 5.0 year(s)
  • Industry: Information Technology/IT
  • Functional area: IT Software - Application Programming / Maintenance
  • Role: Software Engineering
  • Education: Graduate, Computer Science or related technical field, or equivalent experience
  • Proficiency: Proficient
  • Vacancies: 1
  • Location: Kraków, małopolskie, Poland