Senior Data Engineer at BID Operations
Sydney, New South Wales, Australia
Full Time


Start Date

Immediate

Expiry Date

29 May, 26

Salary

0.0

Posted On

28 Feb, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Kafka, RabbitMQ, Airflow, ClickHouse, ETL/ELT, Stream Processing, Python, SQL, Data Governance, Cloud Environments, Data Pipelines, Real-time Messaging, Schema Design, Query Performance, Monitoring

Industry

Information Services

Description
About the company:
At BID Operations, we are passionate about supporting our clients on their journey towards success. Our mission is to empower you to thrive by handling the essential yet time-consuming aspects of your business operations, allowing you to concentrate on strategic growth and innovation.

About the role:
As a Senior Data Engineer, you will be a technical leader responsible for the architecture, scalability, and reliability of our high-throughput, real-time data ecosystem. You will oversee the evolution of our data infrastructure, leveraging Kafka, RabbitMQ, Airflow, and ClickHouse to power mission-critical financial analytics. Your role is to bridge the gap between complex business requirements and high-performance engineering, ensuring our data pipelines can handle the rigours of real-time financial data processing.

Key Responsibilities:
- Lead the design and evolution of highly scalable, fault-tolerant ETL/ELT pipelines.
- Drive the strategy for real-time messaging and stream processing using Kafka and RabbitMQ to ensure sub-second data availability.
- Act as the subject matter expert for ClickHouse, optimising complex schema designs, indexing strategies, and query performance for large-scale financial datasets.
- Oversee the deployment of data services within cloud environments, implementing advanced security protocols and data governance standards essential for the finance industry.
- Collaborate with senior leadership to align data strategy with business objectives.
- Mentor data engineers through code reviews and technical guidance.
- Implement advanced monitoring and automated recovery systems to ensure the integrity and quality of high-stakes financial data.

Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Proven experience in data engineering, with a strong background in designing and implementing ETL processes within cloud environments.
- Experience in the finance or trading technology sector, with a proven track record of handling real-time market or transactional data.
- Strong programming skills in Python, with experience developing robust, maintainable, and scalable data processing pipelines.
- Extensive SQL knowledge and experience.
- Excellent problem-solving skills and the ability to work collaboratively in a team environment.
- Strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.

Benefits:
- Hybrid working arrangement
- Opportunities for enriching career growth, including exposure to regional contexts
- Complimentary snacks and beverages in the office pantry
- Healthcare coverage (medical, dental, optical) and gym benefits
- Flexible smart casual dress code
- Young, vibrant, and open work culture
Responsibilities
The Senior Data Engineer will lead the design and evolution of scalable, fault-tolerant ETL/ELT pipelines and drive the strategy for real-time messaging using Kafka and RabbitMQ. This role involves acting as the subject matter expert for ClickHouse, optimising performance for large-scale financial datasets, and overseeing data service deployment in cloud environments.