Data Engineer I at KROLL ASSOCIATES S PTE LTD
Canada -
Full Time


Start Date

Immediate

Expiry Date

28 Jan, 26

Salary

0.0

Posted On

30 Oct, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Pipeline Construction, ETL, Data Integration, Data Warehousing, Data Quality, Data Security, Performance Optimization, SQL, Python, C#/.NET, Java, REST API Development, Azure, Data Governance, Collaboration, Agile Methodologies

Industry

Business Consulting and Services

Description
Kroll’s Private Capital Markets (PCM) platform is transforming private asset valuation and portfolio workflows for alternative asset managers. We’re seeking a Data Engineer to design and implement secure, scalable data solutions across the PCM platform on Azure. You will collaborate closely with Product and Implementation teams to deliver client-ready analytics, robust APIs, and high-performance data pipelines that power financial workflows spanning private equity, fixed income, derivatives, and structured products. You’ll also help establish engineering standards and communities of practice across a global team of data professionals and developers. This is a hybrid role, requiring 2–3 days of on-site presence each week.

Day-to-Day Responsibilities

- Data Pipeline Construction: Design, build, and maintain reliable data pipelines to move, transform, and integrate data from diverse sources into data warehouses or lakes.
- ETL and Data Integration: Develop and optimize ETL/ELT processes using tools such as Azure Data Factory, Databricks, Synapse, DBT, Airflow, or Informatica.
- Data Warehousing: Model and manage data warehouses to ensure efficient querying, high performance, and data quality using platforms like Azure Synapse, Snowflake, Redshift, or BigQuery.
- Data Quality & Monitoring: Implement validation, cleaning, and monitoring processes to ensure data accuracy, consistency, and reliability.
- Data Security: Apply robust data governance practices, manage access permissions, and ensure compliance with privacy regulations.
- Performance & Scalability: Optimize systems to handle growing data volumes and support evolving business needs.
- Lead and mentor cross-functional teams, driving adoption of modern data technologies and best practices.
- Spearhead greenfield initiatives that align with strategic business objectives, including innovation to support revenue growth and market expansion.
- Own key functional areas of the PCM platform to ensure operational efficiency, reliability, and peak performance.
- Promote collaboration and excellence by participating in architectural reviews, defining technical standards, and contributing to a culture of continuous improvement.

Essential Traits

Technical Expertise

- Proven experience building ETL/ELT pipelines using Azure, AWS, or Databricks platforms.
- Strong proficiency in SQL (T-SQL, PL/pgSQL, Spark SQL) for data transformation and optimization.
- Skilled in Python, C#/.NET, or Java for data engineering and backend services.
- Hands-on experience with REST API development, Python SDKs, and containerization tools such as Docker and Kubernetes.
- Working knowledge of CI/CD pipelines, Git, and Azure DevOps.

Data Systems & Architecture

- Experience with Microsoft SQL Server, PostgreSQL, and cloud-native databases.
- Understanding of data warehousing, dimensional modeling, and data lake architectures.
- Hands-on experience with data pipeline orchestration tools like Airflow, Ascend, or Azure Synapse.
- Exposure to data quality frameworks and monitoring best practices.

Collaboration & Delivery

- Partner effectively with Product Owners and end users in an agile environment.
- Participate in code reviews, technical design sessions, and architecture discussions.
- Demonstrated ability to manage multiple priorities, solve complex problems, and deliver scalable solutions.
- Master’s degree in Computer Science, Data Science, Mathematics, Statistics, or a related field.
- Minimum 3 years of hands-on data engineering experience, ideally within financial services.
- Relevant cloud (Azure/AWS) or data engineering certifications preferred.
- Ability to handle confidential and sensitive information with discretion.

About Kroll

Join the global leader in risk and financial advisory solutions—Kroll. With a nearly century-long legacy, we blend trusted expertise with cutting-edge technology to navigate and redefine industry complexities.
As a part of One Team, One Kroll, you'll contribute to a collaborative and empowering environment, propelling your career to new heights. Ready to build, protect, restore and maximize our clients’ value? Your journey begins with Kroll. We are proud to be an equal opportunity employer and will consider all qualified applicants regardless of gender, gender identity, race, religion, color, nationality, ethnic origin, sexual orientation, marital status, veteran status, age or disability. In order to be considered for a position, you must formally apply via careers.kroll.com.

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities
Design, build, and maintain reliable data pipelines and develop ETL/ELT processes for the PCM platform. Collaborate with cross-functional teams to deliver client-ready analytics and ensure data quality and security.