Data Engineer
at Rich Data Co
North Sydney, New South Wales, Australia
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 30 Apr, 2025 | Not Specified | 31 Jan, 2025 | 2 year(s) or above | PostgreSQL, Data Transformation, JavaScript, Business Requirements, Design, Power BI, User Experience, AWS, Data Processing, Tableau, Communication Skills, Scalability, Security, Optimization, Dashboards, Visualization, Business Insights, Airflow, Automation, Angular | No | No
Description:
ABOUT RDC
Rich Data Co (RDC): Delivering the Future of Credit, Today! We believe credit should be accessible, fair, inclusive, and sustainable. We are passionate about AI and about developing new techniques that leverage traditional and non-traditional data to reach the right decision in a clear and explainable way.

Leading global financial institutions use RDC's AI decisioning platform to offer credit in a way that aligns with their customers' needs and expectations. RDC uses explainable AI to give banks deeper insight into borrower behaviour, enabling more accurate and efficient lending decisions for businesses.
EXPERIENCE & SKILLS
- 2+ years of proven experience in Apache Airflow for building and managing data pipelines.
- Experience with both RDBMS (e.g., MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB) databases.
- Experience with data pipelines or ETL processes, ensuring data is available for dashboards and reports.
- Experience with data visualization tools such as Power BI, Tableau, or similar platforms.
- Strong proficiency in Python, with experience in writing efficient and scalable data processing code.
- Strong JavaScript skills for dashboard development.
- Strong SQL skills, with the ability to write complex queries and optimize them for performance.
- Excellent communication skills to collaborate with cross-functional teams and translate business requirements into technical designs.
- Strong problem-solving skills, with the ability to troubleshoot technical issues and optimize data pipelines and dashboards.
- Experience working in cloud-based environments such as AWS, Azure, or GCP.
- Experience with agile development methodologies, working within fast-paced, iterative environments.
- Familiarity with API integration for real-time data fetching and interaction.
- Knowledge of DevOps practices and CI/CD pipelines.
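To give a concrete flavour of the Python and pipeline skills listed above, here is a minimal sketch of the kind of pure-Python transform an Airflow task (e.g. a `PythonOperator` callable) might run. The record layout and field names (`raw_loans`, `status`, `amount`) are hypothetical illustrations, not taken from the posting.

```python
# Illustrative only: a defensive aggregation step of the kind a data
# pipeline task might perform before loading results into a dashboard.
from collections import defaultdict

def summarise_loans(raw_loans):
    """Aggregate loan amounts per status, skipping malformed rows."""
    totals = defaultdict(float)
    for row in raw_loans:
        try:
            totals[row["status"]] += float(row["amount"])
        except (KeyError, TypeError, ValueError):
            continue  # malformed row: drop it rather than fail the whole task
    return dict(totals)

raw_loans = [
    {"status": "approved", "amount": "1000.50"},
    {"status": "approved", "amount": "2499.50"},
    {"status": "declined", "amount": "800"},
    {"status": "approved", "amount": None},  # malformed: silently skipped
]
print(summarise_loans(raw_loans))
# {'approved': 3500.0, 'declined': 800.0}
```

Skipping bad rows (instead of raising) is one design choice for keeping a scheduled pipeline run resilient; in practice the team's error-handling policy would decide between dropping, quarantining, or failing fast.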
Responsibilities:
PURPOSE OF ROLE
As a Data Engineer at RDC, you will be responsible for building and maintaining data pipelines, managing data integration, and developing dashboards and reports that deliver critical business insights. Your role will blend data engineering expertise with report development skills, ensuring efficient data flow and visualization. With 2+ years of experience in Airflow, Python, and JavaScript, you’ll be expected to:
- Design, develop, and maintain ETL pipelines using Apache Airflow to ensure efficient and scalable data processing.
- Use strong Python programming skills to write clean, maintainable code for data transformation and automation.
- Build dynamic and interactive dashboards using JavaScript and modern frameworks like React or Angular, ensuring an intuitive user experience.
- Manage data storage, retrieval, and optimization in both RDBMS (e.g., MySQL, PostgreSQL) and NoSQL (e.g., DynamoDB) databases.
- Collaborate with cross-functional teams to gather requirements, integrating data from various sources to create comprehensive reports and dashboards.
- Ensure scalability, performance, and security in data pipelines and visualizations while working in cloud-based environments.
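The storage and optimization responsibility above can be sketched with a small, self-contained example. It uses SQLite (Python's built-in `sqlite3` module) purely for illustration; the table and index names are hypothetical, and the same idea (index the filtered column so the engine avoids a full scan) carries over to MySQL or PostgreSQL.

```python
# Sketch, not RDC's schema: show that an index lets the engine satisfy a
# filtered aggregate without scanning every row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE applications (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO applications (region, amount) VALUES (?, ?)",
    [("NSW", 1200.0), ("NSW", 800.0), ("VIC", 500.0)],
)
# Without this index, the WHERE clause below forces a full table scan.
conn.execute("CREATE INDEX idx_applications_region ON applications (region)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM applications WHERE region = ?",
    ("NSW",),
).fetchall()
total = conn.execute(
    "SELECT SUM(amount) FROM applications WHERE region = ?", ("NSW",)
).fetchone()[0]
print(total)  # 2000.0
print(plan)   # plan detail mentions idx_applications_region
```

Checking `EXPLAIN QUERY PLAN` output (or `EXPLAIN ANALYZE` in PostgreSQL) before and after adding an index is a routine way to verify a query actually uses it.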
ACCOUNTABILITY & OUTCOMES
- Team Stand-Up: Start your day by participating in an agile stand-up, collaborating with product, sales, and customer support teams to discuss ongoing data engineering and reporting projects.
- Design & Build: Develop ETL pipelines using Apache Airflow and design real-time, interactive dashboards using JavaScript and Python to meet business needs. Work on creating data integration solutions that ensure timely and accurate reporting.
- Data Integration: Work closely with internal teams to ensure seamless data integration from various sources, optimizing data pipelines to ensure data flows correctly into your dashboards and reports.
- Collaboration & Refinement: Collaborate with stakeholders to gather data requirements and refine report outputs based on feedback. Participate in agile ceremonies, helping clarify user stories and project timelines.
- Learning & Continuous Improvement: Stay updated on the latest trends in data engineering and dashboard development, focusing on optimizing performance and enhancing the user experience.
REQUIREMENT SUMMARY
- Experience: 2.0 to 7.0 year(s)
- Industry: Information Technology/IT
- Category: IT Software - Other
- Functional Area: Software Engineering
- Education: Graduate
- Proficiency: Proficient
- Vacancies: 1
- Location: North Sydney NSW 2060, Australia