Senior ETL Data Engineer at Empower Annuity Insurance Company of America
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

26 Mar, 26

Salary

0.0

Posted On

26 Dec, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, AWS, Glue, EMR, Lambda, S3, Kafka, CI/CD, Data Engineering, Software Architecture, Agile, Debugging, Data Quality, Performance Tuning, Technical Leadership

Industry

Financial Services

Description
Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself.

The Senior Software Engineer - Data / ETL is a seasoned, hands-on engineer responsible for designing and implementing complex backend data pipelines and ETL solutions. This role leads technical delivery for key components, provides guidance to other engineers, and helps shape the data architecture on AWS. The senior engineer is a key contributor to the reliability, scalability, and performance of data platforms that support analytics, reporting, and downstream applications.

What you will do:

Lead the design, development, and enhancement of ETL pipelines and backend data workflows using Python and SQL.
Design and optimize large-scale data processing solutions on AWS using services such as Glue, EMR, Lambda, and S3.
Lead the implementation and improvement of CI/CD pipelines to automate build, test, and deployment for data and application code.
Collaborate with architects, product owners, and business partners to refine requirements and translate them into scalable, secure data solutions that follow approved architecture patterns.
Write high-quality, maintainable, and well-tested code, and set a strong example for coding and design practices.
Investigate and resolve complex technical issues, data quality problems, and performance bottlenecks across environments.
Participate in and often lead code reviews, helping to maintain high standards for code quality, testing, and documentation.
Mentor junior and mid-level engineers through pairing, feedback, and knowledge sharing.
Contribute to the evolution of data and application architecture, development processes, and engineering best practices.
Collaborate with cross-functional teams to debug, improve, and support data products and services in production.
Continuously learn and adopt modern technologies, tools, and development practices relevant to data engineering and cloud platforms.

What you will bring:

Bachelor's degree in Computer Science or a related field, or equivalent practical experience.
5+ years of experience in software development, with substantial experience in backend or data engineering.
Strong proficiency in Python for backend development and data processing.
Strong proficiency in SQL, including performance tuning and working with large or complex datasets.
Hands-on experience with AWS services, ideally including Glue, EMR, Lambda, and S3, as well as Kafka.
Demonstrated experience designing and maintaining CI/CD pipelines and related tooling.
Deep understanding of software architecture and design principles for data-intensive systems.
Strong problem-solving, debugging, and analytical skills, with a track record of resolving complex issues.
Proven experience providing technical leadership on features or projects and influencing technical decisions within a team.
Experience working in Agile or iterative delivery environments.

What will set you apart:

Experience with Java or JavaScript is a plus, but not required.

This job description is not intended to be an exhaustive list of all duties, responsibilities, and qualifications of the job. The employer has the right to revise this job description at any time. You will be evaluated in part based on your performance of the responsibilities and/or tasks listed in this job description. You may be required to perform other duties that are not included in this job description. The job description is not a contract for employment, and either you or the employer may terminate employment at any time, for any reason, per the terms and conditions of your employment contract.

We are an equal opportunity employer with a commitment to diversity. All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to age, race, color, national origin, ancestry, sex, sexual orientation, gender, gender identity, gender expression, marital status, pregnancy, religion, physical or mental disability, military or veteran status, genetic information, or any other status protected by applicable state or local law.

Joining our talent community will allow you to stay connected via email and receive news and updates. Want the latest money news and views shaping how we live, work, and play? Sign up for Empower's free newsletter and check out The Currency.

Remote and Hybrid Positions

For remote and hybrid positions you will be required to provide reliable high-speed internet with a wired connection, as well as a place in your home to work with limited disruption. You must have reliable connectivity from an internet service provider that offers fiber, cable, or DSL internet. Other necessary computer equipment will be provided. You may be required to work in the office if you do not have an adequate home work environment and the required internet connection.

Follow Empower on Facebook, LinkedIn, X, Instagram, and Glassdoor.

How To Apply:

In case you would like to apply to this job directly from the source, please click here.

Responsibilities
Lead the design, development, and enhancement of ETL pipelines and backend data workflows. Collaborate with architects and business partners to create scalable and secure data solutions.