Big Data Tech Lead - Vice President at Citi
Tampa, Florida, United States
Full Time


Start Date

Immediate

Expiry Date

13 Mar, 26

Salary

0.0

Posted On

13 Dec, 25

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Big Data, ETL, ELT, Apache Spark, Hadoop, Kafka, Python, Scala, SQL, Oracle, PostgreSQL, Apache Airflow, Docker, Kubernetes, Hazelcast, Redis

Industry

Financial Services

Description
Partner with multiple management teams to ensure appropriate integration of functions to meet goals, and identify and define necessary system enhancements to deploy new products and process improvements

Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards

Provide expertise in the area and advanced knowledge of applications programming; ensure application design adheres to the overall architecture blueprint

Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation

Develop comprehensive knowledge of how areas of the business, such as architecture and infrastructure, integrate to accomplish business goals

Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions

Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary

Data Pipeline Development: Design, build, and maintain scalable ETL/ELT pipelines to ingest, transform, and load data from multiple sources

Big Data Infrastructure: Develop and manage large-scale data processing systems using frameworks such as Apache Spark, Hadoop, and Kafka

Qualifications

Proficiency in programming languages such as Python or Scala

Strong expertise in data processing frameworks such as Apache Spark and Hadoop

Expertise in Data Lakehouse technologies (Apache Iceberg, Apache Hudi, Trino)

Expertise in SQL and database technologies (e.g., Oracle, PostgreSQL)

Expertise with the data orchestration tool Apache Airflow is mandatory

Familiarity with containerization (Docker, Kubernetes) is a plus

Experience with distributed caching solutions (Hazelcast or Redis)

Prior experience building distributed, multi-tier applications is highly desirable

Experience building highly performant and scalable applications is a plus

Bachelor's degree/University degree or equivalent experience; Master's degree preferred

For complementary skills, please see above and/or contact the recruiter.

Anticipated Posting Close Date: Jan 02, 2026
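The ETL/ELT pipeline duties listed above follow the standard extract-transform-load pattern. As a minimal sketch only (standard-library Python with SQLite standing in for the real sources and targets; the table and field names are hypothetical, and a production pipeline of this scale would use Spark and Airflow as the posting specifies):

```python
import sqlite3

def extract():
    # Stand-in for reading from Kafka, files, or an upstream database.
    return [
        {"trade_id": 1, "amount": "1250.50", "currency": "usd"},
        {"trade_id": 2, "amount": "980.00", "currency": "eur"},
    ]

def transform(rows):
    # Normalize types and casing before loading.
    return [
        (r["trade_id"], float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(rows, conn):
    # Load the transformed rows into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades "
        "(trade_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    # Wire the three stages together: extract -> transform -> load.
    load(transform(extract()), conn)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM trades").fetchone())
```

In a real deployment each stage would be an independent, retryable task orchestrated by an Airflow DAG, with Spark handling the transform at scale.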
Responsibilities
The role involves partnering with management teams to integrate functions and enhance systems for new products and process improvements. Additionally, it includes designing and maintaining scalable data pipelines and managing large-scale data processing systems.