Senior Data Modeler - Federal Labor Category at JP Techno Park
Washington, DC 20551, USA - Full Time


Start Date: Immediate

Expiry Date: 16 Nov 2025

Salary: $64.00 per hour

Posted On: 16 Aug 2025

Experience: 10 year(s) or above

Remote Job: Yes

Telecommute: Yes

Sponsor Visa: No

Skills: Python, Data Science, ETL, AWS, Scala, Spark

Industry: Information Technology/IT

Description

Senior Data Modeler
Personnel Qualifications:

  • At least ten years of experience in AI, Data Science, and Software Engineering, including knowledge of the data ecosystem
  • Bachelor's degree in Computer Science, Information Systems, or another related field, or equivalent related work experience
  • Data Modeling: Expertise in designing and implementing conceptual, logical, and physical data models optimized for storage, retrieval, and analytics within Databricks on AWS
  • Databricks Proficiency: In-depth knowledge and hands-on experience with the AWS Databricks platform, including Databricks SQL, Runtime, clusters, notebooks, and integrations
  • ETL (Extract, Transform, Load) Processes: Proficiency in developing ETL pipelines that extract data from various sources, transform it per business requirements, and load it into the central data lake using Databricks tools and Spark (illustrated in the sketch after this list)
  • Data Integration: Experience integrating data from heterogeneous sources (relational databases, APIs, files) into Databricks while ensuring data quality, consistency, and lineage
  • Performance Optimization: Ability to optimize data processing workflows and SQL queries in Databricks for performance, scalability, and cost-effectiveness, leveraging partitioning, clustering, caching, and Spark optimization techniques
  • Data Governance and Security: Understanding of data governance principles and ability to implement security measures that ensure data integrity, confidentiality, and compliance within the centralized data lake environment
  • Advanced SQL and Spark Skills: Proficiency in writing complex SQL queries and Spark code (Scala/Python) for data manipulation, transformation, aggregation, and analysis within Databricks notebooks
  • Cloud Architecture: Understanding of cloud computing principles, AWS architecture, and AWS services for designing scalable and resilient data solutions
  • Data Visualization: Basic knowledge of data visualization tools (e.g., Tableau) to create insightful visualizations and dashboards for analysis and reporting
  • Familiarity with government cloud deployment regulations and compliance policies such as FedRAMP and FISMA
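For illustration only, the following is a minimal PySpark sketch of the kind of ETL and optimization work described above (partitioned writes, caching a reused intermediate result). The S3 paths, table layout, and column names are hypothetical placeholders, not details from this posting.

    # Illustrative sketch only; paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

    # Extract: load raw records from a hypothetical landing zone.
    raw = spark.read.json("s3://example-bucket/landing/transactions/")

    # Transform: basic cleansing; cached because it feeds two writes below.
    clean = (
        raw.dropDuplicates(["transaction_id"])
           .withColumn("txn_date", F.to_date("txn_timestamp"))
           .cache()
    )
    daily_totals = clean.groupBy("txn_date", "account_id").agg(
        F.sum("amount").alias("daily_amount")
    )

    # Load: write Delta output partitioned by date to support pruned queries.
    clean.write.format("delta").mode("overwrite") \
         .partitionBy("txn_date").save("s3://example-bucket/lake/transactions/")
    daily_totals.write.format("delta").mode("overwrite") \
         .partitionBy("txn_date").save("s3://example-bucket/lake/daily_totals/")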
Capabilities:

  • Leverage financial industry expertise to define conceptual, logical, and physical data models in Databricks to support new and existing business domains
  • Work with product owners, system architects, data engineers, and vendors to create data models optimized for query performance, compute, and storage costs
  • Define best practices for the implementation of the Bronze/Silver/Gold data layers of the lakehouse (see the sketch after this list)
  • Provide data model documentation and artifacts generated from the data, including a data dictionary and data definitions
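As a rough sketch of the Bronze/Silver/Gold layering referenced above, assuming Delta tables in a Databricks workspace where the bronze/silver/gold databases already exist: the schema, table, and column names below are hypothetical, not defined by this posting.

    # Illustrative medallion-layer sketch; all names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

    # Bronze: raw data ingested as-is, with load metadata for lineage.
    bronze = (
        spark.read.json("s3://example-bucket/landing/accounts/")
             .withColumn("_ingested_at", F.current_timestamp())
    )
    bronze.write.format("delta").mode("append").saveAsTable("bronze.accounts")

    # Silver: cleansed, conformed records (duplicates removed, types cast).
    silver = (
        spark.table("bronze.accounts")
             .dropDuplicates(["account_id"])
             .withColumn("opened_date", F.to_date("opened_date"))
    )
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.accounts")

    # Gold: business-level aggregates ready for reporting and dashboards.
    gold = spark.table("silver.accounts").groupBy("branch_id").agg(
        F.count("account_id").alias("account_count")
    )
    gold.write.format("delta").mode("overwrite").saveAsTable("gold.accounts_by_branch")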
Job Type: Contract
Pay: $60.00 - $64.00 per hour
Expected hours: 40 per week

Experience:

  • Data Modeler: 10 years (Required)
  • Federal: 10 years (Required)
  • Labor: 10 years (Required)
  • AI: 10 years (Required)
  • Data Science: 10 years (Required)
  • Data Ecosystem: 10 years (Required)
  • AWS: 10 years (Required)
  • Databricks: 10 years (Required)
  • ETL: 10 years (Required)
  • Spark: 10 years (Required)
  • Scala: 10 years (Required)
  • Python: 10 years (Required)
  • Financial Industry: 10 years (Required)

Work Location: In person

Responsibilities

Please refer to the job description above for details.
