Lead Data Architect

at FalconTek

Remote, Oregon, USA - 00000

Start Date: Immediate
Expiry Date: 11 Nov, 2022
Salary: Not Specified
Posted On: 11 Aug, 2022
Experience: 3 year(s) or above
Skills: AWS, Python
Telecommute: No
Sponsor Visa: No
Required Visa Status:
  • US Citizen
  • Student Visa
  • OPT
  • H4 (Spouse of H1B)
  • GC (Green Card)

Employment Type:
  • Full Time
  • Part Time
  • Permanent
  • Independent – 1099
  • Contract – W2
  • C2H Independent
  • C2H W2
  • Contract – Corp 2 Corp
  • Contract to Hire – Corp 2 Corp


Position Objective


  • You will help improve our data architecture and data pipelines, enhance data security, optimize data models and database designs, and write hands-on code to process data more efficiently, supporting the storage, processing, and analysis of terabytes of personal health information (PHI) across millions of users
  • You will work with architects and the CTO team to assess and research technologies, AWS services, and frameworks for our Data, Cloud, and DevSecOps pipelines



  • Documenting, improving, and maintaining data strategies and artifacts, including logical and physical data models, data dictionary, data roadmap, and data security policies, using industry best practices and adhering to federal standards
  • Managing AWS data stores such as EMR/Hive, RDS, and Redshift, while maintaining data governance, retention policies, and lifecycles
  • Collecting data access patterns and reviewing current data models to optimize designs for customer use cases
  • Standardizing data ingestion and processing pipelines to scale with increased usage and utilization
  • Auditing and reverse-engineering business rules in legacy systems, and building data connectors for integrating them into the Data and Analytics platform
  • Providing subject matter expertise and leading data and architecture review meetings
  • Reviewing and improving data governance policies and processes
  • Establishing and adhering to Service Level Agreements (SLAs) tied to cloud resources’ Key Performance Indicators (KPIs), such as Recovery Point Objective (RPO), Recovery Time Objective (RTO), and Mean Time to Repair (MTTR)

Education & Experience


  • B.S. or M.S. in Computer Science or a related field
  • 10+ years of software engineering experience (or an M.S. in Computer Science with 7+ years of relevant experience)
  • 5+ years of experience in cloud data architecture (AWS preferred) and big data technologies, including Databricks, EMR, Hive, Spark, AWS Glue, Redshift, and Airflow
  • 5+ years of experience in object-oriented Python software development


  • Experience working in AWS
  • Experience with data orchestration frameworks such as Apache Airflow and Luigi
  • Experience with architecting, scaling, and managing multi-tenant data platforms
  • Experience with data lake architectures and building ETL pipelines to ingest, process, and store data
  • Experience with analytical tools such as SAS Viya, Databricks, AWS Sagemaker, AWS QuickSight, and EMR Notebook/Studio
  • Knowledge of AuthN and AuthZ systems, including Active Directory, Okta, and AWS IAM Policies/Roles using attribute-based access controls
  • Strong experience with relational databases and advanced SQL
  • Working knowledge of Jenkins CI/CD
  • Proven ability to communicate technical concepts clearly and concisely in oral and written form
  • Experience with DevOps Configuration Management and IaC tools such as Ansible, Terraform, and CloudFormation
  • Experience with security scanning tools (e.g., Nessus, BurpSuite, Netsparker, OWASP tools) is a plus
  • Experience with healthcare quality data, including provider data, beneficiary data, claims data, and quality measure data, such as Medicare Parts A, B, C, and D datasets, is a plus

Job Type: Full-time
Pay: $160,000.00 per year


Schedule:
  • 8-hour shift


Experience:
  • AWS: 5 years (Required)
  • Python: 5 years (Required)
  • Apache Airflow or Luigi: 3 years (Preferred)

Work Location: Remote




Min: 3.0 – Max: 10.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Software Engineering




Remote, USA