Start Date
Immediate
Expiry Date
08 Oct, 25
Salary
0.0
Posted On
09 Jul, 25
Experience
0 year(s) or above
Remote Job
Yes
Telecommute
Yes
Sponsor Visa
No
Skills
Data Processing, Data Governance, Machine Learning, SQL, Data Science, Data Analysis, Computer Science, AWS, Database Systems, Data Engineering, GitHub, Microsoft SQL Server, Microservices, Data Analytics, Spatial Data Management
Industry
Information Technology/IT
POSITION SUMMARY
The Senior Data Analyst/Data Computer Systems Engineer will provide expert-level data development and data systems engineering services for the National Park Service's enterprise data management systems. This position is responsible for developing and maintaining complex data processing software, managing data pipelines, and leading data migration activities across both alphanumeric and geospatial data systems. The role requires advanced technical expertise in SQL, Python programming, and enterprise-level spatial data management to support mission-critical data infrastructure and business requirements.
REQUIRED QUALIFICATIONS:
Bachelor’s degree in Computer Science, Data Science, Engineering, or related technical field
Minimum 5 years of experience in data analysis and engineering activities including data schema development
Expert-level proficiency in SQL, Python programming, and database procedural languages
Extensive experience with data migration activities and ETL/ELT processes
Advanced knowledge of Microsoft SQL Server and PostgreSQL database systems
Strong experience with enterprise spatial data management and geodatabases
Proven ability to design and implement scalable data architectures
Experience with cloud platforms (Microsoft Azure or AWS) for data processing
Active NAC or NACI security clearance eligibility
DESIRED QUALIFICATIONS:
Advanced degree in Computer Science, Data Engineering, or related field
Certification in cloud platforms (Azure Data Engineer, AWS Data Analytics)
Experience with ESRI Enterprise Geodatabases and ArcGIS platforms
Knowledge of modern data stack technologies and data lake architectures
Experience with Azure DevOps Server and GitHub for code management
Understanding of data governance and metadata management principles
Experience with API development and microservices architectures
Knowledge of machine learning and advanced analytics techniques
RESPONSIBILITIES:
Design and develop complex data models, database schemas, and data pipelines for enterprise systems
Lead data migration planning and implementation from legacy to modern data structures
Write advanced SQL, T-SQL, and PL/pgSQL procedural code for Microsoft SQL Server and PostgreSQL databases
Architect and implement enterprise-level spatial data management solutions within RDBMS and geodatabases
Build advanced code and infrastructure for optimized ETL processes using cloud and SQL technologies
Conduct complex data quality analysis and implement comprehensive business and quality rules
Design and implement quality management processes for large-scale data migration activities
Lead efforts to redesign infrastructure for greater scalability and to optimize data delivery
Automate manual processes and implement internal process improvements
Provide technical leadership for data-related issues and infrastructure needs
Collaborate with stakeholders to identify process improvement opportunities and propose system modifications
Structure and optimize both small and large data sets for efficient retrieval and analysis
Define and maintain complex domain definition requirements and permissible values
Lead development of comprehensive metadata records for data management systems and standards
Mentor junior team members and provide technical guidance on complex data engineering challenges