Data Engineer at Flow Control Group
Charlotte, NC 28217, USA
Full Time


Start Date

Immediate

Expiry Date

12 Nov, 25

Salary

Not specified

Posted On

12 Aug, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Modeling, Python, Data Analysis, Deliverables, Data Engineering, Usability, Snowflake, Data Governance, Data Products, Data Standards, SQL, Communication Skills, Traceability, Technical Requirements, Talend, Analytical Skills, Data Warehousing

Industry

Information Technology/IT

Description

ABOUT US:

Flow Control Group (FCG) is a leading provider of fluid handling, process, and industrial automation solutions across North America. We are a 100% employee-owned organization made up of over 2,000 team members and 95+ entrepreneurial brands—each empowered to think big, move fast, and bring innovative ideas to life. Our ownership mindset fuels a culture of pride, accountability, and exceptional customer service.
At FCG, we believe in the power of partnership and entrepreneurship. We work collaboratively across our brands to drive growth, unlock new opportunities, and deliver real impact for our customers. This unique model allows us to combine local expertise with national strength, creating a dynamic environment where creativity meets practicality.
Visit our website: https://flowcontrolgroup.com/

QUALIFICATIONS:

  • Experience:
  • 2+ years of experience developing and scaling dbt projects.
  • 3-5 years of experience as a data engineer with tools such as SSIS, Informatica, or Talend.
  • 2+ years of experience with Snowflake.
  • Proven experience designing and implementing dbt transformations in large-scale enterprise analytics environments.
  • Experience working with data pipeline tools like Fivetran or CData.
  • Proven experience designing and managing security requirements.
  • Experience in supporting data applications.
  • Minimum of 5 years working with cross-functional teams and managing multiple projects and priorities.
  • Additional Technical Skills:
  • Expert in SQL and data analysis.
  • Strong understanding of data integrations, data warehousing, data modeling, and data engineering.
  • Familiarity with Azure and cloud architectures.
  • Experience with Python and AI preferred.
  • Expert at creating architecture diagrams and documenting requirements and design details.
  • Required Analytical Skills:
  • Excellent problem-solving skills and attention to detail.
  • Ability to analyze complex business processes and translate them into technical requirements.
  • Evaluate performance, traceability, and usability in all designs.
  • Expertise implementing data warehouses and marts that align with business objectives.
  • Ability to analyze data from disparate sources to identify gaps in data or definitions.
  • Ability to effectively test data products and analyze data, looking for patterns and anomalies that impact deliverables.
  • Ensure alignment with data standards and data governance.
  • Communication Skills:
  • Strong verbal and written communication skills.
  • Ability to effectively communicate with technical and non-technical stakeholders.
Responsibilities
  • Technical Expertise: Provide technical expertise in all capabilities of dbt.
  • Development: Design, build, and manage data transformations in dbt. Create reusable components, templates, and proof-of-concept solutions. Develop and manage data pipelines as needed. Implement functionality in Snowflake.
  • Source to Target Mappings: Work with data team members to analyze source data and identify transformation requirements to move data to our data warehouse and data marts.
  • Data Integration: Design robust, reliable, scalable, and performant systems for large volume data processing. Help drive decisions on data architecture and engineering best practices.
  • Data Observability: Develop and enhance monitoring of workflows and processes.
  • Solution Design: Collaborate with technical teams to design and implement solutions that meet business requirements and align with best practices.
  • Security: Design security standards and ensure proper implementation of security.
  • Performance: Ensure our data warehouse processing is optimized for performance. Collaborate with team members to ensure data loads and transformations flow through the system to deliver data according to requirements.
  • Support: Provide comprehensive technical support for data warehouse processing.
  • Testing and Validation: Develop testing frameworks for solutions to ensure that solutions meet requirements and perform as expected.
  • Metadata: Work with business users, technical teams, and data stewards to create documentation and metadata to support understanding of technical data warehouse infrastructure.
  • Continuous Improvement: Identify opportunities for process improvements and system enhancements to increase efficiency and effectiveness.