Database Engineer – Enterprise Data & AI Architecture at Utility Energy Services
Chicago, IL 60607, USA
Full Time


Start Date

Immediate

Expiry Date

07 Nov, 25

Salary

$120,000 per year

Posted On

08 Aug, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, SQL, Platforms, Data Analytics, IT, User Experience, Power BI, Database Design, Azure, Excel, Data Architecture, Computer Science, Salesforce, Firewalls, Trend Analysis, Synchronization, Ownership

Industry

Information Technology/IT

Description

ABOUT UES

Utility Energy Services (UES) provides energy efficiency program implementation services to utilities across the U.S. We support the design, execution, and optimization of utility energy efficiency programs by leveraging data-driven insights and industry best practices.

POSITION OVERVIEW

UES is seeking a foundational Database Engineer to lead the architecture and implementation of our enterprise data ecosystem. This is an individual contributor role on the Technology Solutions team, focused on designing and building modern, scalable, and secure data systems that power UES’s internal operations and client-facing energy programs.
The ideal candidate is a Microsoft Azure expert, AI/LLM innovator, and self-starting systems thinker who thrives on solving complex data challenges. You’ll be the go-to advisor for data architecture, security, and AI strategy across the enterprise, working directly with the Technology Solutions Manager to shape long-term technology policies and frameworks.

QUALIFICATIONS & EXPERIENCE

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, Data Analytics, or a related technical field required
  • 5+ years of experience in a technical data role such as Data Engineer, Data Architect, or Solutions Engineer, ideally in a cross-functional or enterprise environment
  • Strong proficiency in Microsoft Azure data services, including Azure SQL, Data Factory, Data Lake, and Blob Storage
  • Proven experience designing and managing ETL/ELT pipelines and API integrations across platforms such as Salesforce, Microsoft tools, and client-facing systems
  • Deep understanding of relational database design, data normalization, and scalable schema architecture
  • Demonstrated ability to implement data security frameworks, including SSO, role-based access, firewalls, and secure authentication practices
  • Hands-on experience working with Salesforce data architecture, including API access, schema alignment, and synchronization with external systems
  • Experience designing and deploying custom AI and LLM-powered tools over enterprise datasets (e.g., Power BI, Excel, SQL) to support natural language querying, automated QA/QC, and trend analysis
  • Ability to translate complex technical concepts into actionable insights for business, program, and leadership stakeholders
  • Proven ability to work as an independent contributor who operates with autonomy, ownership, and a strategic mindset
  • Strong communication and documentation skills, with a focus on building scalable, transferable systems across departments

Job Type: Full-time

Pay: $110,000.00 - $120,000.00 per year

Benefits:

  • 401(k)
  • 401(k) matching
  • Dental insurance
  • Health insurance
  • Paid time off
  • Vision insurance

Application Question(s):

  • Describe one way you’ve built or used AI or LLM-based tools to improve a data process or user experience. What problem did it solve, and how did it change the way people interacted with the data?
  • Have you ever built a data system or tool that was handed off to another team (like analysts or program staff)? How did you ensure it was easy for them to adopt, use, and maintain?
  • Briefly describe a data system you’ve built where security or access control was a key requirement. What steps did you take to secure the data (e.g., permissions, SSO, firewalls, etc.)?
  • You are tasked with connecting a utility client’s CIS database to Azure for long-term program reporting. How would you approach the integration? Include API considerations, security options, how you’d clean and structure the data, and what you’d consider when designing long-term storage.
  • Describe an ETL or ELT process you’ve built from scratch. What tools did you use, how did you ensure quality and reusability, and how did the design support downstream reporting or analytics?

Education:

  • Bachelor’s (Required)

Experience:

  • Data Engineering: 5 years (Required)
  • Data governance: 1 year (Required)

Ability to Commute:

  • Chicago, IL 60607 (Required)

Work Location: Hybrid remote in Chicago, IL 60607

Responsibilities
  • Design and implement foundational enterprise data architecture in Microsoft Azure to support both internal operations and utility client-facing programs
  • Lead the development of structured, scalable databases and data models that enable consistent, high-quality reporting across a portfolio of energy efficiency programs
  • Develop and manage end-to-end ETL/ELT pipelines and API-based integrations to support secure, automated data exchange between Salesforce, Microsoft Azure services, utility client CIS systems, and internal tools
  • Establish and maintain robust data quality assurance, validation, monitoring, and security protocols across enterprise databases to ensure reliability, accuracy, and compliance
  • Collaborate with program-level business intelligence analysts to transition database ownership and documentation, enabling effective use of Power BI and other reporting tools for client deliverables
  • Design and deploy custom AI and LLM-powered solutions that allow internal teams to explore data using natural language, identify trends and anomalies, and automate QA processes
  • Serve as a strategic partner to the Technology Solutions Manager and leadership team in shaping data and AI strategy, architecture standards, and long-term technology planning
  • Operate as a highly autonomous individual contributor, capable of working across teams and translating complex technical concepts into clear, actionable insights for both technical and non-technical stakeholders
  • Contribute to the development of enterprise data governance practices, including system documentation, access control, and scalable solution design