(USA) Senior, Data Engineer

at Walmart

Bentonville, AR 72712, USA

Start Date: Immediate
Expiry Date: 05 Jul, 2024
Salary: Not Specified
Posted On: 06 Apr, 2024
Experience: 1 year(s) or above
Skills: Data Flow, Data Architecture, ETL Tools, Business Requirements, SQL, Computer Science, Transformation, Business Intelligence, Data Systems, Design, Business Analytics, Data Engineering
Telecommute: No
Sponsor Visa: No
Required Visa Status:
Citizen, GC, US Citizen, Student Visa, H1B, CPT, OPT, H4 Spouse of H1B, GC Green Card

Employment Type:
Full Time, Part Time, Permanent, Independent - 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp

Description:

WHAT YOU’LL BRING:

  • Advanced working knowledge of SQL, including query authoring, and experience with relational databases such as BigQuery, as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large, disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 6 to 10 years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including BigQuery and Cassandra.
  • Experience with data pipeline and workflow management tools: Airflow, etc. (an illustrative sketch follows this list).
  • Experience with GCP cloud services: GCS, Dataproc, Dataplex, etc.
  • Experience with stream-processing systems: Spark Streaming, Storm, etc.
  • Experience with object-oriented/functional scripting languages: Python, Scala.
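The list above names the core pipeline and GCP tooling for this role. As a purely illustrative example of how those pieces commonly fit together, the sketch below shows a minimal Airflow 2.x DAG that loads Parquet files from GCS into BigQuery; the DAG id, bucket, dataset, and table names are hypothetical placeholders, not values specified by this posting.

```python
# Illustrative only: a minimal Airflow 2.x DAG that loads Parquet files from a
# GCS landing bucket into a BigQuery table. The DAG id, bucket, dataset, and
# table names are placeholders, not values specified by this posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="finance_daily_load",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the current run date's partition of Parquet files into BigQuery,
    # replacing the table contents on each run.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bq",
        bucket="example-landing-bucket",
        source_objects=["orders/{{ ds }}/*.parquet"],
        destination_project_dataset_table="example_project.finance.orders",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )
```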

ABOUT WALMART GLOBAL TECH

Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That’s what we do at Walmart Global Tech. We’re a team of software engineers, data scientists, cybersecurity experts and service professionals within the world’s leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.
Flexible, hybrid work:
We use a hybrid way of working that is primarily in-office, coupled with virtual work when not onsite. Our campuses serve as a hub to enhance collaboration, bring us together for purpose and deliver on business needs. This approach helps us make quicker decisions, remove location barriers across our global team and be more flexible in our personal lives.

EQUAL OPPORTUNITY EMPLOYER:

Walmart, Inc. is an Equal Opportunity Employer - By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing diversity - unique styles, experiences, identities, ideas and opinions - while being inclusive of all people.
The above information has been designed to indicate the general nature and level of work performed in the role. It is not designed to contain or be interpreted as a comprehensive inventory of all responsibilities and qualifications required of employees assigned to this job. The full Job Description can be made available as part of the hiring process.

MINIMUM QUALIFICATIONS…

Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
Option 1: Bachelor’s degree in Computer Science and 3 years’ experience in software engineering or related field.
Option 2: 5 years’ experience in software engineering or related field.
Option 3: Master’s degree in Computer Science and 1 year’s experience in software engineering or related field.
2 years’ experience in data engineering, database engineering, business intelligence, or business analytics.

PREFERRED QUALIFICATIONS…

Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Data engineering, database engineering, business intelligence, or business analytics; ETL tools and working with large data sets in the cloud; Master’s degree in Computer Science or related field and 3 years’ experience in software engineering.

Responsibilities:

WHAT YOU’LL DO:

We are looking for a Senior Data Engineer to join our growing team of engineering experts in Finance Data Factory. The hire will be responsible for building, expanding, and optimizing our data and data pipeline architecture, including helping to build the new FDLH platform, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building new data platforms from the ground up. The Senior Data Engineer will support our engineers, database architects and data analysts on data initiatives and will ensure that the data delivery architecture remains consistent across FDLH. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of re-designing our data architecture to support our next generation of products and data initiatives.

  • Create and maintain optimal data platform and pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP ‘big data’ technologies (an illustrative sketch follows this list).
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and GCP regions.
  • Create data tools for analytics team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
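To illustrate the kind of extraction and transformation work described in the responsibilities above, here is a minimal PySpark sketch of the sort that might run on Dataproc: it reads raw JSON from GCS, applies a simple cleanup, and writes the result to BigQuery through the Spark-BigQuery connector. Every bucket, project, dataset, and column name here is a hypothetical placeholder.

```python
# Illustrative only: a small PySpark job of the kind that might run on Dataproc.
# It reads raw JSON from GCS, applies a simple cleanup, and writes the result to
# BigQuery via the Spark-BigQuery connector (available on Dataproc clusters).
# Every bucket, project, dataset, and column name here is a placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-orders-transform").getOrCreate()

# Read one day of raw events from a hypothetical landing bucket.
raw = spark.read.json("gs://example-landing-bucket/orders/2024-04-06/")

# Basic cleanup: deduplicate on the order key and derive an order total.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_total", F.col("quantity") * F.col("unit_price"))
)

# Write to BigQuery; the connector stages files in a temporary GCS bucket
# before loading them into the target table.
(
    orders.write.format("bigquery")
          .option("temporaryGcsBucket", "example-staging-bucket")
          .mode("overwrite")
          .save("example_project.finance.orders_clean")
)
```

In this sketch the connector uses an indirect write: intermediate files are staged in the temporary GCS bucket and then loaded into BigQuery, which keeps the Spark job itself simple and stateless.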


REQUIREMENT SUMMARY

Min: 1.0, Max: 10.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Software Engineering

Graduate

Software engineering or related field

Proficient

1

Bentonville, AR 72712, USA