Guidewire Data Architect at GUIDEWIRE SOFTWARE INC
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

18 Jun, 26

Salary

0.0

Posted On

20 Mar, 26

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Architect, ETL Architecture, Data Platform, Ingestion Pipelines, Data Transformation, Apache Spark, Scala, PySpark, P&C Insurance Domain, DataBridge, Agile, Cloud Platforms, CDC Frameworks, Design Review, Leadership, AI Application

Industry

Insurance

Description
As a Data Architect on the Data and Analytics team at Guidewire, you will work in a leadership capacity, collaborating with our customers and SI partners who are adopting the Guidewire Data Platform as the centerpiece of their data foundation. You will drive consensus among all project participants to realize the ETL architecture, and will design, implement, tune, and maintain ingestion pipelines in support of our customers' broader data platform and analytics goals, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives, applying best practices for the design, development, and delivery of customer projects, and sharing knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes. One of our principles is to have fun while we deliver, so this role will keep the delivery process fun and engaging for the team in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated, take proactive action for the benefit of our customers, and ensure that they succeed in their journey to the Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities:

- Design and propose high-level design diagrams for the overall system architecture.
- Lead inception sessions to build consensus on project scope and generate user stories with acceptance criteria.
- Analyze complex data structures in legacy source systems and help business and data analysts map them to the Guidewire InsuranceSuite data model.
- Design, implement, tune, and maintain DataBridge-based ingestion pipelines as part of broader data platform and analytics programs.
- Size clusters, tune Spark jobs, and optimize ingestion to meet customer throughput, latency, and bulk-load SLAs.
- Manage the end-to-end data loading process, including pre-load validation, execution of load jobs, and post-load verification.
- Work with the PMO, providing domain expertise to achieve successful project outcomes.
- Lead efforts to harvest project collateral and build it out to enable repeatability.
- Create new tooling to streamline data processing when called upon or when the opportunity presents itself.
- Bring a systematic problem-solving approach, coupled with a sense of ownership and drive.
- Work independently in a fast-paced Agile environment.

At Guidewire, we foster a culture of curiosity, innovation, and responsible use of AI, empowering our teams to continuously leverage emerging technologies and data-driven insights to enhance productivity and outcomes.

Qualifications:

- 10+ years in a technical capacity building out complex ETL and data integration frameworks.
- Hands-on experience with Apache Spark (DataFrames, partitioning, performance tuning) and at least one of Scala or PySpark.
- Deep understanding of the P&C insurance domain, including concepts such as policies, claims, billing, and underwriting.
- 5+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts.
- At least 3 years of experience leading design review sessions to build consensus among diverse groups of technical and business participants.
- At least 3 years in a leadership capacity delivering complex data transformation initiatives, including mentoring and team building.
- Experience working on Guidewire Data Migration/Data Platform projects.
- Experience integrating data pipelines with enterprise schedulers and notification channels to support production SLAs.
- Experience with different cloud platforms (such as AWS, Azure, Snowflake, and Google Cloud).
- Experience working with customer teams to understand business objectives and functional requirements.
- Familiarity with Change Data Capture (CDC) frameworks.
- Ability to work independently and within a team.
- Demonstrated ability to embrace AI and data-driven insights in your current role to drive innovation, productivity, and continuous improvement.

Nice to have:

- Insurance industry experience.
- Experience with the Guidewire Data Platform.
- Large-scale data migration experience, including successful migrations from mainframe or other complex legacy systems.
- Proven track record in data engineering and big-data platform roles, with at least 2 years working with Spark-based ingestion pipelines on cloud platforms.

About Guidewire

Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently. We combine digital, core, analytics, and AI to deliver our platform as a cloud service. More than 540 insurers in 40 countries, from new ventures to the largest and most complex in the world, run on Guidewire. As a partner to our customers, we continually evolve to enable their success. We are proud of our unparalleled implementation track record, with more than 1,600 successful projects supported by the largest R&D team and partner ecosystem in the industry. Our Marketplace provides hundreds of applications that accelerate integration, localization, and innovation. For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC.

Guidewire Software, Inc. is proud to be an equal opportunity and affirmative action employer. We are committed to an inclusive workplace and believe that a diversity of perspectives, abilities, and cultures is key to our success.
Qualified applicants will receive consideration without regard to race, color, ancestry, religion, sex, national origin, citizenship, marital status, age, sexual orientation, gender identity, gender expression, veteran status, or disability. All offers are contingent upon passing a criminal history and other background checks where applicable to the position.

We're an extraordinary blend of hungry self-starters, intrepid explorers, brainy experts, and loyal allies. Combined, we make a glorious success story, loaded with down-to-earth, helpful, and passionate people, all on the journey of cloud innovation and best-in-class technology. Guidewire is an adventure, and it's yours for the taking.

At Guidewire, we are utterly committed to customer success. We combine digital, core, analytics, and AI to deliver our platform as a cloud service to the P&C insurance industry. And with the largest R&D team, services team, and partner ecosystem in the industry, we continually evolve and innovate to meet our customers' needs. We put our values of Integrity, Rationality, and Collegiality first, harboring a culture of honesty and openness that our people never want to lose. And we each bring a little quirkiness, and a little genius, to the table. As the landscape of our industry continues to shift, we respond with flexibility and skill. We're braving uncharted territory, pushing past the conventional with our products, partners, and people.

How To Apply:

If you would like to apply to this job directly from the source, please click here

Responsibilities
The Data Architect works in a leadership capacity, collaborating with customers and partners to implement the Guidewire Data Platform, with a focus on designing, implementing, tuning, and maintaining ETL ingestion pipelines according to best practices. Key duties include designing high-level system architecture, leading consensus-building sessions, analyzing legacy data structures, and optimizing Spark jobs to meet SLAs.