Sprinklr | Senior Lead | 5.5 YOE

Senior Lead Product Engineer · 5.5 years · Offer
December 8, 2025 · 77 reads

Summary

I interviewed for the Senior Lead Product Engineer role at Sprinklr. The process involved four rounds: coding, a technical discussion with system design, low-level design, and HR. I received an offer.

Full Experience

Round 1 – Coding

I was presented with two Data Structures and Algorithms questions during this round.

Round 2 – Technical Discussion + Design

This round began with a discussion about my current project. Following that, I was given a system design question to tackle.

Round 3 – LLD

In this round, I was asked to design a web crawler to specific requirements, and a working implementation was expected. The discussion also went deeper into parallelizing the crawler, ensuring observability, and handling failures robustly.

Round 4 – HR

The final round was a general HR discussion, where we talked about my career journey, my understanding of Sprinklr, and my motivations for wanting to join the company.

Verdict

I was selected for the position.

Interview Questions (4)

Q1
Minimum Number of Platforms Required
Data Structures & Algorithms

Given the arrival and departure times of all trains that arrive at a railway station, determine the minimum number of platforms required for the railway station so that no train waits.
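A standard approach is to sort arrivals and departures independently, then sweep with two pointers, incrementing a counter on each arrival and decrementing on each departure. A minimal sketch (the function name and sample times below are my own, not from the interview):

```python
def min_platforms(arrivals, departures):
    """Return the minimum number of platforms so no train waits.

    Sort both time lists, then sweep: an arrival before the next
    departure needs one more platform; a departure frees one up.
    """
    arr = sorted(arrivals)
    dep = sorted(departures)
    platforms = best = 0
    i = j = 0
    while i < len(arr):
        if arr[i] <= dep[j]:
            platforms += 1          # a train arrives before the earliest departure
            best = max(best, platforms)
            i += 1
        else:
            platforms -= 1          # a train departs, freeing a platform
            j += 1
    return best
```

With the classic example `min_platforms([900, 940, 950, 1100, 1500, 1800], [910, 1200, 1120, 1130, 1900, 2000])`, the sweep peaks at 3 platforms.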

Q2
Frog Position After T Seconds
Data Structures & Algorithms

Given an undirected tree of n nodes labeled from 1 to n, an integer t, and a target node, a frog starts at node 1. Each second, the frog jumps to a uniformly random adjacent node it has not yet visited; if it reaches a leaf, it stays there. What is the probability that the frog is on node target after exactly t seconds?
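One common way to solve this is a DFS from node 1 that splits the remaining probability evenly among a node's unvisited children. A hedged sketch (names are my own):

```python
from collections import defaultdict

def frog_position(n, edges, t, target):
    """Probability the frog sits on `target` after exactly t seconds."""
    graph = defaultdict(list)
    for a, b in edges:
        graph[a].append(b)
        graph[b].append(a)

    def dfs(node, parent, time, prob):
        children = [c for c in graph[node] if c != parent]
        if node == target:
            # The frog stays on target only if time has run out,
            # or target is a leaf it can never leave.
            return prob if (time == 0 or not children) else 0.0
        if time == 0 or not children:
            return 0.0
        # Split the probability evenly among the children and recurse.
        return sum(dfs(c, node, time - 1, prob / len(children))
                   for c in children)

    return dfs(1, 0, t, 1.0)
```

For example, on the tree `[[1,2],[1,3],[1,7],[2,4],[2,6],[3,5]]` with t = 2 and target = 4, the probability is 1/3 · 1/2 = 1/6.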

Q3
Design Live User Count for Streaming Platform
System Design

Design a system to efficiently track and display the live user count for a streaming platform similar to Hotstar.
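This is an open-ended design prompt, but one building block that typically comes up is a sharded counter, so millions of concurrent joins and leaves don't contend on a single key. Below is a toy in-memory sketch of the idea only; in a real system each shard would live in a store like Redis, with readers served from a periodically refreshed aggregate rather than summing shards on every request:

```python
import threading

class ShardedLiveCounter:
    """Toy sharded live-user counter.

    Writes hash the user onto one of N shards so hot streams don't
    serialize on a single lock/key; reads aggregate across shards.
    """
    def __init__(self, shards=16):
        self._shards = [0] * shards
        self._locks = [threading.Lock() for _ in range(shards)]

    def _shard(self, user_id):
        return hash(user_id) % len(self._shards)

    def join(self, user_id):
        i = self._shard(user_id)
        with self._locks[i]:
            self._shards[i] += 1

    def leave(self, user_id):
        i = self._shard(user_id)
        with self._locks[i]:
            self._shards[i] -= 1

    def live_count(self):
        # In production this sum would be precomputed on a schedule
        # and cached, since slightly stale counts are acceptable.
        return sum(self._shards)
```

The design trade-off is staleness for write throughput: viewers don't need an exact count, so aggregating shards every second or two is usually fine.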

Q4
Design an Error-Reporting Web Crawler
System Design

Design a web crawler that crawls a set of URLs and reports the ones that return errors. A working implementation was expected. Follow-up discussion covered parallelizing the crawler, ensuring observability, and handling various failure scenarios.
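A minimal single-process sketch of such a crawler using only the standard library, with a thread pool for the parallelization follow-up. The worker count, timeout, and error strings are illustrative assumptions; a production version would add retries, rate limiting, robots.txt handling, and metrics for observability:

```python
import concurrent.futures
import urllib.error
import urllib.request

def check_url(url, timeout=5):
    """Fetch one URL; return (url, error) with error=None on success."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return url, None
    except urllib.error.HTTPError as e:
        return url, f"HTTP {e.code}"          # server answered with an error status
    except (urllib.error.URLError, OSError) as e:
        return url, str(e)                    # DNS failure, timeout, refused, ...
    except ValueError as e:
        return url, str(e)                    # malformed URL

def crawl_for_errors(urls, workers=8):
    """Crawl URLs in parallel and return only the failing (url, error) pairs."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(check_url, urls))
    return [(url, err) for url, err in results if err is not None]
```

Threads suit this workload because it is I/O-bound; the follow-ups on observability and failure handling map naturally onto logging each `(url, error)` pair and treating timeouts, DNS failures, and HTTP error codes as distinct categories.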
