Sprinklr | Lead | 5.5 YOE | Selected
Summary
I recently interviewed for the Lead Product Engineer role at Sprinklr, which involved two coding rounds, a technical discussion with system design, a low-level design round, and a final HR interview. I was successfully selected for the position.
Full Experience
I recently went through the interview process for a Lead Product Engineer role at Sprinklr. The process consisted of four distinct rounds.
Round 1 – Coding
This round focused on Data Structures and Algorithms. I was given two problems to solve:
- Minimum Number of Platforms Required: a classic problem often found on platforms like GFG (see the sketch after this list).
- Frog Position After T Seconds: a LeetCode problem, which I solved.
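For reference, here is a minimal sketch of the standard approach to the platforms problem: sort arrivals and departures separately, then sweep with two pointers while tracking the peak overlap. The HHMM sample times below are the usual textbook example, not necessarily the exact input from my round.

```python
def min_platforms(arrivals, departures):
    """Minimum platforms needed so that no train waits (classic GFG problem).

    Sort arrivals and departures independently, then sweep with two pointers:
    an arrival before the earliest pending departure needs an extra platform;
    otherwise a departing train frees one up.
    """
    arrivals, departures = sorted(arrivals), sorted(departures)
    i = j = platforms = best = 0
    while i < len(arrivals):
        if arrivals[i] <= departures[j]:   # a train arrives before one leaves
            platforms += 1
            i += 1
        else:                              # a train has departed, reuse its platform
            platforms -= 1
            j += 1
        best = max(best, platforms)
    return best


# Example with times in HHMM format; the expected answer is 3.
print(min_platforms([900, 940, 950, 1100, 1500, 1800],
                    [910, 1200, 1120, 1130, 1900, 2000]))
```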
Round 2 – Technical Discussion + Design
This round involved a deep dive into my current projects, followed by a system design challenge. The system design question was to design a live user count feature for a streaming platform similar to Hotstar.
Round 3 – LLD
The third round was dedicated to Low-Level Design. I was tasked with designing a web crawler that identifies and outputs URLs returning errors. We discussed the basic architecture, concurrency considerations, and various failure-handling strategies.
Round 4 – HR
The final round was a standard HR discussion. We talked about my career journey, my understanding of Sprinklr as a company, and my motivation for joining them.
Verdict
I am pleased to say that I was selected for the Lead Product Engineer position.
Interview Questions (3)
Frog Position After T Seconds
Given an undirected tree and a frog starting at node 1, find the probability that the frog is on a target node after 't' seconds. Each second the frog jumps uniformly at random to an unvisited adjacent node, and it stays on its current node once there are no unvisited neighbours left.
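A sketch of the usual DFS solution to this problem (LeetCode 1377), assuming the tree is given as an edge list with nodes numbered from 1; the probability is split evenly among a node's unvisited children at each hop.

```python
from collections import defaultdict


def frog_position(n, edges, t, target):
    """Probability the frog sits on `target` after t seconds (LeetCode 1377).

    DFS from node 1, dividing the probability by the number of children at
    each hop. Landing on `target` counts only if time runs out there or the
    target is a leaf, since otherwise the frog must keep jumping.
    """
    graph = defaultdict(list)
    for a, b in edges:
        graph[a].append(b)
        graph[b].append(a)

    def dfs(node, parent, time_left, prob):
        children = [c for c in graph[node] if c != parent]
        if node == target:
            return prob if time_left == 0 or not children else 0.0
        if time_left == 0 or not children:
            return 0.0
        for child in children:
            p = dfs(child, node, time_left - 1, prob / len(children))
            if p > 0:          # only one path in a tree can reach the target
                return p
        return 0.0

    return dfs(1, 0, t, 1.0)


# Example from LeetCode: expected 1/6 ≈ 0.16666
print(frog_position(7, [[1, 2], [1, 3], [1, 7], [2, 4], [2, 6], [3, 5]], 2, 4))
```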
Design Live User Count for Streaming Platform
Design a system to accurately track and display the live user count for a streaming platform similar to Hotstar. Considerations include scalability, real-time updates, and handling a large number of concurrent users.
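This was a design discussion rather than a coding exercise, but a toy sketch helps pin down one common building block: clients send periodic heartbeats, and a user counts as live while their last heartbeat falls within a TTL window. The class name and the 30-second TTL below are my assumptions, not details from the interview; at Hotstar-like scale the state would be sharded (e.g. per stream) and approximated with structures like HyperLogLog or windowed counters rather than held in a single dict.

```python
import time

HEARTBEAT_TTL = 30  # seconds; assumed heartbeat window, not from the interview


class LiveUserCounter:
    """Toy, single-node sketch of heartbeat-based live-count tracking.

    Each connected client pings periodically; a user is "live" if their
    last heartbeat arrived within HEARTBEAT_TTL seconds.
    """

    def __init__(self):
        self._last_seen = {}  # user_id -> last heartbeat timestamp

    def heartbeat(self, user_id: str) -> None:
        self._last_seen[user_id] = time.time()

    def live_count(self) -> int:
        cutoff = time.time() - HEARTBEAT_TTL
        # Drop expired entries lazily while counting.
        self._last_seen = {u: ts for u, ts in self._last_seen.items() if ts >= cutoff}
        return len(self._last_seen)


counter = LiveUserCounter()
counter.heartbeat("user-1")
counter.heartbeat("user-2")
print(counter.live_count())  # 2 while both heartbeats are fresh
```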
Design a Web Crawler for Error URLs
Design a web crawler system that takes a set of seed URLs, crawls them, follows links, and identifies all URLs that return an error (e.g., 4xx or 5xx HTTP status codes). Discuss concurrency, error handling, and basic architecture.
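Again a design question, but a small concurrent sketch captures the core loop we discussed: pull a batch of URLs from a frontier, fetch them with a thread pool, record anything that answers 4xx/5xx (or fails outright), and push newly discovered links back onto the frontier. The function names, limits, and example seed URL are illustrative assumptions; a production crawler would add robots.txt handling, politeness delays, retries, and a distributed frontier/visited store.

```python
import concurrent.futures
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl_error_urls(seed_urls, max_pages=100, workers=8):
    """Crawl from the seeds and return {url: status} for 4xx/5xx responses."""
    visited, errors, frontier = set(), {}, list(seed_urls)

    def fetch(url):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                parser = LinkExtractor()
                parser.feed(resp.read().decode("utf-8", errors="ignore"))
                return url, resp.status, [urljoin(url, link) for link in parser.links]
        except urllib.error.HTTPError as e:   # server answered with 4xx/5xx
            return url, e.code, []
        except Exception:                     # timeouts, DNS failures, bad URLs
            return url, None, []

    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        while frontier and len(visited) < max_pages:
            batch, rest = [], []
            for url in frontier:
                if url in visited:
                    continue
                if len(batch) < workers:
                    visited.add(url)
                    batch.append(url)
                else:
                    rest.append(url)
            frontier = rest
            if not batch:
                break
            for url, status, links in pool.map(fetch, batch):
                if status is not None and status >= 400:
                    errors[url] = status
                frontier.extend(link for link in links if link not in visited)
    return errors


# Hypothetical usage; the seed set in the interview was not specified.
print(crawl_error_urls(["https://example.com"], max_pages=20))
```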