Expedia || SDE-2 || Gurgaon
Summary
I interviewed for an SDE-2 position at Expedia in Gurgaon. The process involved an Online Assessment, a project deep dive, a DSA round, and a system design round. I discussed my past work in depth, solved challenging algorithmic problems, and worked through a system design question.
Full Experience
Round 1 – Online Assessment
This round consisted of 3 coding questions to be solved within 90 minutes. The problems covered Sorting, Sliding Window techniques, and one that could be approached with either Dynamic Programming or a Greedy strategy.
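The exact OA problems aren't specified, but as an illustration of the sliding window technique mentioned above, here is a minimal sketch of the classic fixed-size window pattern (maximum sum of any subarray of length `k`; the function name and problem are my own example, not the actual assessment question):

```python
def max_sum_subarray(nums, k):
    """Maximum sum over all contiguous subarrays of length k."""
    # Seed the window with the first k elements.
    window = sum(nums[:k])
    best = window
    for i in range(k, len(nums)):
        # Slide right: add the entering element, drop the leaving one.
        window += nums[i] - nums[i - k]
        best = max(best, window)
    return best

print(max_sum_subarray([1, 2, 3, 4, 5], 2))  # -> 9
```

The point of the pattern is that each slide is O(1), so the whole scan is O(n) instead of the O(n·k) of recomputing every window.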
Round 2 – Project Deep Dive
In this round, the discussion focused heavily on my past projects. We went into detail about various aspects of my work, exploring both the technical challenges and solutions. Beyond the technical aspects, there were also several managerial questions. These included how I handle conflict, my leadership experiences, and the most challenging production issues I've successfully resolved throughout my career.
Round 3 – Data Structures & Algorithms (HackerRank)
This was a coding round conducted on HackerRank. I was presented with two questions:
- One question involved determining the minimum operations required to make all elements in an array unique while simultaneously minimizing their total sum. An operation consisted of incrementing an element by 1.
- The second question was a Monotonic Stack problem; the interviewer mentioned it was of Medium-Hard difficulty.
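The specific monotonic stack question wasn't shared, but the canonical instance of the pattern is Next Greater Element, sketched below (my illustrative example, not necessarily the interview problem):

```python
def next_greater(nums):
    """For each element, the next element to its right that is strictly
    greater, or -1 if none exists."""
    result = [-1] * len(nums)
    stack = []  # indices whose next-greater element is not yet found;
                # values at these indices are in decreasing order
    for i, x in enumerate(nums):
        # x resolves every smaller element still waiting on the stack.
        while stack and nums[stack[-1]] < x:
            result[stack.pop()] = x
        stack.append(i)
    return result

print(next_greater([2, 1, 2, 4, 3]))  # -> [4, 2, 4, -1, -1]
```

Each index is pushed and popped at most once, giving O(n) overall, which is the property interviewers usually probe for in these problems.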
Round 4 – High-Level Design (HLD)
The final round was centered on System Design. I was asked to design a scalable web crawler, detailing its architecture, its components, and strategies for fault tolerance, politeness policies towards websites, efficient data deduplication, and high overall performance.
Interview Questions (2)
I was presented with a problem where I needed to find the minimum number of operations to make all elements in an array unique. An operation consists of incrementing an element by 1. The goal was to achieve uniqueness while also minimizing the total sum of the elements after these operations.
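A standard greedy solution to this problem (this is my own sketch of the well-known approach, assuming non-negative integers; the interviewer may have expected a different variant): sort the array, then walk it left to right, lifting each element to at least one more than the previous element. Sorting makes the smallest elements claim the smallest values, which minimizes both the operation count and the final sum.

```python
def min_increments_for_unique(nums):
    """Minimum number of +1 operations to make all elements unique."""
    ops = 0
    prev = -1  # last value placed; assumes non-negative inputs
    for x in sorted(nums):
        if x <= prev:
            # Lift x to prev + 1; the gap is exactly the ops spent.
            ops += prev + 1 - x
            prev += 1
        else:
            prev = x
    return ops

print(min_increments_for_unique([3, 2, 1, 2, 1, 7]))  # -> 6
```

The runtime is O(n log n) from the sort; the resulting array ([1, 2, 3, 4, 5, 7] in the example) also has the minimum possible total sum, satisfying both requirements at once.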
The task was to design a scalable web crawler. This involved discussing the high-level architecture, key components like a URL frontier, DNS resolver, page fetcher, parser, and storage. I also had to explain how to address various challenges such as implementing politeness policies, handling duplicate URLs, ensuring fault tolerance, and achieving high performance for large-scale crawling.
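To make the frontier discussion concrete, here is a toy sketch of a URL frontier that combines two of the concerns above: politeness (a minimum delay between fetches to the same domain) and URL deduplication (a seen-set; a real crawler at scale would use a Bloom filter or a persistent store instead). The class and method names are illustrative, not part of any stated design:

```python
import time
from collections import deque
from urllib.parse import urlparse

class Frontier:
    """Toy URL frontier: per-domain FIFO queues, a politeness delay,
    and an in-memory seen-set for deduplication."""

    def __init__(self, delay=1.0):
        self.delay = delay       # min seconds between hits to one domain
        self.queues = {}         # domain -> deque of pending URLs
        self.last_fetch = {}     # domain -> monotonic time of last fetch
        self.seen = set()        # URLs already enqueued (dedup)

    def add(self, url):
        """Enqueue url unless it was seen before; returns True if added."""
        if url in self.seen:
            return False
        self.seen.add(url)
        domain = urlparse(url).netloc
        self.queues.setdefault(domain, deque()).append(url)
        return True

    def next_url(self):
        """Return a URL from some domain whose politeness delay has
        elapsed, or None if no domain is currently eligible."""
        now = time.monotonic()
        for domain, q in self.queues.items():
            if q and now - self.last_fetch.get(domain, 0.0) >= self.delay:
                self.last_fetch[domain] = now
                return q.popleft()
        return None
```

In a production design each of these pieces becomes its own component: the seen-set becomes a sharded Bloom filter, the per-domain queues become the back queues of a Mercator-style frontier, and fetch timestamps feed a priority heap so `next_url` doesn't scan every domain.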