Ixigo Interview Experience | Round 1 | SDE 2 | Rejected
Summary
I interviewed for an SDE 2 backend role at Ixigo; the round covered system‑design and data‑migration questions and I was ultimately rejected.
Full Experience
Hi Leetcoders, here is my interview experience with Ixigo for the SDE 2 - Backend role.
One thing to note: HR had explicitly mentioned that the first round would be DSA, but the interviewer asked only Redis- and Kafka-related production scenarios.
Q1. Design a race-condition-safe user signup system
You need to design a signup flow where:
- Phone number is mandatory and must be unique
- Multiple concurrent requests for the same phone number can hit different servers/pods
- System should not create duplicate users
Interviewer agreed solution: Redis atomic lock
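In production this would be a Redis `SET lock_phone_<number> <request_id> NX EX <ttl>` call: only the one request that wins the atomic set-if-not-exists proceeds to create the user, regardless of which pod it lands on. The sketch below simulates those semantics in-process (a mutex-guarded dict stands in for Redis, and `FakeRedis`/`signup` are hypothetical names for illustration), so the race-safe pattern can be seen without a Redis server:

```python
import threading

class FakeRedis:
    """In-process stand-in for Redis `SET key value NX` (illustrative only)."""
    def __init__(self):
        self._store = {}
        self._mutex = threading.Lock()

    def set_nx(self, key, value):
        # Atomic "set if not exists" -- the core of the Redis lock.
        with self._mutex:
            if key in self._store:
                return False
            self._store[key] = value
            return True

redis = FakeRedis()
users = []
users_mutex = threading.Lock()

def signup(phone, request_id):
    # Step 1: try to claim the phone number atomically.
    if not redis.set_nx(f"lock_phone_{phone}", request_id):
        return "duplicate"  # another request/pod already owns this phone
    # Step 2: the winner alone reaches user creation.
    with users_mutex:
        users.append(phone)
    return "created"

# Fire concurrent signups for the same phone, as if from many pods.
results = []
threads = [threading.Thread(target=lambda i=i: results.append(signup("9999999999", i)))
           for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results.count("created"), len(users))  # exactly one user is created
```

With real Redis you would also set a TTL (`EX`) on the lock so a crashed pod cannot block the phone number forever.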
Q2. Efficient Redis access with prefix
You have:
- Millions of keys in Redis
- Only a prefix of keys is known (e.g., lock_phone_xxxxxxxx)
- Need to fetch ~5000 matching keys efficiently
Note: Avoid expensive operations like full scans or regex
Interviewer agreed solution: Redis Sorted Set
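The idea behind the Sorted Set answer: store all the keys as members of one sorted set with score 0, then fetch a prefix slice with `ZRANGEBYLEX keyspace [lock_phone_ (lock_phone_\xff LIMIT 0 5000`, which is O(log N + M) rather than a full `SCAN` over millions of keys. The sketch below reproduces the lexicographic-range idea in-process with a sorted list and `bisect` (the `SortedKeySet` class is a hypothetical stand-in, not a Redis API):

```python
import bisect

class SortedKeySet:
    """In-process model of a Redis Sorted Set queried by lexicographic range."""
    def __init__(self, members):
        self._members = sorted(members)

    def range_by_prefix(self, prefix, limit=5000):
        # Lower bound: first member >= prefix (like `[prefix` in ZRANGEBYLEX).
        lo = bisect.bisect_left(self._members, prefix)
        # Upper bound: first member past the prefix range.
        hi = bisect.bisect_left(self._members, prefix + "\xff")
        return self._members[lo:min(hi, lo + limit)]

keys = SortedKeySet(
    [f"lock_phone_{n:08d}" for n in range(10_000)]
    + [f"lock_email_{n:04d}" for n in range(1_000)]
)
matches = keys.range_by_prefix("lock_phone_", limit=5000)
print(len(matches))  # 5000 matching keys, no full scan or regex
```

The `LIMIT` clause in the real command is what caps the result at ~5000 keys per call; you can page through the rest by starting the next range after the last member returned.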
Q3. Data migration
System B uses multiple MySQL tables. System A uses a single Mongo collection. Migrate System B's data to System A
You need to:
- Migrate existing (historical) data
- Handle incoming live traffic simultaneously
- Ensure no data loss or duplication
- Maintain backward compatibility during migration
Interviewer agreed solution: Event stream DB logs using Debezium to Kafka topics, use Kafka idempotency to avoid duplicates.
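In that pipeline, Debezium tails the MySQL binlog and publishes change events to Kafka; because Kafka delivery is effectively at-least-once on the consumer side, the Mongo writer must be idempotent, typically by upserting on the MySQL primary key so replays and duplicates cause no double-writes. The sketch below models only the consumer half: a dict stands in for System A's Mongo collection, and the event shape (`op`, `pk`, `after`) is an assumption for illustration, not Debezium's exact schema:

```python
# Stand-in for System A's single Mongo collection: _id -> document.
mongo_collection = {}

def apply_event(event):
    """Idempotently apply one change event, keyed by the MySQL primary key."""
    doc_id = event["pk"]
    if event["op"] == "delete":
        mongo_collection.pop(doc_id, None)
    else:
        # "insert" and "update" both become an upsert, so a replayed
        # or duplicated event converges to the same final document.
        mongo_collection[doc_id] = {**mongo_collection.get(doc_id, {}),
                                    **event["after"], "_id": doc_id}

events = [
    {"op": "insert", "pk": 1, "after": {"name": "alice"}},
    {"op": "insert", "pk": 1, "after": {"name": "alice"}},    # duplicate delivery
    {"op": "update", "pk": 1, "after": {"name": "alice-v2"}},
    {"op": "update", "pk": 1, "after": {"name": "alice-v2"}}, # replayed
]
for e in events:
    apply_event(e)

print(len(mongo_collection), mongo_collection[1]["name"])  # one doc, latest state
```

Historical data is handled the same way: Debezium's initial snapshot emits every existing row as an insert event on the same topics, so backfill and live traffic flow through one idempotent path with no loss or duplication.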
Although I answered all the questions, my answers were not exactly what the interviewer was looking for.
Result: Rejected
~Anonymous