CrowdStrike Interview Experience
Summary
I interviewed at CrowdStrike for an undisclosed engineering role in London. The process involved three rounds: an initial screening with the hiring manager, a coding round, and a system design round. Despite solving the coding problem and providing a comprehensive design for the system design assignment, I was ultimately ghosted after initial positive feedback about a potential Engineer-3 position.
Full Experience
My interview process at CrowdStrike consisted of three main rounds. The first was an initial screening with the hiring manager, where we discussed my current work experience, microservices architectures, event-driven systems, and various scaling and error-handling scenarios. This round went well, and I was scheduled for the next one.
The second round was a coding interview. I was given a standard LeetCode problem: given an m x n 2D binary grid representing a map of '1's (land) and '0's (water), return the number of islands. I solved it with a Depth-First Search (DFS) approach, which led me to the final round.
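For reference, here is a minimal sketch of the kind of DFS flood-fill solution I described (reconstructed after the fact, not my exact interview code); it mutates the grid in place to mark visited land cells:

def num_islands(grid):
    """Count connected groups of '1' cells using an iterative DFS flood fill."""
    if not grid or not grid[0]:
        return 0
    rows, cols = len(grid), len(grid[0])
    count = 0

    def sink(r, c):
        # Flood-fill one island so its cells are not counted again.
        stack = [(r, c)]
        while stack:
            i, j = stack.pop()
            if 0 <= i < rows and 0 <= j < cols and grid[i][j] == '1':
                grid[i][j] = '0'  # mark visited in place
                stack.extend([(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)])

    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == '1':
                count += 1
                sink(r, c)
    return count

# Example: three separate islands.
print(num_islands([
    ["1", "1", "0", "0"],
    ["0", "0", "0", "1"],
    ["1", "0", "1", "1"],
]))  # -> 3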
The final round was a system design challenge, given as a take-home assignment: design a VirusTotal-like system. I prepared a holistic overview and then did a deep dive into the system, covering key components such as file uploads, hashing algorithms, blob storage solutions, database sharding strategies, scaling considerations, and asynchronous flows within an event-driven architecture.
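To make the asynchronous, event-driven part of that design concrete, here is a simplified sketch of the upload path. The in-memory dictionaries and queue are stand-ins for the blob store (e.g. S3), the sharded metadata database, and the message broker (e.g. Kafka); all names are illustrative rather than taken from my actual submission:

import hashlib
import queue

BLOB_STORE = {}             # sha256 -> raw bytes (object storage in practice)
SCAN_RESULTS = {}           # sha256 -> verdict (sharded DB in practice)
SCAN_QUEUE = queue.Queue()  # message broker topic in practice

def handle_upload(file_bytes: bytes) -> dict:
    """Synchronous upload path: hash, dedupe, persist, then enqueue a scan event."""
    digest = hashlib.sha256(file_bytes).hexdigest()

    # Dedup on the content hash: if this exact file was already scanned,
    # return the cached verdict instead of scanning it again.
    if digest in SCAN_RESULTS:
        return {"sha256": digest, "status": "done", "verdict": SCAN_RESULTS[digest]}

    BLOB_STORE[digest] = file_bytes
    SCAN_QUEUE.put({"event": "scan_requested", "sha256": digest})
    return {"sha256": digest, "status": "queued"}

def scan_worker():
    """Asynchronous consumer: pulls scan events and writes results back."""
    while not SCAN_QUEUE.empty():
        event = SCAN_QUEUE.get()
        blob = BLOB_STORE[event["sha256"]]
        # Placeholder for fanning the file out to real scanning engines.
        verdict = "malicious" if b"EICAR" in blob else "clean"
        SCAN_RESULTS[event["sha256"]] = verdict

print(handle_upload(b"hello world"))  # queued on first sight
scan_worker()
print(handle_upload(b"hello world"))  # cached verdict on re-upload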
After these rounds, the overall feedback I received was positive. The recruiter said they weren't considering me for a Senior Engineer role, but indicated that the team might instead consider me for an Engineer-3 position. However, despite multiple follow-ups on my part, I was eventually ghosted and did not receive an offer.
Interview Questions (2)
Given an m x n 2D binary grid which represents a map of '1's (land) and '0's (water), return the number of islands.
Design a system similar to VirusTotal. This involved considering aspects such as file uploads, hashing mechanisms, blob storage, database sharding, scaling, and implementing asynchronous flows within an event-driven architecture.