
> Like, how good of a signal can companies really be getting if it's too different from real world systems?

There are "layers" to the signal you can get from system design interviews:

1. Are they aware of the "big concepts": indexing, sharding, queues, scheduling, etc.?

2. Are they comfortable actually using and manipulating these abstractions in an academic sense? E.g. new college grads may have never used an index, but they can walk me through how we might use one to solve a given problem (see the sketch after this list).

3. Do they have experience operationalizing these concepts? e.g. in DB design questions I love it when candidates are able to walk through a zero downtime migration plan from before and after we add a feature.

4. Related to 3, how do they weigh tradeoffs in the system? Do they drill down into product requirements to help inform those tradeoffs?
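
For layer 2, the kind of walkthrough I mean looks roughly like the sketch below. It's a minimal, hypothetical example using SQLite (the table and column names are made up), just to show the "full scan vs. index lookup" reasoning a candidate should be able to articulate:

    import sqlite3

    # Hypothetical table: 100k orders, queries filter by user_id.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL)")
    conn.executemany("INSERT INTO orders (user_id, total) VALUES (?, ?)",
                     [(i % 1000, i * 0.5) for i in range(100_000)])

    # Without an index, the query plan is a full table scan.
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE user_id = 42").fetchall())

    # An index on user_id turns that into a B-tree lookup, trading write cost
    # and storage for much faster reads on this access pattern.
    conn.execute("CREATE INDEX idx_orders_user_id ON orders (user_id)")
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE user_id = 42").fetchall())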

A system design interview often lets me say: "They're not experienced enough at our scale for me to want to hire them at level X, but I'd hire them at level X-1"

All this said, I do think more companies should try to draw design interview questions directly from their own company experience, e.g. "imagine we didn't have feature X, how would you add it to the product?"




> imagine we didn't have feature X, how would you add it to the product?

I think pulling specific ideas from your actual system is full of traps. These are probably problems you've thought of - both actively and passively - for days, months, or even years.

The person you're interviewing has had no such benefit, and it might be very hard for the interviewer to separate what they think is reasonable from what is actually reasonable.


> I think pulling specific ideas from your actual system is full of traps. These are probably problems you've thought of - both actively and passively - for days, months, or even years.

The first time you give the interview, certainly, but you have time to formalize and tune it. Write down what you're going to ask, what signals you're looking for, and the quality of solution you expect. Socialize this with engineers who weren't directly involved in the design of the system and ask for their feedback.

You can usually iron out a question in 10-20 interviews and make it really stellar within 100. I know that's a lot of potentially wasted hours, but the ROI in terms of candidate quality is worth far more.


That part about "write down the question" and "what are you looking for" is important, yet most people ignore it and just wing it.

By having that rubric written down, it becomes easier to make objective choices (that can be backed up with HR). Additionally, the same question (with the same rubric) can be used by other interviewers for other candidates, and overall you would hope that a given candidate would get the same thumbs up / thumbs down evaluation no matter who asked the question or evaluated them.


"walk through a zero downtime migration plan from before and after we add a feature."

How does one approach this?


If you're asking about specific examples, this book covers a lot of common patterns I've seen over the years; maybe a good starting point:

https://martinfowler.com/books/refactoringDatabases.html

https://databaserefactoring.com/
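
For a concrete flavor of what those patterns look like, here's a rough sketch of the usual expand/contract (a.k.a. parallel change) sequence, assuming a hypothetical users table where full_name gets split into first_name/last_name. The SQL and names are illustrative, not from any particular system, and each step ships and is verified before the next:

    # Step 1 (expand): add new nullable columns; existing readers/writers are unaffected.
    EXPAND = [
        "ALTER TABLE users ADD COLUMN first_name TEXT",
        "ALTER TABLE users ADD COLUMN last_name TEXT",
    ]

    # Step 2: deploy app code that dual-writes the old and new columns,
    # while still reading from the old column.

    # Step 3 (backfill): copy existing data in small batches to avoid long locks.
    BACKFILL = (
        "UPDATE users"
        "   SET first_name = split_part(full_name, ' ', 1),"
        "       last_name  = split_part(full_name, ' ', 2)"
        " WHERE first_name IS NULL AND id BETWEEN %(lo)s AND %(hi)s"
    )

    # Step 4: deploy app code that reads the new columns; keep dual-writing
    # until verification shows the new path is correct.

    # Step 5 (contract): once nothing reads or writes full_name, drop it.
    CONTRACT = ["ALTER TABLE users DROP COLUMN full_name"]

The key property is that every intermediate state works for both the old and new versions of the application, which is what makes each deploy zero downtime.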



