Triplebyte expands its recruiting platform beyond YC, signs up Apple, Facebook (techcrunch.com)
116 points by artsandsci on March 9, 2017 | 146 comments



Having interviewed with TripleByte, I think this is a bold and poor decision.

Upfront, I didn't pass a TripleByte interview I had (one of the few companies I haven't passed).

My interviewer showed up late initially, then took a break and came back 10 minutes late from it. Further, the interviewer nitpicked super irrelevant details and acted exceedingly smug and condescending. Some of the stuff he told me I was wrong about was related to my research. Even after attempting to explain it several times, he just said, "No, you're wrong, you don't know what you are talking about."

I then literally brought up the paper and sent it to him, before he said something along the lines of... Oh, well I guess that is right.

Overall, it was one of the worst interview experiences I have had, and I don't believe they are a good way to recruit. Hell, I even passed all their coding questions with flying colors. It was the silly video conferencing interview with a smug engineer that really made the interview fall apart.


I had an interesting experience with triplebyte which wasn't as objectively bad as yours, but it also makes me skeptical of the company.

First round was multiple-choice questions, relatively straightforward. Second round was a Skype call and just felt incredibly subjective. I was asked questions around building out memcached to support arbitrarily-sized values, and I got the same "smug" vibe you sensed.

The interview style was very "Him: How would you do X?" "me: Well that's not a simple problem, there are a lot of solutions each with tradeoffs." "Him: Okay so name one" "Me: So you could do X" "Him: BUT THEN Y [GOTCHA!]" "Me: Yes, that's one of the tradeoffs of X"

It wasn't clear to me what the heck he was even looking for. Was he hoping I'd list race-condition problems? Had he not even considered race-condition problems? Was he looking for a theoretical solution or a real-world solution? Also he kept going on random tangents ("That brings me to an interesting question, how would you shift a gigabyte of memory 1 bit?"). He seemed very concerned with efficiently bit-packing the header in this problem, which seems silly to me when we're talking about storing gigabytes.

My understanding was that triplebyte was seeking to be the SATs of engineering; however, the SATs do heavy validation with test-retest reliability and such. I had no particular reason to believe triplebyte's interview was any more objective than any other company's.


We actually put a bunch of effort into consistency/repeatability checks. Every interview is recorded (video), and we re-watch and re-grade a percentage of them to measure the consistency. A long-term experiment we're running is comparing qualitative scores (code quality, good process, how good did the interviewer feel the candidate was) with quantitative features (which tests passed, how long did it take, what design--picked from a decision tree--did the candidate take). We calibrate the qualitative scores with the recorded interviews. So far, quantitative scoring is winning (when judged against predicting interview results at companies). We're waiting, however, until we can see which better predicts job success.
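To make that concrete, here is a stripped-down sketch of the kind of comparison described above (illustrative only, not our actual pipeline; the column names are made up): fit a simple model on each feature set and see which better predicts downstream interview outcomes.

    # Illustrative sketch: do qualitative scores or quantitative features better
    # predict onsite outcomes? Column names below are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    df = pd.read_csv("graded_interviews.csv")   # hypothetical export of graded interviews
    y = df["passed_company_onsite"]             # outcome we want to predict

    feature_sets = {
        "qualitative":  df[["code_quality", "process_score", "interviewer_rating"]],
        "quantitative": df[["tests_passed", "minutes_to_solution", "design_branch_id"]],
    }
    for name, X in feature_sets.items():
        auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                              cv=5, scoring="roc_auc").mean()
        print(name, round(auc, 3))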


It sounds like your ability as an interviewer is pretty poor. There are several examples of the same smug behavior. What sort of training have you gone through to ensure that you're actually an appropriate and qualified person to be interviewing?


I'm his co-founder and this comment is unnecessarily personal. Ammon has done over 900 technical interviews (https://www.reddit.com/r/cscareerquestions/comments/5y95x6/i...) and there's one negative reference to him specifically on this thread.


Nah, you can't use forum threads as proof. You need to do the analysis scientifically, which, based on the feedback here, it sounds like you still need to do. You don't bother quantifying the ability of you and your interviewers; you just assume you're great.


I interviewed with Ammon. He did not come across as smug. I went through the process just as a curiosity because I was fed up with the typical interview process. I haven't followed through with anything else related to Triplebyte in terms of an actual job, in case people think I have a favorable view because I got a job through them.


Wasn't there a study by Google a while back, where they found, as a trend, that their most successful people had only marginally passed their job interview?


The finding was that people who had received a "no hire" recommendation yet still received an offer tended to do well. The reason being: to compensate for the poor feedback, someone else on the hiring loop believed in the candidate so much, saw something exceptional, and was willing to go to bat for them.


That sounds like the Pareto-optimal solution to a job interview. It's like how you're not doing grad school properly if your grades are better than C.

Also, candidates who do too well could be too good for the job, since Google supposedly likes having incredibly overqualified people maintain do-nothing internal apps.


> better predicts job success

How do you rate job success?


being happy with the job and not having left after 1 year


Wait so job success is based on the interviewee, and not the company?


Notably absent from that list is "Are we verifying we're doing a good job as interviewers?"

It doesn't matter how good the interviewer feels the candidate is, or whether a design was picked from a decision tree. All that matters is whether the candidate can do the work at actual companies.

I think people here are reacting to irrelevancies during the interview process -- questions which cannot possibly be reflective of a candidate's real-world competency. (When was the last time you shifted a gigabyte of memory? And even if you did, that's not what companies are going to employ people to do. So why ask the question? Are you sure it isn't trivia?)


Interviews absolutely should be grounded in trying to predict how a candidate would do on a job. That's the whole ballgame. The question is how to best do that. First, you need to run a repeatable process (my previous comment). Second, you need to look at the right skills. The approach we take is to track a lot, and figure out what works the best over time. What we've found to be most predictive (so far) is a base level of coding competency, plus max skill (how good an engineer is at what they are best at). So (beyond the coding portion) we don't actually care very much about what a candidate is bad at. We care about how good they are at what they are good at. To give as many candidates as possible the opportunity to show strength, we cover a number of areas. This includes back-end web development, distributed systems, debugging a large codebase, algorithms, and -- yes -- low-level systems (concurrency, memory, bits and bytes). We do not expect any one engineer to be strong in all of these areas (I'm weak on some of them). But they all are perfectly valid areas to show strength (and we work with companies that value each of the areas).

We've recently moved to a new interview process organized around this idea of max skill. It's working great in terms of company matching and predictive ability. However, it seems we may have underestimated the cost to candidates of being asked about areas where they are weak. There's more negative feedback here than we've seen in previous HN discussions, and I think that the interview change may be behind that. I'm taking that to heart. I think we can probably articulate it better (that we measure in a bunch of areas and look for max strength). We're also running an experiment now where we ask engineers at the start of the interview which sections they think they'll do best on. I'm excited about this. If engineers can self-identify their strongest areas, we'll be able to make the process shorter and much more pleasant!

So, the bit shift question: that came up down one branch of a system design question that we used for a while (we've since moved to a more targeted version that is more repeatable). The (sub)issue involved adding a binary flag to a large data blob (this came up as part of a solution to a real-world caching problem). Adding a single bit flag to the front of a 1GB blob has a problem. To really add just one bit, you'd have to bitshift the entire 1GB. This is clearly not worth it to save 7 bits of storage (ignoring that that would not be saved in any case). You can just use a byte (or word), or add the flag at the end. When candidates suggested adding a bit flag at the front, we would follow up asking them how they'd do it (to unearth if they were using 'bit' as a shorthand for a reasonable solution, or if they really are a little weak in binary data manipulation). This was one small part of our interview. By itself it in no way determined the outcome of the interview, or even of the low-level systems section. Plenty of great engineers might get it wrong. But I don't think it was unfair.
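To make the byte-aligned alternative concrete, a toy sketch (illustrative only; no candidate was expected to write this) is just a one-byte header in front of the payload, so the blob itself never needs any per-bit work, at the cost of 7 "wasted" bits:

    # Toy sketch: a byte-aligned flag means no bit-shifting of the payload, ever.
    FLAG_COMPRESSED = 0x01                        # hypothetical flag bit

    def wrap(blob, compressed):
        header = bytes([FLAG_COMPRESSED if compressed else 0x00])
        return header + blob                      # one concatenation, no per-bit work

    def unwrap(stored):
        return bool(stored[0] & FLAG_COMPRESSED), stored[1:]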


> Interviews absolutely should be grounded in trying to predict how a candidate would do on a job. That's the whole ballgame.

This is directly in conflict with this comment from Harj:

> The metric we optimize for is our onsite success rate i.e. how often does a Triplebyte candidate onsite interview result in an offer.

If you want passing your interview to mean that a candidate will pass their onsite interview because your interview is tailored to deliver people who look good in an onsite, you're acting as a recruiter, trying to give companies whatever they say they want. This model provides a lot of value to companies but zero value to candidates, since anyone who passed your interview would have gotten the job anyway.

If you want passing your interview to mean that a candidate should pass their onsite interview because they would perform well in the job, you're acting as a credential, telling companies what they want. This model provides value to companies and to candidates (assuming you can tell who will perform well). These aren't the same model.

Your candidate-directed advertising leans heavily towards the second model ("just show us you can code!"). That makes sense, since that's the model that provides value to candidates. It's disappointing to hear Harj say that what you really believe in is the first model, and disconcerting to see you openly disagree with your cofounder about what your company is trying to do.


It seems a bit disingenuous to accuse them of openly disagreeing - if you assume that the client company's interview is optimized for finding folks who will perform well on the job and Triplebyte's process matches you to the employers who are most likely to hire you, the statements are perfectly consistent. Of those two assumptions the latter seems demonstrably true, while the former is obviously a bit suspect. I don't think Triplebyte is in a good position to change it right now, but maybe some day they will get enough actual performance data to start influencing it.


Given their existing positioning of "the interview process is broken and we're here to fix it", I see no reason to credit them with the opinion that client company interviews are optimized for finding people who will perform well. There is no way to reconcile "the current interview process is broken" with "our goal is to find people who do well under the current interview process".


"The current interview process is broken because strong candidates are excluded due to weak resumes and companies and candidates do a terrible amount of duplicative work just to confirm a candidate is baseline competent."

Assuming that, would you accept the position to be consistent?


> The approach we take is to track a lot, and figure out what works the best over time.

That's really, really bad statistics.


Why? Sure you will get some things that might initially end up looking significant but aren't, but it's not that hard to retest those things from scratch to minimize that.

In general if you're going to provide a critical comment I think it would be better for the community here to expound on it a bit so everyone can understand your argument.


> So, the bit shift question: that came up down one branch of a system design question that we used for a while (we've since moved to a more targeted version that is more repeatable). The (sub)issue involved adding a binary flag to a large data blob (this came up as part of a solution to a real-world caching problem). Adding a single bit flag to the front of a 1GB blob has a problem. To really add just one bit, you'd have to bitshift the entire 1GB. This is clearly not worth it to save 7 bits of storage (ignoring that that would not be saved in any case). You can just use a byte (or word), or add the flag at the end. When candidates suggested adding a bit flag at the front, we would follow up asking them how they'd do it (to unearth if they were using 'bit' as a shorthand for a reasonable solution, or if they really are a little weak in binary data manipulation). This was one small part of our interview. By itself it in no way determined the outcome of the interview, or even of the low-level systems section. Plenty of great engineers might get it wrong. But I don't think it was unfair.

Of course it's unfair. The candidate isn't actually programming a solution when they're talking to you. They're on a tight time crunch, under a microscope, in front of an interviewer. The answers to your questions will literally make or break their future with you. Did you specify to them in your original question that the entries in the cache are 1GB large? If you assigned them the task of implementing a solution to your caching question, they would immediately notice using a 1-bit flag is a poor design decision.

The point is:

> Plenty of great engineers might get it wrong.

That says quite a lot more about Triplebyte's question than the engineers. A wrong answer doesn't mean they're weak in bit manipulation or that they decide to implement poor solutions. It says they're suffering from interview jitters. They're weak in the artificial environment you've constructed for the purposes of the interview, which may or may not correlate with their actual ability.

This may sound like useless theorizing, but unfortunately a massive number of excellent engineers are awful in an interview setting. But if you give them a problem to actually solve, they pass with flying colors.

Triplebyte does give problems to candidates to solve, but it sounds like you also care about whether they can pass your interview (by demonstrating sufficient max skill when prompted) instead of whether they can implement solutions to the problems you assign to them. This rules out candidates who would otherwise do very well, which is the type of candidate you're trying to find.

I know that you're saying the verbal section of the interview isn't the whole process, but are you sure it's an effective one?

It might be positively misleading. A candidate who is very strong in the area you're looking for is also likely to be someone who will get your questions completely wrong, because they're not programming. They're talking. So it sounds like you're selecting for people who can talk well: those who can show strength during your interview when prompted verbally. Is that the right metric to find talented candidates?

If you were to put together a pipeline where e.g. you give candidates an XCode codebase and say "There are bugs in this codebase, and <specific missing features>. Implement as many fixes or improvements as you wish or have time for, then send us the code," you would have a mechanism which selects for candidates who are ~100% competent, since that's exactly the type of work they'll be doing on a day-to-day basis.

Some candidates wouldn't want to do that, so perhaps there should be an alternative for them. But it'd be vastly more effective than quiz-style questions during a timeboxed interview.

It's possible to come up with endless reasons why it might be a bad idea to set up a pipeline like that. But all the companies that have set it up have been shocked how well it works when they rely solely on that test. Instead of an opportunity to show strength during an interview, the candidate is able to directly answer the question "Can they do the work?"

EDIT: From one of the other comments (https://news.ycombinator.com/item?id=13834231):

> I applied through their project track. It was described as a low-pressure way to write your code ahead of time and talk about it in the interview. The interview was, instead, about making changes to my project while Ammon watched. (Also, there was a request to derive a formal proof while Ammon watched. I didn't get it.) After which I got a rejection saying that my project was great but my interview performance was so poor that they wouldn't move forward.

It sounds like Triplebyte almost has the pipeline described above, but it won't work if you watch the candidate or ask them to do more work. The project alone has to be set up to be a sufficient demonstration of skill.


Well, my objection was much more fundamental. I never suggested packing a bit at the beginning; I suggested using a header of about 20 bytes or so (trivial against a GB). As for the shifting 1 bit question, my complaint with it was that I was in the middle of answering 1 question and then another question was brought up as a non sequitur. The fact that I was asked it made me assume there must be some answer superior to iterating over all the data, which I don't believe there is.

The other reason I dislike this type of interview question is that if the interviewer never proposes a superior alternative at the end, you get no opportunity to challenge them. How do we know my solution didn't solve problems the other party didn't see?

For context, it seems the "preferred" solution was to build a wrapper around existing memcache. However, as a real-world engineer, the problems I was solving for (90% of your clients won't use this wrapper at a company over 100 people, so we want to avoid key collision with people who aren't using this driver) were not the theoretical ones (How could we make the header only 4 bytes!) that the interviewer was evaluating on.

Plus on top of all this, I have no idea if the person interviewing me is unaware that memcached supports atomic increment, knows but doesn't care, or is deeply concerned about preserving this functionality. There are dozens of facets that an individual could consider "important" in this type of problem, and there's no objective basis for most of these concerns without a context of the user (because that's what all engineering comes down to after all).
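For the curious, the rough shape of what I had in mind was something like the following (from memory and heavily simplified; `mc` is assumed to be any memcached client exposing get/set, e.g. pymemcache):

    # Sketch only: split oversized values into chunks under namespaced keys,
    # with a small plain manifest (tens of bytes -- no bit-packing needed).
    import hashlib, json

    CHUNK = 1024 * 1024 - 1024    # stay under memcached's ~1 MB item limit
    PREFIX = "bigval:"            # namespace to avoid colliding with other clients

    def set_large(mc, key, value):
        chunks = [value[i:i + CHUNK] for i in range(0, len(value), CHUNK)]
        for i, c in enumerate(chunks):
            mc.set(f"{PREFIX}{key}:{i}", c)
        manifest = {"n": len(chunks), "len": len(value),
                    "sha1": hashlib.sha1(value).hexdigest()}
        mc.set(f"{PREFIX}{key}", json.dumps(manifest))

    def get_large(mc, key):
        raw = mc.get(f"{PREFIX}{key}")
        if raw is None:
            return None
        manifest = json.loads(raw)
        parts = [mc.get(f"{PREFIX}{key}:{i}") for i in range(manifest["n"])]
        if any(p is None for p in parts):
            return None           # a chunk was evicted; treat the whole value as a miss
        return b"".join(parts)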

My guidance to companies in this type of situation is: A great engineer can do 80 hours of work in 15 hours, but not necessarily do 30 minutes of work in 29.


This is all just really hard (and counterintuitive). We've tried take-home projects, with and without asking the engineer to make changes live. This is actually very controversial (a lot of engineers are, reasonably, against the time commitment). And (we found) to get an equivalent level of consistency (the 1st step toward accuracy) the project needs to be pretty big. The work really needs to be of the level of what engineers do in a full day or more on the job. Shorter projects than this introduce the noise of how much time the candidate spent on the project. You can introduce a hard time limit, but then you're back in interview stress territory.

Large take-home projects (and trial employment) totally are better ways to evaluate engineers than interviews. Unfortunately, they require major time commitments from the candidate that many engineers are not able to give. Most (about 80%) engineers select a regular interview if given the choice. I do think they are a good option to offer, but they can't (unfortunately) replace interviews in the majority of cases.

We've also tried debug sections (where we give the candidate a program with bugs in it and ask them to fix test cases). This works great as a portion of the interview (but misses some people with other skills, so it can't be the entire interview).


We didn't find this at all. You guys should feel free to reach out sometime; I'm happy to talk about what we did at NCC/Matasano.


We have found that well-crafted small problems are enormously successful predictors of good hires. Sizing them to require an evening, possibly two, of work is what to aim for.


Studying for in-person interviews is a major time commitment. One that's reusable across employers, sure, but a Triplebyte take home project should be just as reusable as studying for a Triplebyte interview.


Have you guys considered that seeking to quantify engineering skill may be fundamentally at odds with the goal of hiring good engineers?

I'd be interested in seeing the argument for why this is a good thing, beyond the fact that it enables a company like triplebyte to exist.

To me it seems like these concerns always boil down to the same thing: tests that try to quantify something that may be unquantifiable. I suspect this is because it's not cost effective to facilitate a process that digs well beyond engineering trivia.


There are two things to measure - ability to get an offer (easy) and effectiveness on the job (hard). Triplebyte has clearly provided an improvement on the former. Seems like the latter is very difficult to measure, but I don't know why anyone would assume quantifying would be negatively correlated with performance - at worst one might assume it is not correlated.


Have you guys thought about trial employment with payment? That was an idea suggested a couple of months ago. I think the upfront cost would be better than having a false positive.


Yeah. Trial employment better be paid! :) Even with pay, unfortunately, folks with an existing job often can't take the time off. And when people have the option between one company making an offer and another with a trial period, they often go for the offer (which makes sense). The other issue is that a company can't do a trial period with every person who applies (too much cost for the team), so there has to be a screening step in front, which sort of just moves the problem to an earlier level.

I do think trial employment can be a great thing. But it's not a universal replacement.


Trial employment immediately screens out candidates who do not have the time or patience to spend an entire day working very hard with a prospect. It also introduces an NDA, a need for a sandbox, etc.


How would you actually bit shift 1 gigabyte worth of data? Would you start at the end and read in words? Possibly using some temp variable?
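Naively, I'd picture something like the sketch below (assuming the new bit should land at the front and you accept touching every byte), though I don't know if there's a smarter trick:

    # Naive sketch: prepend one bit by shifting the whole buffer right one bit.
    def prepend_bit(buf, flag):
        # Shift buf (a bytearray) right by one bit in place, feeding `flag` in
        # at the front. Returns the bit that falls off the end (you'd need an
        # extra byte to keep it). Touches every byte, which is exactly why this
        # isn't worth doing to a 1 GB blob just to save 7 bits.
        carry = flag & 1
        for i in range(len(buf)):
            low = buf[i] & 1                              # bit pushed into the next byte
            buf[i] = ((carry << 7) | (buf[i] >> 1)) & 0xFF
            carry = low
        return carry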


so like every other interview ever?


I'm sorry that the interview went poorly! I'm especially sorry about the lateness. I just pulled up our notes from your interview, and I was your interviewer! So that makes it especially bad :(

I'm not an expert in all areas. At the end of the interview we have a section where we let the discussion go into whatever technical area the engineer wants to talk about. It sounds like you're an expert in an area where I am not. In those cases I try to ask questions and push deep, but (depending on the topic) that can be hard.

edit: removed discussion of the specific topic discussed. Sorry, folks below are right

Expertise is something our model handles less well (it's much harder to standardize). This certainly results in us failing some great people (and it sounds like that may have happened here). I'm happy to talk about this more. Give me an email at ammon@triplebyte.com.


I have to say, I give you props for responding so cordially after I was overtly harsh. I'm sure you could have bashed me, but didn't, so thank you.

That being said, I think the best way to improve the process is to try to standardize it by area of expertise. I'm sure that is done to some degree, but I think making it more pronounced may be better. I think roles within companies are too static and don't actually represent the real role(s).

Regardless, best of luck. I didn't mean to be a dick, but I felt obligated to share my opinion in this case.


You are thanking someone for not bashing you publicly? What am I missing? I didn't see anything "harsh" in you sharing your experience.


Based on your other replies, it sounds like the computer vision part of the interview was fine, then. Did you struggle with other parts of the interview, technically speaking?


I'm sure I did, although I don't recall any hard fail. Could have been social skills too I suppose. Either way, I feel my original comment was the real gripe I had. It left a bad taste in my mouth and that damages reputations.


It's reasonable to want to defend yourself and set things right, but I don't think you should be sharing specific notes from an interview where the interviewee had a pretty reasonable expectation of privacy. (unless you already got permission?)


ammon seemed like he made a good choice on balancing what to share and what not to share.


I agree with you; he could have been exceedingly harsh. He shared what he did, and overall I'm not displeased; it was honest and didn't make me look too bad (which, again, I'm sure he could have done).

On the other hand, sharing my info publicly is not professional. I shared what I viewed as unprofessional, without specifics regarding the interviewer or even the questions. He shared my technical expertise and much more specific details regarding my question(s) and interview.

That kind of goes along with my original point, it seems abrasive and leaves a sour taste in my mouth.

Quite honestly, it probably damages their reputation more by responding. Why would someone want to interview somewhere where they know that, if they criticize it, their interview could be made public?


Sharing anything like this on a public forum is exceedingly unprofessional, regardless of whether the content is positive or negative. If candidates don't have an expectation of privacy, they won't provide honest answers, which drastically undermines Triplebyte's claims about more effective evaluation.

I took the original comment about the interview with a grain of salt (these things are inherently subjective). But it's fairly obvious now that they're as unprofessional as claimed. Hope no one from Apple or Facebook sees this.


Had a bad experience myself (with the obvious caveat that it will seem sour if you don't pass the interview, and posting anon but will follow up privately for anyone posting contact info).

The challenge was to code up a regex parser in three hours then discuss in an interview.

During the interview I was asked to add (IIRC) a Kleene operator. I repeated back his explanation of what a Kleene operator is. I explained how that definition would impact my choice of how to implement it. During the implementation, I made repeated references to that same spec. I got it working.

Then, he told me that it didn't work, because a Kleene operator means something completely different than what I understood. He apparently wasn't listening the whole time because I repeated back his spec several times when implementing it and he never corrected it!

(Perhaps this was some subtle test of "see how they react to impoliteness"?)

More importantly though, it was rejected for not being an elegant state machine implementation of a parser, which made it hard to extend. Which is fair, in a way. I knew, abstractly, that that was a better way to do it and I would have gladly read up on the concept and written my implementation that way. But with the overhead of setting up the codebase, docs, and tests, I would have exceeded the 3 hour limit that they trust applicants to hold themselves to.

Apparently, the right way to proceed here would be to learn state machines, severely exceed the 3 hour limit, and then lie and say it took me 3 hours. Is that what they're selecting for? Or perhaps for people who already know state machine implementations?
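(For reference, the definition I was working from is the standard one, "zero or more of the preceding element"; a toy backtracking version of just that semantics fits in a few lines, though presumably the "elegant" answer was a proper state-machine/NFA construction instead.)

    # Toy backtracking matcher: literals, '.', and the Kleene star. Sketch only.
    def match(pattern, text):
        if not pattern:
            return not text
        first_ok = bool(text) and pattern[0] in (text[0], '.')
        if len(pattern) >= 2 and pattern[1] == '*':
            # 'x*': either consume nothing, or consume one matching char and stay on 'x*'
            return match(pattern[2:], text) or (first_ok and match(pattern, text[1:]))
        return first_ok and match(pattern[1:], text[1:])

    assert match("ab*c", "ac") and match("ab*c", "abbbc") and not match("ab*c", "abd")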


I also had a failed interview with Triplebyte that left a bad taste in my mouth. I experienced similarly condescending tones and disrespect from the interviewer. I was told to prepare and share/explain code I'd already written, but when the call started I was instead given three problem options for a live-coding session.

To be fair, they were super friendly and responsive to my critique, and I found their interview notes/follow-up helpful and accurate. Still, if you're going to provide interviews as a service, you should work hard to make the interview a positive experience for the candidate. I'm loath to apply to any company using Triplebyte for now, but I'd probably do it if I really wanted the job.


As far as I know, no company uses Triplebyte exclusively. It just provides a backdoor to onsite interviews with several companies (which was really useful for me cause coming from a non-traditional background I wasn't getting any callbacks from my direct applications to selective companies). There isn't really any downside to giving Triplebyte a go, aside from a few hours of your time.


This was an interviewer working for Triplebyte? I thought their interviews were standardized. Can you tell us more about the experience, and what they asked? I'm super interested in this.


> This was an interviewer working for Triplebyte?

Yes, it was.

> I thought their interviews were standardized.

I don't know if you can standardize people's personalities... The real issue here (in my opinion) was that the guy was not a good interviewer.

> Can you tell us more about the experience, and what they asked?

I honestly don't remember; they asked about active record (as I do some web dev), but also stuff like describing some SQL queries on the spot and why one would be faster than the other. All of that appeared fine; the real issue was our debate related to computer vision (where my research was done). Sorry, I can't really dive in more, but it was probably a year ago.


> the real issue was our debate related to computer vision

Oof. I feel like computer vision is a particularly misunderstood field. In a way, it reminds me of statistics - I feel like people purposefully obfuscate things to make their results sound more impressive. The great thing about computer vision is you can almost immediately see through the smoke and mirrors - ask for a live demo. I'm amazed how many people refuse to show me a demo of something that is inherently meant to be viewed.


Did you feel like there was a point to the questions they asked you? If you were being as rigorous as you could about designing the interview they delivered (I know your memory on this is hazy), what do you think they'd be trying to learn about you from the interview?

Did they give you homework challenges? Did you clear those but then bomb the interview?


I had the opposite experience with Triplebyte. I found every part of it enjoyable and really liked the interviewer.

What I didn't like was the chats with the companies that followed. It felt like the whole Triplebyte interview process had never happened and we were starting from scratch.


My sentiments exactly, although I've heard even worse (not through Triplebyte) - people put through 6! onsite interviews, so I can't help but feel like I circumvented something. I think though that this is to be expected, because Triplebyte is selling an alternative narrative to the hiring and recruiting folkways of the Valley and many companies will be trying to fit a square peg into a round hole around what Triplebyte is doing.


This is my opinion as well. If it doesn't eliminate the onsite interviews at target companies, then there's no point using them whatsoever.


I don't understand this. Do you expect to get hired at companies without ever getting interviewed by them? Triplebyte culls out as much of the process as you can realistically expect to be culled i.e. everything before the final onsite interviews.

For me personally though the main value was in getting contacted by companies that wouldn't have otherwise contacted me. I would have been willing to go through technical phone screens for those companies if needed, but not having to do so was an extra bonus.


> Triplebyte culls out as much of the process as you can realistically expect to be culled i.e. everything before the final onsite interviews.

Which is what? A short chat with the recruiter and a 1 hour phone interview.

With TripleByte instead you get something like an online questionnaire, a two-week project, and a 4 hour video interview. (Please correct me if I don't have it exactly right.)

How is this an advantage if you still have to do a call with the recruiter/company, and then on-site interviews?

> the main value was in getting contacted by companies that wouldn't have otherwise contacted me

That alone is a solid reason to use Triplebyte.


You indeed don't have it exactly right :) There is no two-week project, and our final interview is 2 hours, not 4.


Oh, apologies!

I do remember doing a two(or maybe one) week project, and could swear that my final interview was more than 2 hours. Maybe the process changed since I last interviewed?


> Do you expect to get hired at companies without ever getting interviewed by them? Triplebyte culls out as much of the process as you can realistically expect to be culled

Almost all other fields are capable of trusting third parties to certify competence. They interview for fit, but with a basic understanding that the candidate is qualified.

Employers waste less time giving interviews, and benefit from a more accurate filter than they could develop in-house.

Employees waste less time sitting through interviews, and will find out early (e.g. by failing college classes, prompting them to switch majors) if the work isn't for them. Good practitioners are unlikely to be denied their dream jobs by the unreliability of the interview process, and truly incompetent practitioners aren't allowed to keep trying until something sticks because (below some threshold) they'll never get in the door. They're given the benefit of the doubt for making it through the main gate (undergrad/bar exam/whatever) and from there, judged on the basis of their performance in their actual role, not how well they study for a song-and-dance routine that approximates the role.

Triplebyte's value proposition is to be such a trusted third party for software engineering, since universities aren't. If companies don't trust it, and view people who passed Triplebyte with the same "incompetent fraud until proven otherwise" lens they apply to everyone else, then Triplebyte isn't adding value for anyone in the ecosystem.

I think you can realistically expect the skills test to be culled from interview processes: in other fields, it never even appeared.


>Almost all other fields are capable of trusting third parties to certify competence.

Employers in pretty much every other field complain that credentials are close to meaningless. As a result, they heavily emphasize work experience and demonstrable work accomplishments in hiring. This creates a problem for newcomers who have difficulty bootstrapping experience. Software engineering is unique in that you can at least get some idea of a person's ability through a day of interviewing (whiteboard interviews may not be a perfect measure of competence/ability, but they're way better than what is available in most fields), and don't need to use experience or credentials as a gauge of ability (though they use them as screening criteria to sift through the many applications they get, which is where Triplebyte comes in handy). The fact that employers directly assess the skills of job candidates is a benefit, not a drawback, for the field of software engineering.

>Triplebyte's value proposition is to be such a trusted third party for software engineering, since universities aren't. If companies don't trust it, and view people who passed Triplebyte with the same "incompetent fraud until proven otherwise" lens they apply to everyone else, then Triplebyte isn't adding value for anyone in the ecosystem.

Triplebyte is adding clear value to the ecosystem by finding candidates who would have otherwise fallen through the cracks and never gotten an interview in the first place, and matching them with potential employers. In my case, for example, Triplebyte has provided value to both me and Asana by matching us up together. Because of my nontraditional background, had I applied directly to Asana, my application would likely have never made it past their resume screen, and they would have missed out on a great candidate and I would have missed out on a great opportunity.


lol. It's funny how TripleByte claims no whiteboarding no interviews yet at the end you still have to be vetted the same way. Why would anyone use TripleByte then? Do you enjoy being tortured via whiteboarding?


It's like skipping all the phone screens for X companies with a single interview.


Almost exact same situation here. I was rejected by TripleByte and ended up getting offers at many of the companies I applied to. I had the same feelings about the interviewer as the parent comment.

I wouldn't say it was particularly bad, but it was worse than average compared to my other interviews.


For what it's worth, I had the video conference part go poorly. For one, I was really nervous with the interviewer after they exhibited smugness, and I was sure the thing went south from that instant. Funnily enough, the person claimed they weren't fully awake and needed coffee and whatnot; not sure if it was because of an early morning call, but 10am PST isn't really early, especially not when you set up an interview in advance. I passed interviews at a couple of other places, so while this incident is attributable to interview-loop variance, I'm skeptical of spending time with their process if that's the case.


Don't worry not passing it is probably for the best, as if you interview through companies with them, good luck trying to get reimbursed for your interview expenses. They promise that they will handle everything and it will be super "convenient", but it's only convenient if you are fine with getting nothing in the end. I am somewhat convinced part of their business model relies on saving money by not paying people.


We did not reimburse you for expenses?! Can you email me: ammon@triplebyte.com


>"he just said, No, you're wrong, you don't know what you are talking about."

Ouch. That is truly awful. I'm sorry that happened to you. The irony is that their site claims:

"The existing hiring process is broken. We’re building a new kind of interview that evaluates tech skills, not credentials."

If that's their mission there's really no excuse for this. I will be sure to avoid them. Thanks for sharing.


Interesting, I interviewed with one of the cofounders and felt like it was one of the best interviews I had. Lengthy, moderately difficult, but fair. I passed, but remember feeling that way even immediately after the interview.


As someone who went through the Triplebyte interview process not long ago, I actually had a positive video screening experience with Guillaume as my interviewer. He was definitely focused on getting as much signal from the interview as possible in the time we had (which is a plus imo, since I don't have tons of spare time to waste). IIRC I didn't complete 100% of the coding part, but he seemed genuinely interested in how I was approaching things and the underlying algorithmic concepts more than in the nitty-gritty of the code itself.

That session was overall certainly energy-intensive, but then no more than a good interview session with someone who knows what they're doing interview skills wise. But at a more general level only Triplebyte would know about interviewer variance and whatever secret sauce they have to maximize SNR during the screens, so I can't speak about that.


I was gonna ask: was it ammon? And sure enough, yes it was.

Ammon is a smug dude, no doubt about it.

Edit: to be clear, I passed the interviews.


I had a great experience with them ~5 months ago. Very professional and smart... (fwiw I passed the interview)


I avoid Triplebyte like the plague. I passed their "online interview" and was marked as exceptional.

I mailed the guy (forgot who he was, but his email address showed up on one of the pages) since I had some questions. He responded once and never again.

Then they had the audacity to email me again a few months later to schedule a call to give feedback for their platform.

Yeah sure, I am going to take extra time from my day to provide you feedback for no return. If they had sent it as a web survey, it would probably have been better.

Finally, they can only schedule interviews on two days a week. Really?


As a general reply to a variety of the complaints in this thread: don't underestimate the task that triplebyte has in selling you to companies, especially since they have focused on candidates who might not even get an onsite interview at companies on their own. Their process also involves helping candidates skip a lot of the interview pipeline at several companies at a time, so that you can get better offers. This means that companies have to place a lot of trust in their assessment, and my personal experience (having gone through the process and gotten a job through them) is that they have to be very very rigorous to sell companies on their model.

My interviews with triplebyte were definitely some of the hardest ones I had during my search, but they also focused on things I consider important in developing software in a way that no other interview pipeline I've been through has. It may be that my overwhelmingly positive experience with them is not reflective of most, but I think that's unlikely given the amount of work they put into standardizing their process.

Being on the side of engineers in the hiring process does not mean they can take it easy on interviews, and my experience would suggest that whatever personal intensity you may feel in an interview with them is very likely to be outclassed by any decent sized sample of interviews at a big company.

I'm definitely bullish on their model, and while the process isn't perfect I'm really excited to see them moving the needle on interview quality in the industry. I sincerely wish them the best with the expansion.


I'm in Triplebyte's pipeline right now, and have had nothing but a great experience. The interview questions have been... not unusual? If people are complaining they're too challenging, I think I must have gotten lucky, because they seemed fairly normal to me. I believe my interviewer even told me that rather than finishing the whole programming problem, they were looking to see how far I could get and how I develop code.

It's been a great experience so far, even though several of my favorite companies in the pipeline have gone incommunicado before I could even talk to them :-/!


[deleted]


You personally may not enjoy skipping the technical phone screens but that doesn't make it suboptimal for most people. If that were true, we'd see no demand for applying through Triplebyte versus applying to companies directly. Empirically, that doesn't seem to be the case, most candidates prefer to not repeat phone screens across multiple companies when job searching.


That's a chop shop move.


The user who posted this comment regretted it and asked us to delete it, so we've done so.


It's been interesting working with larger companies through Triplebyte. When we started, we were mostly working with smaller startups making early engineering hires. The challenge for them is letting engineers know the company exists and why it's interesting.

Bigger companies like Facebook don't need to let engineers know they exist, a huge number of engineers have already applied to them at some point. What they've realized is the way to hire more is to find the good ones that are sitting in the resume review stage of the hiring process, not getting looked at because they don't have stand out credentials.

One thing that's stood out to me in a surprising way about the average startup hiring process compared to larger companies is speed. We've found the larger companies are actually faster to move on the first step of booking a call to speak with our candidates. This completely reverses when it gets to bringing someone onsite and making offers though. Bigger companies take longer to do both and this is where startups have a hiring advantage.


I had relatively quick (1 week turnaround) onsites with "big companies" through triplebyte. Startups were relatively quick to 'weed me out' as someone with less experience. One of my frustrations was that I was not sure if triplebyte was effectively communicating my junior engineer status to the big companies; one of the companies was pitched to me as a company that had a history of being ok with taking on someone and indoctrinating them in their way of doing things, and then turned me down with a reply that they were looking for a senior engineer.

In retrospect I kind of realize that Triplebyte is really targeted more as a service for senior positions (at least for now), but I think this is a great move for Triplebyte to get a bigger marketshare, and I wish them luck.


A year or so ago you guys published the post Who Y Combinator Companies Want (http://blog.triplebyte.com/who-y-combinator-companies-want) and A Taxonomy of Programmers (http://blog.triplebyte.com/a-taxonomy-of-programmers), both of which were quite interesting and insightful in terms of data.

How do the types of engineers Apple or Facebook want compare to that?


So, those are not (unfortunately!) the primary way that we match engineers with companies. It comes down to questions about how important it is that engineers are strong in academic CS (some organizations think very, others think it's useless fluff), how important programming speed is (some companies think very, others want to see careful testing and think speed is a sign of shallowness), or whether it's important that engineers are strong in http / web systems (some think this is the core of what most engineers do, others want problem solving and intelligence and think web programming is teachable). Some companies want the best engineers, regardless of English level. Others value communication over raw engineering ability. It's simple stuff. But getting the data is HARD (companies themselves don't know how much they differ), and matching engineers on these criteria significantly boosts pass rates.


That sounds excellent. What's the best way for an engineer to share their own values with respect to these attributes? For example, is there a way for me to say, "I prefer cultures which emphasize code quality ahead of thoroughness and with an above average communication style"? (I haven't used Triplebyte yet.)

I totally understand if this is part of the secret sauce and the answer is "just use it".


The taxonomy is interesting, but flawed.

What founder wouldn't want a Child Prodigy on their team, someone who's 'going to found a company when they're older'? Someone with hustle and chops but happy to be a junior member.

Likewise who'd want an 'idiosyncratic' Academic? Ignoring the fact that massive tools like Scala are some academic's project. Or that Google was founded by two PhD students...

Obvs the former is going to get many 'yes please' and the latter 'umm, maybe'.


> What founder wouldn't want a Child Prodigy on their team, someone who's 'going to found a company when they're older'? Someone with hustle and chops but happy to be a junior member.

I think this is the same trap as the old "smart and gets things done" post from Rands. Basically it just implies the programmer reading it is a super-genius, so now they'll want to keep reading your blog in case you compliment them some more.


>What they've realized is the way to hire more is to find the good ones that are sitting in the resume review stage of the hiring process, not getting looked at because they don't have stand out credentials.

Yup, I was one of those people whose resume kept getting sucked into the review black hole due to lack of stand out credentials. I understand companies are swamped with applications and need some way to screen applicants even before phone interviews, but it was rather frustrating knowing I would do perfectly fine if only I were given an interview.

Triplebyte was an immense help here (just today I accepted an offer from Asana that I got through Triplebyte). I'm really a fan of your philosophy and what you're trying to do to improve tech hiring, and strongly recommend people (especially those without great conventional resumes) to give Triplebyte a go.


Why do you think that is? Dedicated staff for the early funnel compared to "not my real job" specialization further down?


How does the matching process work on the other side? How do you figure out the correct interview loop and position to match a candidate with when you're working with large companies that have so many different groups inside them?


This differs substantially by company. Some companies have one general interview process, and handle team/role assignment after making offers (for some reason this is common at very large companies and very small companies). Other companies (often midsize) have different interview tracks for different open roles. In that case we just treat each interview track like a different company. We gather an initial guess about what the track wants by talking to the hiring managers, and then refine this with feedback data on candidates. We've found that bringing up example interview questions and asking if it's important that candidates can answer them is the best way to get at what skills an organization values. There's still a lot of noise in this (a hiring manager can't speak perfectly for the entire organization), so we refine it over time.


When I interviewed with Triplebyte I had applied to actually become part of their staff. I was told I'd go through the standard process and then afterward we'd talk about what Triplebyte does in-depth and what the responsibilities would be. Guillaume was kind, engaging, and interested in how I approached a problem. The Australian dude (Buck) took over toward the end and he was very dismissive - he didn't seem to care what I said. It was kind of a let-down because I had crawled his online presence when researching Triplebyte and I was interested in asking him about his charity contributions. I had a good time talking to Harj about buckling-spring keyboards like my beloved Model M in the in-between moments.

I found out later I didn't pass the interview because my solution to the coding exercise didn't work at the end of the hour. We had continued past it and I thought I was safe because I was complimented on its organization and straight-forward interpretation. I finished it that night:

https://github.com/blitmap/coffeescript-snippets/blob/master...

https://github.com/blitmap/coffeescript-snippets/blob/master...

I had fun but I felt like they got more out of it than I did.


I'm a recruiter that is technical (js engineer) based in NYC. I don't see what TripleByte is doing that is that different from what a good recruiter can offer. Matching at very selective companies doesn't seem challenging.

I don't work with Apple or Facebook, but in the past four months recruiting part-time I've had 15 placements. Over 65% resume submission to on-site rate. And over an 85% offer close rate (close counts if I get 3 offers for someone and they choose one). I can match relatively well without needing to put candidates through a day of tests, and I save candidates time by sending them to selectively chosen companies (safety, fit, reach). And I spend a ton of time truly understanding each unique process of the companies I work with.

What makes Triplebyte actually unique? What are they changing about the industry? Are they actually reducing bias or is this a gimmick? Are they that different from a recruiting firm with strong lead generation?


I interviewed and was rejected from Triplebyte. It was pretty miserable. Compared to that, I feel more comfortable talking to real recruiters, since they care about my background and won't ask me to program an hour's worth of code on the spot or ask me several questions about redis and webscale technologies. They're kind of just filtering through candidates looking for people who they think could work at Google instead of looking for hard-working people.

With that said, do you accept new candidates for the NYC area?


> They're kind of just filtering through candidates looking for people who they think could work at Google instead of looking for hard-working people.

Perhaps just responding to their customers (employers) desires. It seems companies these days, companies that are not Google, are looking for Google-caliber people even if they just need someone who knows how to code and is hard working.


Shoot me an email. Happy to chat.


I'm not sure exactly how your process works, but from that description there are three ways we differ:

(1) We don't submit resumes to companies. We give them a profile of the candidate which describes their technical skills and their work history but without any mention of specific schools or companies. For a company to move forward with a Triplebyte candidate, they have to trust in our screening process more than credentials.

(2) Companies agree to not do any technical phone screens or coding challenges with our candidates. Instead they do an initial pitch call and then move to an onsite if there's candidate interest. This saves engineers a lot of time spent in repetitive phone screens.

(3) The metric we optimize for is our onsite success rate i.e. how often does a Triplebyte candidate onsite interview result in an offer. Since our candidates don't go through the regular technical phone screens, any improvement we can make on the industry standard of a 20 - 25% onsite success rate saves the company engineering time spent doing those phone screens. Currently we're averaging 2x that rate across all our companies.


> The metric we optimize for is our onsite success rate i.e. how often does a Triplebyte candidate onsite interview result in an offer. Since our candidates don't go through the regular technical phone screens, any improvement we can make on the industry standard of a 20 - 25% onsite success rate saves the company engineering time spent doing those phone screens. Currently we're averaging 2x that rate across all our companies.

I can see where that's valuable to the companies. What value are you trying to provide to candidates? I already know how to apply to companies, pass the phone screen, and wash out of the interview. If I knew how to pass the interview, I'd still know how to apply and pass the phone screen. What is Triplebyte supposed to help with?


If you are someone who behind the veil of ignorance can pass phone screens, then you will wash out 75% of the time through normal interviews. Your typical time cost is 4x offsite process + 4x onsite process to generate one offer. If you use triplebyte, assuming their interview is equivalent to a full interview cycle, your time cost is 1x offsite process + 3x onsite process - one for them, and two onsites with companies since your success rate is doubled. If you want to get more than one offer, the time savings increase further (and faster).
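Back-of-the-envelope, using the rates assumed above (25% onsite-to-offer directly, double that via Triplebyte, and counting their interview as one onsite-sized chunk of effort):

    # Rough expected-effort arithmetic behind the comparison above (assumed rates).
    p_direct, p_triplebyte = 0.25, 0.50

    direct_offsites = direct_onsites = 1 / p_direct   # ~4 phone screens and ~4 onsites per offer
    tb_offsites = 1                                    # one Triplebyte screen, reused across companies
    tb_onsites = 1 + 1 / p_triplebyte                  # their interview + ~2 company onsites

    print(direct_offsites, direct_onsites)             # 4.0 4.0
    print(tb_offsites, tb_onsites)                      # 1 3.0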


My success rate isn't doubled. Per Harj, they filter for people who will pass onsite interviews. My success rate at interviews is the same whether I go through Triplebyte or not.

> If you are someone who behind the veil of ignorance can pass phone screens, then you will wash out 75% of the time through normal interviews.

Veil of ignorance, huh? I'll run down the ways I've successfully gotten a job:

- Amazon (CreateSpace), by winning a contest they hosted. Multiple times. There was also an interview for this, though not of a problem-solving nature.

- eBay (Milo), by passing their online hiring challenge. There was no in-person interview for this; rather, they had me come in and work for a day.

- NCC Group, by passing their two challenges. There were in-person interviews for this too, largely consisting of them asking me if I knew how to do things and me saying "no". (I was told afterward that the reason my interviews had gone so oddly was that I had never provided them with a resume.)

Triplebyte themselves told me that I was exceptionally strong in "academic CS" -- the first time around. When they asked me to reinterview for their benefit, they highlighted it as a weak point.

My rate of success in applying to companies that rely on an interview instead of a project or other objective demonstration is 0%, not 25%. But I feel safe in saying that my interviewing problem doesn't lie in the fact that phone screens are hiding my basic incompetence from innocent companies.


One of the things they do is data-mining so they can match up candidates with companies where they're likely to perform well onsite (since different companies emphasize different things, Triplebyte can look through data on how their previous candidates have fared and find which companies will value your personal strengths): http://blog.triplebyte.com/triplebyte-engineer-genome-projec...


Harj - thanks for the thoughtful reply. Do you disclose numbers about what % of candidates who get offers choose to take them? I feel like that's pretty important for companies when considering the time they save.


The draw they have on the engineering-talent side is that you do a preliminary interview with them and then move straight to on-sites with companies they recommend you to. It means you can have 10 on-sites without having to do 10 phone screens.


This sounds like it's optimizing for the wrong component of the interview process. I've never had issues with spending time on phone screens because they are one of the main ways in which I screen companies I'm interested in.

It's the technical interviewing portion that's a pain to have to re-do over and over again, especially if it involves travelling across the country. Engineers are ultimately looking at company and engineering culture when choosing between offers.

The other thing is that some engineers might perform better in one on-site than another for many reasons: the questions asked, the interviewers' ratings, or something as trivial as mood. Triplebyte giving people one chance seems to make this difficult.

Ultimately, I feel the main crux of hiring/interviews/finding the right talent is training. If the industry is over-fitting on people who can pass whiteboarding, then why aren't there more startups focused on this aspect? Not just passing interviews (e.g. outco.io), but actually focused on training systems design and algorithms. Universities don't do that in undergrad or grad school.


I very much agree with all of this. To me, the main draw of Triplebyte was "no whiteboarding." I suck at whiteboarding, so I went through the project track. And, yes, there was no whiteboarding, but what do they replace it with? Live coding. Yeah, like that's going to go any better. I'd have been better off at the whiteboard where I could at least fudge the syntax a little.


(Disclaimer: Applied and at current job through Triplebyte)

Seconded. I don't use LinkedIn or recruiters. When I was looking, I was applying manually to several companies, and setting up phone screens and so on was very exhausting and timing things was complicated. Triplebyte allowed me to combine the primary stages for a couple of companies and helped a lot with prep.


My overall feeling is mixed. Given the time commitment, if all you get to bypass is the initial phone screen of ~1 hour, that means you need to go on 4-7 on-sites to break even. On the other hand, some of those on-sites might be from companies that wouldn't have even phone screened you. It's really hard to say whether it's worth it or not from the candidate side, IMO.


Hopefully someone good at matching doesn't need to send you to 10 on-sites to find you a good fit. That sounds exhausting and unproductive.


We've never had a candidate do 10 on-sites nor would we encourage someone to. That'd indeed be incredibly exhausting. We encourage them to be broad with the number of companies they do an initial pitch call with, then be selective about who they move forward with to an on-site.


Wow. Do you actually find that less than 10 on-sites is sufficient to obtain at least one offer? A >10% success rate seems very high to me, even with the vetting you provide. Admittedly I have only anecdotal data.


They are indeed reducing bias, and finding value where others aren't, such as in candidates with non-traditional backgrounds.

When I interviewed with TripleByte I had just come out of a 3-month bootcamp and had spent the prior 3 years as a teacher. Most companies did not look twice at my resume, and it was very tough to break in. TripleByte didn't look at it at all and judged me on ability instead of credentials. I haven't seen that from other recruiting / sourcing organizations and give them a lot of credit for it.


My experience with TripleByte was worse than what I've had with recruiters emailing me.

I tried going through TripleByte's process and answered the programming quiz. Then TripleByte changed their policy, stating that I must do a project before I could proceed. I stopped the process there.

Then, when I tried to log in and try again a few months later, it said I was in some kind of pending state which I couldn't get out of. I gave up on the service after that.


> The screening process happens background-blind

I don't get it. If someone has an incredible body of work behind them, why do you want to hide that from yourself?

Isn't it a great way to pick the best people out of the pool of applicants? 'Ah this person designed the new IR in Google's V8 - we should definitely talk to them'.

When I speak to potential hires the first thing I ask is 'tell me about the projects you've worked on - what have you built in the past'. Am I doing it wrong? Someone could be great at general programming and pass a coding test, but if they have no experience in my field what are they going to do for me?


Having designed the new IR in Google's V8 is a very strong signal. The problem is that it's vanishingly rare to encounter someone who did that. The much more common case is companies just giving strong preference to anyone who has worked at Google, or graduated from a top 10 school. Again, these are real signals (the average skill of folks who have worked at Google or graduated from MIT is higher than the population at large). But, it's a very crude signal (there are plenty of bad ex-googlers, and great folks who went to a state school). In fact, because the vast majority of programmers don't have credentials, there are almost certainly more strong programmers without credentials than with. For that reason, we choose to specialize on directly identifying programming skill, without using credentials.
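A toy base-rate calculation of that last point, with made-up numbers purely to show the shape of the argument (none of these figures are Triplebyte's):

    # Illustrative only: even if credentialed programmers are individually
    # more likely to be strong, the much larger uncredentialed pool can
    # contain more strong programmers in total.
    population = 1_000_000
    credentialed_share = 0.05        # assume 5% have "elite" credentials
    p_strong_credentialed = 0.30     # assumed hit rate among them
    p_strong_uncredentialed = 0.10   # assumed hit rate among everyone else

    strong_with = population * credentialed_share * p_strong_credentialed
    strong_without = population * (1 - credentialed_share) * p_strong_uncredentialed
    print(strong_with, strong_without)   # 15000.0 vs 95000.0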

tl;dr Credentials carry signal, but recruiters at companies have gone overboard, giving them weight above all else. We push against that.


Previous work and achievements definitely carry valuable signal - we don't disagree with that. And relying on them makes sense for companies where the time of hiring managers is constrained and you need a quick way to identify the (likely) best people within a population.

Relying solely on those signals though acts to the detriment of skilled people whose best work hasn't been done at prestigious, name brand companies.

Our approach of removing credentials from our screening process prevents us from being biased by them and forces us to build a process that can find strong engineers who don't look good on paper. This is a win for the companies we work with, as it expands the pool of talent they can hire from.


But for some things either you know how to do a job or you don't.

If I need someone to build me a compiler, there's no point sending me any number of applicants who did brilliantly on a coding test if they have never built a compiler before.

(Of course sometimes it's great to build someone up from scratch, but you can't do that for an entire team all the time).


We work less well with very specialized positions. However, the approach of identifying strong general programmers and then matching the ones who want to work on compilers with compiler jobs works surprisingly well (it still often results in higher offer rates than the approach companies themselves take of filtering first on compiler experience and then checking technical strength).


> We work less well with very specialized positions.

Ah, all makes sense then.


> If I need someone to build me a compiler, there's no point sending me any number of applicants who did brilliantly on a coding test if they have never built a compiler before.

Ok, then shoot me an email, and we'll build the compiler. I'd be happy to find a position that actually involves compiler work at all.


I think it depends on who you want to hire. Triplebyte does an excellent job of finding people who would otherwise fall through the cracks. For example, as someone without a traditional software engineering background (physics degree from a good state school and work experience in rocket science), I didn't know many people in the programming industry who could provide referrals, and my direct job applications weren't getting through the resume screens.

Triplebyte has been really helpful for me there, and got me interviews with companies that care more about problem solving ability and are happy to hire someone like me who will learn on the job.

Someone who designed the IR for V8 would no doubt have no trouble directly getting interviews with whatever companies they want, and people like you who screen applicants based on experience shouldn't have much trouble finding such applicants. General problem-solving / coding ability, on the other hand, is not quantifiable on an ordinary resume / LinkedIn, and Triplebyte's system provides a good screen (within the limitations of a few hours of testing) to quantify it.


I think it's more about hiding college credentials than hiding the portfolio.

It would be great to allow programmers to show their portfolios, but that would likely enable interviewers to google them, giving away their backgrounds.

I have a few interesting projects in my portfolio that I am proud of, a reasonably high CS GRE score (a test which is unfortunately not administered any more), and a moderately good topcoder history. However, after being told by an HR person that they would not hire me because of the college I went to (a for-profit college) I started to view the relation between hiring and credentials as illegitimate. So for that reason I hope TripleByte catches on.


TripleByte only gets paid when their candidates go through technical engineering screens, so it's not surprising that they select for that to the exclusion of other factors. If you can't talk your way out of a paper bag, you spend TripleByte's time and reputation for no good gain.

Plus, people with a good body of work need less help getting interviews. TripleByte has less value add in that case.


I failed the 3rd-round interview because I hadn't studied garbage collection. C# handles it for me; I could have learned it, I just hadn't prioritized it yet, and now folks want to say you aren't very senior. I was told I could apply again, but haven't been offered the chance to, even after a couple of follow-ups. The problem that ultimately stumped me in the 3rd round was the exercise of drawing a spiral in a console window. Sounds easy, but when you are timed and nervous, almost impossible. After I bombed the 3rd round, knowing the rejection was coming, I ended up successfully implementing it in Excel and VBA (a rough sketch of one possible approach is below). Good coders are like snowflakes, and I think TripleByte doesn't have a business if they buy that.
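For what it's worth, one low-pressure way to sketch it (my own illustrative approach, not what Triplebyte asked for) is to sample an Archimedean spiral onto a character grid instead of walking a border inward. The grid size, number of turns, and vertical squash below are arbitrary choices:

    import math

    def spiral(size=21, turns=3):
        grid = [[' '] * size for _ in range(size)]
        cx = cy = size // 2
        max_theta = turns * 2 * math.pi
        a = (size // 2 - 1) / max_theta          # radius grows linearly with angle
        for i in range(4000):                    # dense sampling keeps the curve connected
            t = max_theta * i / 4000
            r = a * t
            x = int(round(cx + r * math.cos(t)))
            y = int(round(cy + r * math.sin(t) * 0.5))   # squash: terminal cells are taller than wide
            if 0 <= x < size and 0 <= y < size:
                grid[y][x] = '*'
        print('\n'.join(''.join(row) for row in grid))

    spiral()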


I had an awesome video-interview with Triplebyte in January but they didn't decide to move forward. No complaints though as they gave a solid heads-up on what to expect, my interviewer was super positive (I felt like he was actually rooting for me), and they gave me solid feedback on my strengths and what concrete actions I should take if I choose to reapply in the future.

These guys are the real deal, so it's pretty neat that they're expanding out to big-names! I wish them the best!


> they gave me solid feedback on my strengths and what concrete actions I should take if I choose to reapply in the future.

I applied through their project track. It was described as a low-pressure way to write your code ahead of time and talk about it in the interview.

The interview was, instead, about making changes to my project while Ammon watched. (Also, there was a request to derive a formal proof while Ammon watched. I didn't get it.) After which I got a rejection saying that my project was great but my interview performance was so poor that they wouldn't move forward.

I complained that this wasn't actionable feedback, but the only way they ever responded to that complaint was "I stand by that", from someone other than my interviewer. Am I wrong to consider "you do poorly in interviews" hopelessly vague?

They contacted me, much later, to ask me to be a test subject for a new interview. New interviewer asked me about hash tables, and I responded to his questions with this information:

- Hash tables are the generalization of an array to being indexed by "whatever you want" rather than an integer; they have similar performance characteristics to arrays.

- If a hash code is larger than the size of your hash table's backing array, you would generally handle that by storing the item at index (hash_code % array_size).

- When two objects have the same hash, one strategy is to store them in "buckets": linked lists of everything present in the table at that hash key. Another strategy is open addressing, e.g. quadratic probing (when the slot you want is full, you probe at offsets 1^2, 2^2, 3^2, ... from the original index until you find an empty slot). Open addressing has the downside that when you delete an entry from the table, you have to leave a placeholder ("tombstone") in the backing array saying "something used to be here".

- If a hash table gets too full, you generally create a new backing array of double the size and rehash everything into the new backing array. The size doubles rather than increasing by some constant amount so that the amortized time requirement for inserts will be constant.

- Amortized time complexity for a set of operations is the average time complexity per operation, measured over the whole sequence after the operations have happened (a guarantee about the sequence, not a probabilistic expectation). A minimal sketch of these hash-table mechanics is below.
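Concretely (illustrative code of mine, not anything from the interview - chaining, modulo indexing, and doubling on resize):

    class HashTable:
        def __init__(self, capacity=8):
            self.capacity = capacity
            self.size = 0
            self.buckets = [[] for _ in range(capacity)]

        def _index(self, key):
            # Map an arbitrary hash code into the backing array.
            return hash(key) % self.capacity

        def set(self, key, value):
            bucket = self.buckets[self._index(key)]
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)     # overwrite an existing key
                    return
            bucket.append((key, value))          # collision -> chain in the bucket
            self.size += 1
            if self.size > 0.75 * self.capacity:
                self._resize()                   # keeps inserts amortized O(1)

        def get(self, key):
            for k, v in self.buckets[self._index(key)]:
                if k == key:
                    return v
            raise KeyError(key)

        def _resize(self):
            # Double the capacity and rehash everything into the new array.
            old = self.buckets
            self.capacity *= 2
            self.buckets = [[] for _ in range(self.capacity)]
            for bucket in old:
                for k, v in bucket:
                    self.buckets[self._index(k)].append((k, v))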

He also asked me about red-black trees. I could say that red-black trees are self-balancing binary search trees with the property that the length of any root-to-leaf path is within a factor of two of any other, and that I wouldn't be able to write one off the top of my head.

New interviewer, believing that I would be interested in reapplying to triplebyte, did give me feedback on what concrete actions I should take in order to do so. Specifically, he said I should focus on studying red-black trees (OK, fair enough, I guess) and hash tables. I thought I had pretty good coverage, purely within the interview, of hash tables. Is that wrong?


> New interviewer, believing that I would be interested in reapplying to triplebyte, did give me feedback on what concrete actions I should take in order to do so. Specifically, he said I should focus on studying red-black trees (OK, fair enough, I guess) and hash tables. I thought I had pretty good coverage, purely within the interview, of hash tables. Is that wrong?

Your hash table knowledge is perfectly fine. So is your red-black tree knowledge. Roughly zero percent of programmers implement those during the course of their job, so knowing their characteristics is more than enough.


My question was more along the lines of "there's a lot about red-black trees that I could know, but don't. What does he expect me to know about hash tables that I hadn't already told him?"


It's helpful to remember that nobody knows how to hire good candidates (it's an unsolved problem) so everyone has their own favorite, ineffective techniques. This results in a lot of puzzling behavior, like being told you need more red-black tree knowledge.

There's probably no way to know what they were looking for, short of asking them directly. But you have at least two options: (a) roll your eyes at anyone who tells you that you need more knowledge about red-black trees or hash tables to be an excellent engineer, or (b) realize it's a game, and play it with a passion.

Both routes are perfectly valid, and personally I prefer route (a). But if you're deciding to do route (b), you could study as much as you can on the subjects, quiz yourself on various trivia related to red-black trees, look up related interview questions, etc.

This is a bit of a tangent, but from a motivation standpoint, I've found it's optimal to think of interviews as a lottery ticket with a 10% chance of winning regardless of your ability, rather than as an obstacle that can be failed due to lack of ability. There's no reason to be discouraged when someone rejects you in a world where people reject engineers for reasons that are essentially random.


I had a very similar experience to your first interview. I suspect we even chose the same project.


>"Also, there was a request to derive a formal proof while Ammon watched"

You were asked to derive a formal proof? Why exactly? How often is that asked at an actual tech interview?


The exercise was to write a regex parser. He asked me about worst-case behavior, which is an exponential number of states. He asked for a regex with worst-case behavior, which I could provide, mentioning that I knew this because the textbook I used for the project pointed it out. Behold, a worst-case regex template:

    (a|b)*a(a|b)^n
This requires at least 2^n states in the DFA.

He asked me to prove that 2^n states were required while he waited. I didn't know the proof and wasn't comfortable trying to produce it while being watched. There's no upper limit to how long producing a proof might take.
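For anyone curious, the blow-up itself is easy to see by brute force with the textbook subset construction on a hand-built NFA for that template (the state numbering is mine, not from the interview). This only demonstrates the growth empirically; the lower-bound proof he wanted is the separate distinguishing-suffix (Myhill-Nerode style) argument:

    # Count DFA states for (a|b)*a(a|b)^n via subset construction.
    # NFA: state 0 loops on a/b and also moves to 1 on 'a'; states 1..n
    # advance on either symbol; state n+1 accepts.
    def dfa_state_count(n):
        def step(states, ch):
            nxt = set()
            for s in states:
                if s == 0:
                    nxt.add(0)
                    if ch == 'a':
                        nxt.add(1)
                elif s <= n:
                    nxt.add(s + 1)
            return frozenset(nxt)

        start = frozenset({0})
        seen, frontier = {start}, [start]
        while frontier:
            cur = frontier.pop()
            for ch in 'ab':
                new = step(cur, ch)
                if new not in seen:
                    seen.add(new)
                    frontier.append(new)
        return len(seen)

    for n in range(1, 8):
        print(n, dfa_state_count(n))   # grows as 2^(n+1), i.e. at least 2^n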


Yeah, that's really awkward and an odd thing to ask a candidate to do while you wait. I will be sure to avoid these people.

Out of curiosity, what book is that DFA problem from?


I used the dragon book (2nd edition) - https://www.amazon.com/Compilers-Principles-Techniques-Tools...

That proof isn't a problem in the book; it's mentioned, with a hint for those inclined to derive it themselves, in the running text.


Thanks


Not sure what book the above poster used, but Sipser is an excellent book for learning about DFAs.


CURRENT CANDIDATE REVIEW

First Round: Online quiz: pretty easy, very intuitive questions. Different languages, but concepts are language-agnostic.

Second Round: Onsite (I live close by). One OOP coding question, Lightning round of very basic college CS concepts, and debugging session.

First Round Matches: The company tries to match you with 10 companies; I got 5. All the companies were either YC or top-VC funded. I received responses from 3 companies; the startups move very quickly -- onsite within a week. So far I've interviewed with 2 of the 3. Did not get an offer from either (qualifications and a mismatch in culture fit).

PROS:

- The team moves VERY fast when it comes to companies (really tough to think of PTO excuses).

- Responsive: each candidate is assigned to a single talent manager.

- After the initial matches did not fit my liking or yield any results, my talent manager immediately started looking for new matches.

CONS:

- Initial matches were not what I was looking for / interested in (hopefully being fixed).

So far, the experience has been positive. Personally, the lack of stress of finding places to submit to is a huge plus.


Yep, signing up more companies to work with is how we improve the likelihood that we'll find enough matches that are both a technical-skills fit and an interest fit for every candidate. That's what we're working towards.


[deleted]


The user who posted this regretted it and asked us to delete it, so we've done so, and also deleted their name from Harj's reply.


I'm sorry you had a negative experience, and we're working hard to make sure it doesn't happen to anyone again.

We started the company to help those people for whom manually submitting their resume to companies isn't an option. We don't use resumes as part of our screening process because we're looking to find skilled engineers without elite resume credentials. For them, submitting their resumes directly results in either silence or rejection.

We've been able to find many of these engineers jobs at companies they'd never imagined they'd be able to work at. We just had a self-taught engineer who was working as a pizza delivery person in Cincinnati get hired by Instacart - his onsite interview was the first time he'd met another engineer in person. We helped a recent high school graduate, who didn't attend college, get hired by Apple - his dream company.

I'm the founder, so I'm obviously incentivized to highlight these success stories, and I don't claim that we've built a perfect process for everyone yet. We're working on a hard problem: judging the skill of other human beings in a fair way and vouching for them to companies that have maintained the same hiring process for decades and are resistant to change. Our approach is not perfect, but it has had life-changing outcomes for many people, and we're doing our best to increase the % of all Triplebyte applicants for whom that's true.


Slightly off-topic question, Harj, but is it fair to say that only a limited percentage of companies are willing to pay a 25% placement fee for candidates through Triplebyte? Surely the time will come again when startups curtail their spending and thus leave limited options for candidates going through your process? Is it safe to say that candidates who solely use Triplebyte are limited to a fraction of the companies they could otherwise get in front of via other means?


We've been using TripleByte (in addition to career fairs, etc.) at Expo for a few months and find it really good. Every candidate from TripleByte we've interviewed has been knowledgeable about many facets of software engineering, and it's been great to work with the one we've so far brought on board.

One effect of using TripleByte is that we've been able to focus our interviews more on engineering design and seeing how someone thinks. Since we've found TripleByte's interviews for programming skills and knowledge to be quite good, we get to spend more of our time and the candidates' time on areas other than coding questions.


My experience working with Triplebyte has been great. I found them through hackernews from a post about visualizing the timeline for a job search [1].

Unlike most people in this thread, I thought their interview process was much more relevant to real engineering work than those of most companies I've interviewed with. I was not presented with any gotchas or arcane algorithm questions.

Finally, the big draw as an engineer is that it significantly cuts down on the amount of time you spend on the phone with companies before going on-site. I'm still holding a job (hence the anonymous account), so the time cost of looking for other opportunities was somewhat prohibitive for me.

[1] http://kellysutton.com/2016/10/20/visualizing-a-job-search-o...


The phone number validator on the signup page does not work correctly. It doesn't recognize or validate my valid Swiss phone number in any normal format, although it does correctly change the little flag to Switzerland.


Ah, good catch! Can you email me your phone number so we can test and fix it? guillaume@triplebyte.com

Thanks!


I wish they would expand the process to recruit for remote positions and outside of the USA.


They don't place remote candidates? I remember they have a strong bias toward recruiting for Bay Area companies, but I also remember being asked if I was looking for remote work. So maybe they do.


Last time I started the signup process (maybe 2 months ago), when I selected the "remote" option, there was a message informing me that they didn't recruit for remote positions. Same with location (USA only).

Nice UX to be informed about it so upfront, anyway.


Nope, they don't hire remote (or anybody without a US visa). I reached their phone interview but couldn't go further due to this constraint.


I'd rather have a non-broken interviewing process than involve yet another party to fix it. Too many companies are blindly following Google-style interviews; we don't need even more of that outside of the Bay Area!


Have you guys considered working with Udacity? They have a "plus" version that offers placement guarantees or a refund. There might be a synergy there.



