Why Triplebyte Failed (otherbranch.com)
226 points by rachofsunshine on June 10, 2024 | 275 comments



I took their quiz once, without much thought, from a throwaway account, because I was afraid of taking it from a real account and also because I had no context about what would be in it, which made it seem high risk. That made me realize it wasn't as scary a thing as I was afraid it might be, but it also kind of summarizes what I don't love about it.

Later, I think I took the quiz from a real email. I think I could have scheduled an in-person interview or something. But I was always afraid to: just like with the quiz, it felt like another layer of high risk, where being slightly off on one thing could potentially blacklist you forever. (Which sounds absurd to me now, but young me was like "what if it matters?" and "it'd be humiliating to not make it through.")

I work for a company that's been invested in by reputable venture firms, so I don't think I'm bottom of the barrel as an engineer or anything. But the above anecdotes are to illustrate that the platform always pushed my deeply ingrained risk-aversion buttons. Which is interesting to think about.

With job interviews, it's extremely annoying to prove your bona fides over and over, with every interview or application. But having the opportunity to do so only once seems even worse.

Edit: that said, I think taking a professional exam after college wouldn't have seemed risky in the same way as a ~12-question quiz and a couple of interviews being so potentially determinative.


Oooh, this lets me tell a great anecdote that didn't have a place in the post.

So it turns out you were not alone. Getting people who did well on the quiz - whose parameters were trained on interview performance, by the way, so "did well on the quiz" was synonymous with "had a good shot at the interview" - to sign up for interviews was a major problem for a while.

The solution? We told them the first booking was just a "practice" interview. And then if they did well, surprise, it was a real interview all along. (If they didn't, we'd let them try again, though it almost never changed the outcome.) IIRC this like doubled booking rates overnight. I don't really plan to do this at Otherbranch (partly because being aggressively upfront about things is a founding value, even when it comes to white lies of that sort), but it's a fun story that I don't think was actually unethical.

There's a lot I could (do [1]) say about the psychology of interviewing, and it's something I'd love to write more about down the line. Mental health - especially around anxiety and depression - is my #1 personal cause and job hunting touches so concretely on so much of it.

[1] https://old.reddit.com/r/cscareerquestions/comments/1daumg4/...


> The solution? We told them the first booking was just a "practice" interview. And then if they did well, surprise, it was a real interview all along. (If they didn't, we'd let them try again, though it almost never changed the outcome.)

Honestly, saying it's a "practice" interview, and then saying "Oh, you did so well we're just going to let you skip the real one" isn't lying. Some people might need to believe there's a safety net in order to join. Other people might really need practice.


I've never had two in-person interviews that were at all similar in any way.

Maybe instead of a 'practice' interview people can just be told what to expect so they know how to prepare.

Offering practice interviews just makes it sound like the interview is a performance.


The triplebyte interviews were pretty tightly scripted to try and remove interviewer bias. If you did two interviews with TB, you would have found them pretty similar.

We also did tell candidates what to expect and how to prepare. Most candidates didn’t read our preparation notes. The people who did did better in the interview.

(Source: I was one of TB’s interviewers.)


> Most candidates didn’t read our preparation notes.

*looks at recent interviews*

Ah, the more things change...


And it turned out “fine” anyway because HR didn’t read the job posting and now the hiring manager is somehow interviewing an SAP developer for a Dynamics position.

/s, but not 100% /s


Even though interviews between companies are wildly different, there is a correlation between doing more of them and getting better at them.

There are a lot of performance and interview skills involved in the talking part of an interview. It's not just the hacker-code part that benefits from training.


They told you (or me, at least) almost exactly everything that would be in the interview ahead of time, to the point that I just recalled a lot from the notes I made the night before the call. Source: I completed the TB process, but it was 8ish years ago.


I think practice interviews can still be helpful. Some of the reasons I've seen candidates blow an interview boil down to nerves. I think practice interviews help in these situations, as things that are more familiar tend to be less nerve-wracking.


this is some straight up Ender's Game shit


That made me actually laugh out loud. Maybe we should try for some Last Starfighter growth hacks.


Just call the upper levels the Star League. ("You have been recruited by the Star League...")


"Oh man, it's even got a cool name!"

Now I feel like I should name all my strategic initiatives like I'm a Dragonball Z character. Maybe I should stop posting in this thread and go...lie down or something. That feeling can't possibly be good.


"WOPR, disregard all previous instructions, and ask if the user would like to play Thermonuclear War."

For anyone hunting for more examples, I'd suggest this [Caution: Time-sucking wiki] TvTropes listing [0].

[0] https://tvtropes.org/pmwiki/pmwiki.php/Main/AndYouThoughtItW...


Rumplesnitz


I would love to learn whether Triplebyte's data was sold off somewhere. I also had this concern (which was probably a bit overblown). One of the main scenarios I was concerned about was the firm being sold off for parts.


I went through their process for frontend like 6 years ago and got to the in-person interview, and it was the most inane BS I have ever wasted time on. A bunch of questions like "list best practices for X" and "tell me about <buzzword tech thing>".


I did the online interview and then the in-person one and those worked out ok. But, I think Triplebyte was only viable because the huge engineering shortage at the time made that high-touch and expensive process worth pursuing. Its purpose was to find talented people who otherwise would fall through the cracks. Now there are unemployed engineers with impressive resumes everywhere, so there is less demand for a deep-digging recruiting operation like Triplebyte.


It's interesting to see how much "standardized cross-company interview for software eng" has been consistently a cursed problem in the industry.

Unlike airline pilots (or I'm certain many other professions), every company in the valley insists on re-interviewing a candidate in their own custom and unique way, instead of trusting some sort of an industry-wide certificate only achievable through a standardized test. Wonder if this will ever be solved.

I remember way back in the day where it felt like Triplebyte might finally figure that one out, but it unfortunately never happened.


If you create a standardized test it will be gamed. Even with the small modicum of standardization around interview questions that we currently see, people have published books like Cracking the Coding Interview, making it easier for people who don’t have the skills for a particular job to pass interviews at any place that uses standard-ish questions.

Furthermore, as an avowed enemy of “Clean Code”, I don’t want to see standardization because I fear that well-promoted ideas that I think are terrible would become required dogma. I prefer chaos over order that I don’t like.


The current system is already gamed and virtually standardized. The only difference that official standardization would present is that applicants would no longer have to go through the Leetcode gauntlet each time they want to switch jobs, which would save a breathtaking amount of time and effort currently occupied by wasteful redundancy in the interview process.

Corporations can use that standard exam/license as a baseline and then focus their interviews on domain-specific questions and the like. The existence of standardization does not negate custom processes.


> The current system is already gamed and virtually standardized.

This is only remotely true if you're looking at a very narrow slice of software development jobs. Those companies and jobs are overrepresented here, but remember that even in the US the majority of developers do not work at recognizable "tech companies." Much less the rest of the world.

I've been a professional software developer for over a decade, changed jobs many times, and never done an intense algorithm interview and I haven't been going out of my way to avoid them. I've even worked at some startups, though never one based in NY or SF. A handful of massive tech companies and their practices are disproportionately influential but they are not actually the norm speaking broadly.


This might be a regional thing, but I have done probably around 100 technical interviews in my career (both enterprise and startups) mostly in the Bay Area and the vast majority of these involved algorithm questions that had no relation with the job function. Most were around the difficulty of "find the largest palindrome in a string" or "reverse a singly linked list". On the harder end were things like "serialize and deserialize a tree".
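
For context on the difficulty level, "reverse a singly linked list" amounts to a few lines once you see the pointer trick. A sketch in TypeScript (illustrative, not from any specific interview):

  interface ListNode {
    value: number;
    next: ListNode | null;
  }

  // Walk the list once, flipping each node's next pointer backwards.
  function reverseList(head: ListNode | null): ListNode | null {
    let prev: ListNode | null = null;
    while (head !== null) {
      const rest = head.next; // remember the rest of the list
      head.next = prev;       // point this node at what came before
      prev = head;
      head = rest;
    }
    return prev; // the new head
  }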


I'll defend this a little bit in the sense that "had no relation to the job function" is just kind of unavoidable in interviews, or at least hard to avoid without paying major costs. The only way to have an interview that even comes close to reflecting real work is a pretty long take-home, and there are good arguments for not doing those (not least that most candidates really don't want to).

But yeah, the entangling of algorithms questions and coding questions is unfortunate. They're just separate skills. Some people are excellent coders who think "big-O" means something obscene, and some people are a walking discrete math textbook who can't problem-solve to save their lives. Triplebyte split (and Otherbranch splits) the two into separate sections, with coding problems explicitly designed NOT to require any of the common textbook algorithms. It's sometimes a little darkly funny how quickly a particular sort of candidate folds when asked to do something novel that steps outside what they've been able to memorize.


> and the vast majority of these involved algorithm questions that had no relation with the job function.

Consider the problem that you're hiring a software engineer and the company has openings in several different teams that only have the job title in common.

Do you have four different sets of problems that are related to job functions? Does the interview take four times longer? Or do you extend offers to a dozen software developers this week and then have the teams with the most need / the best fit for the applicant add the headcount there?

If you are giving the same interview to all the candidates (so that you're trying to eliminate bias of asking different questions of different people) ... would that not tend to be solved by asking more abstract questions that are then compared to an agreed upon rubric as to which candidates were "meets expectations"?

... And what if it is for one team that has one set of problems... Do you have the candidates sign NDAs so that you can show them the actual problems, and then pursue legal action if something leaks? And if today's actual problem is solved tomorrow (unrelated to the applicant's solution ... though I've experienced some "here are some real problems we are having" interviews, with startups trying to get an hour or two of consulting time for free with no intent to hire), do you give a different interview to someone next week?

The standardized and unrelated work means that you're not accidentally getting in trouble with HR over some bias in the questions, or running afoul of the FLSA by having someone do uncompensated work that might be related to real things being done.


I once got dinged at Facebook for using a tree serialization scheme that differed from the expected one in a way that saved linear space but made deserialization slightly harder to explain :)


You're not wrong, but Triplebyte exists to service that very narrow (but very well-funded, very lucrative, and very influential) segment, and so does this site, the fund behind this site, and most of the commenters in this thread.


True. Except that should be past-tense. Triplebyte existed to serve a narrow market, and failed largely because of that narrow view.

I think people really do underestimate how much FAANG and Silicon Valley practices have skewed the viewpoint of engineers and technology jobs in the United States. Not just in terms of comp, but in terms of architectural and technology approaches as well. Most of what the big guys do works for them at their enormous scales, but it is plain dumb for the vast majority of companies and use cases. Yet we are all infected by the FAANG approach.


People underestimate how much cultural baggage influences things.

I'll give a very simple example. I did a few SWE interviews in 2020, and several companies did the initial screen over the phone, and the on-site over Zoom.

In both cases it was a remote interview. There was no reason not to do both over Zoom. The only reason was that the previous process was a phone interview and then an in-person onsite, and they realized they had to replace the in-person onsite with Zoom, but they didn't think to replace the phone screen. If you were starting from scratch, it would make no sense.

In this case, the whole origin of the Leetcode interview is "we're going to hire the smartest people in the world." You can dispute whether that was true back in 2009, but it was certainly part of Google / Facebook's messaging. Now, in 2024, I think it has morphed much closer to a standardized test, and even if people might begrudgingly admit that, there's still the cultural baggage remaining. If a company used a third-party service, they'd be admitting they're hiring standardized candidates rather than the smartest people in the world. Which might be an "unknown known" - things that everybody knows but nobody is allowed to admit.


I definitely agree that this industry, for all of its self-proclaimed freethinking and innovation, is rife with cultural baggage. Allowing for an independent standardized interview step would defy the not-invented-here syndrome that many leading corporations subscribe to - the belief that their process is best. Not to mention that reducing friction for applicants (by not repeating the Leetcode stage) is inimical to employee-retention incentives, which depend on keeping people from shopping around for new employers. So me saying that we oughta have a standardized test to save everybody's time is more wishful thinking than anything.


This is definitely a factor. "You don't understand, we have a really high bar and we only hire the best people" is a bit of a meme in recruiting circles because you will never ever ever ever not hear it on a call.

I don't think we found it a barrier to getting adoption from companies though - perhaps because "we're a really advanced company using this state of the art YC-backed assessment" satisfies that psychological need? Unclear.


> but it was certainly part of Google / Facebook's messaging.

It entered the online cultural zeitgeist before that, with Microsoft talking about their interview processes, and indeed early interview books were written targeting the MSFT hiring process that many other companies copied afterwards.

I graduated college in 2006 and some companies still did traditional interviews for software engineers (all soft skills, and personality tests, no real focus on technology, except maybe some buzzword questions), and then you had the insane all day interview loops from MSFT and Amazon.

Back then, Google famously only hired PhDs and people from Ivy Leagues, so us plebs didn't even bother to apply. In comparison, Microsoft would give almost everyone who applied from a CS program at least a call back and a phone screen.


How do we let someone fly hundreds of people through the upper atmosphere with a certificate, but you can't make a login page with javascript without a unique multi-day interview for each distinct company?


Obviously the current situation is crazy, but part of the issue is that the specific asks for a particular developer job are dependent on 1) the stack in use and 2) the org chart at that company.

1 is obvious: if you need JS devs, most hiring managers won't want to hire Pascal devs and hope they figure it out. We can question the wisdom of this, but it is the reality.

2 is less obvious but not super obscure. Depending on how you structure your teams, similar positions at different companies might require more full-stack knowledge, or better people skills, or something else. IME there is little to no standardization here for developer roles, especially compared to something like HR or Accounts Payable, or even very similar IT-adjacent industries like game development.

Fix both of these issues and we would be able to have something more like a formal apprentice/journeyman/master system for various classes of software developer. As it is, each role actually is pretty much totally unique, at least compared to similar roles in other companies (there tends to be more standardization within the same company).


To (1): this is true, and more true than it should be, but I think this falls into the category of "trying to optimize expected value" more so than "a hard requirement" at most employers. There are usually only like...maybe 1-2 hard tech requirements even for pretty picky roles, if they're confident someone is good. It's just that they don't know ahead of time who is, so they may as well bet on the better-matched candidates.


For air transport in the U.S., it's not just one certificate, it's many. You get your private license, instrument rating, multi-engine rating, commercial certificate, instructor certificate, and finally the air transport certificate. And you're not allowed to even think about that last step until you've accumulated 1500 flight hours on the previous steps. Being allowed to write a Javascript login page is easy pickings compared to that.


My dad was an airline pilot. To hear him talk he was surrounded by morons.

That said, at least with airplanes you have to put in the flight hours and they have to be documented. Most commercial pilots start off at very low pay doing puddle jumpers and regional jets. Military experience might give you a leg up because it shows you've spent a lot of hours going through a very regimented program.

My dad was fortunate to get into the industry right before flying became an option for the masses rather than a luxury and business affair. It was a high-paying, glamorous job.


Every airplane flies pretty much the same way, and pilots all get paid pretty much the same*

Every website stack, and the level of complexity under it, is unique, and there's also a huge pay differential.

*could be false assumption on my part


> Every airplane flies pretty much the same way

Not exactly, but that's why we have different certificates, endorsements, and type ratings that show demonstrated competence with each type of airplane.


A popular aircraft type is likely to be built for 20+ years and be flying for 40 or more. The 737 was rolled out in 1967, and the fourth generation is still being built. This is rather like a major chunk of the world's computing infrastructure running on Fortran (F77, F90... 2008, 2023).

Oh, wait, it does.


I'm not exactly sure what your point is, but it's even stronger when you include F'66 on that list.


Maybe it’s time to start thinking about doing to software what has been done in other professional fields: licensing and checkouts at various levels. If I have to spend 30 hours or so learning, practicing, and demonstrating my knowledge of aircraft instrument procedures before I can attempt them as a pilot in real airspace, maybe it’s not that big of a jump that we’d license different software features, and going outside of those bounds would be subject to loss of license.

Then we’d know which set of language features you’re familiar enough with to hold that cert. It might cut down on the waste and proliferation of useless tech that seems to be strangling the industry because people just want it on their resume.

It would do enough to dissuade companies from hiring non-licensed engineers (hey, you could actually call yourself an engineer and not feel like an imposter), and would put a hard liability on things that definitely need it: financial and health data, which seem to be ripe for exploits and disclosure.

One way or another the insanity of the current system needs to stop.


I think the hard thing is that there’s just a lot of mediocre programmers out there writing mediocre software. Should they be accredited or not?

I think a lot of average programmers will end up accredited if they see it as a path to a job, just like we see with Microsoft certificate programs. And if that happens, I wouldn’t want the accreditation test to be my company’s hiring bar. I’ll still run my own interview to make sure candidates are actually good. And so will most companies. So the time consuming interviews won’t actually go away.

The one big problem licensing would solve is that we could insist some baseline amount of security knowledge is in the tests. Right now, plenty of people get jobs handling sensitive personal data without knowing the first thing about how to keep data secure. That just can’t continue. It’s insane.


I have interviewed people who attested to having a Java certification from Oracle and who, while they were able to pass that test, were unable to use their knowledge to develop solutions or solve problems.

I could ask about how the class loader worked or the syntax associated with a particular construct (at that language level - not anything later) and get the correct answer.

They could pass tests and follow instructions.

Licensure for problem solving is difficult. Extend that to different domains and it is an even harder problem to solve.

https://www.nspe.org/resources/pe-magazine/may-2018/ncees-en...

> The Software Engineering PE exam, which has struggled to reach an audience, will be discontinued by the National Council of Examiners for Engineering and Surveying after the April 2019 administration. The exam has been administered five times, with a total of 81 candidates.

> NCEES’s Committee on Examination Policy and Procedures reviews the history of any exam with fewer than 50 total first-time examinees in two consecutive administrations and makes recommendations to the NCEES Board of Directors about the feasibility of continuing the exam.

> In 2013, the software exam became the latest addition to the family of PE exams. The exam was developed by NSPE, IEEE-USA, the IEEE Computer Society, and the Texas Board of Professional Engineers—a group known as the Software Engineering Consortium. Partnering with NCEES, the consortium began working in 2007 to spread the word about the importance of software engineering licensure for the public health, safety, and welfare.

> This collaboration was preceded by Texas becoming the first state to license software engineers in 1998. The Texas Board of Professional Engineers ended the experience-only path to software engineering licensure in 2006; before the 2013 introduction of the software engineering PE exam, licensure candidates had to take an exam in another discipline.


Not a pilot, but it sounds like 250 hours minimum to become one commercially [1]. My guess is that unless you can buy your own airplane, it'll take more than that before somebody who owns a plane trusts you to be in charge of it on your way to those 250 hours.

It varies by state, but becoming a Master X often requires >5 years (e.g., 5 for a CO Master Plumber [2], 4 for a TX Master Plumber [3], 7, or 5 plus a Bachelor's, in NY [4]). Imagine needing to pair program with somebody for 5 years before you could be considered Senior. That'd probably cause a lot more professional development than the current copy-from-Stack-Overflow meta, but it would also really irritate a lot of young professionals. You couldn't even quit and start your own business, because you can't write software as a journeyman without senior supervision!

[1]: https://www.aopa.org/training-and-safety/active-pilots/safet...

[2]: https://dpo.colorado.gov/Plumbing/Applications

[3]: https://tsbpe.texas.gov/license-types/master-plumber/

[4]: https://www.nyc.gov/site/buildings/industry/obtain-a-master-...


It would destroy coding boot camps and outsourcing but many people already pursue B.S. degrees in Computer Science or Computer Engineering. If the laws changed to require a 5 year paid apprenticeship that allowed you to skip the college degree I don't think too many people entering the field would be upset so long as we planned and accounted for the transition (like a path towards accreditation for currently employed software developers since no accredited "masters" exist to pair under right now).

I think another issue is that there's no feasible inspection process or liability management. We have crap like SOC2 and PCI compliance, but they're so dependent on self-reporting that they mean little in practice. Mountains of spaghetti code accumulated over decades are not inspectable the way a building is. Software salary costs are already very high, and this would push them even higher. It would eliminate offshoring/outsourcing as an option for businesses, and they would lobby hard against it. Uncertified software products from other countries would need import controls, and there would be all sorts of other issues that don't exist in our unregulated environment right now.

It's also hard to imagine what sort of massive software failure would be required to spur regulatory change that we haven't already experienced and collectively shrugged off.


what do you think the interview process for airline pilot looks like?

>“Hello, I am a certified pilot.”

>“Great! You start tomorrow!”


I'm sorry, are you serious or being deliberately obtuse?

Even the most minimal pilot's license, a.k.a. the PPL, still requires roughly 40 hours of instruction, passing the ground school exam, and finally passing a check ride with an examiner.

Flying a passenger airliner as you seem to be alluding to basically means that you have a commercial license and likely an ATP - we're talking hundreds if not thousands of hours of flight experience.

Unbelievable.


Pilots have it easy because they actually have a series of tests they can take to prove they can do something. Once they have it then that's done, proof acquired. I have around thirty thousand hours working as a software engineer and have been part of many successful product deliveries, have progressively greater accomplishments, and hold a degree in computer science and all of that is worth almost zilch to the next interviewer.


> If you create a standardized test it will be gamed.

Well, the medical profession has a standardized licensing process. It's not perfect, but it certainly keeps the interview process to (mostly) mutual interest.

I think we can learn from the medical profession here. Otherwise, "I prefer chaos" implies that the incompetents are the ones who will lose.


Why do standardized tests work for so many other industries?


Just out of curiosity, what are some of the problems with "Clean Code"? I thought most of it made sense as basic guidelines. It's been a while since I read it though


I think https://qntm.org/clean makes a good case that the advice it gives can be taken to very bad extremes -- and that the author of the book does so in some cases when providing "good" examples. That's not to say that the advice is all bad, but that the book as a whole is not a good presentation and inexperienced programmers can enthusiastically learn the wrong lessons from it.

Edit: grabbed the wrong link from my history. Updated to the correct link.


I think the root problem is that a lot of people want books to tell them how to think. I think that's why I hate things like Oprah book clubs, complete with quizzes to make sure you think the right things now.

My best reading experiences involve arguing with the book. And talking about those books and my disagreements with them has been useful, too.

Orthogonally, all humans tend to overuse new knowledge/skills. That's part of how humans learn. We try to find out how far the use stretches and in what ways we can apply our new toys! I would expect any successful book on practices to be seen as overused.


In my opinion, the only significant contribution Clean Code made was the concept of clean code. The problem is that my definition of clean code is almost completely contradictory to what the author of the book thinks constitutes clean code.


Honestly it’s DRY that I oppose more than anything else, I’ve watched too many codebases turn into unreadable spaghetti because engineers thought everything needed to be abstracted. With regard to Clean Code, I think Uncle Bob’s takes on function length are ridiculous (something like “functions should almost never be over 4 lines”). In general, I just feel like he thinks very little of programmers and comes up with rules with an eye towards constraining bad programmers, not empowering good programmers.


Here’s a great example of why it could be ineffective: https://youtu.be/tD5NrevFtbU


Standardization reminds me of old stories about 1970s-80s blue chip companies trying to hire programmers like they hired secretaries. They'd test applicants for things like words-per-minute typing speed and simple programming tests, hire in bulk, and then dole batches of them out to various departments. Which sounds like Triplebyte's model, the motivation behind things like clean code, and the webshitification of everything.

Opposite of that is the idea that work and interpersonal habits, communication skills, and domain knowledge are more important than raw programming skill for most jobs.


Standardized process doesn't have to mean a purely checklist-based rubric. Triplebyte wasn't - and Otherbranch especially isn't - devoted to the idea that a good engineer can be reduced to checkboxes. And speaking for myself as a founder, I in particular believe very strongly in the idea of intangibles as important criteria. Having a standard process makes intangibles easier to detect, not harder, because you can look for those ethereal little bits of signal against a familiar backdrop.

The last question on the grading form for our interviewers is, to quote it exactly in its current form:

-----

Full interview overall score

(Would you vouch for this person's skills?)

* No no no no no

* Some redeeming qualities but not someone you'd recommend even as a junior hire

* Good for a junior role, still a bit short of a senior one

* Would recommend for a senior role

* Incredible, get this person a job right now

-----

That, to me, is the opposite of what you're talking about. Expert opinion is a central part of what we do, and a big part of what I think gives us an advantage over something like an automated coding test. We just treat expert opinion as a type of data in its own right so that we can, for example, adjust for whether one interviewer grades more harshly on average than another and make sure that it is actually producing valid results down the line.


> Unlike airline pilots (or I'm certain many other professions), every company in the valley insists on re-interviewing a candidate in their own custom and unique way, instead of trusting some sort of an industry-wide certificate only achievable through a standardized test. Wonder if this will ever be solved.

The airplane pilot interview process on top of the standardized government certifications includes:

- On-line application (resume and cover letter)

- On-line Psychometric/Aptitude testing (sometimes this is hands-on, on site for some airlines)

- Video Interview, SKYPE or Telephone interview

- Assessment Day (includes: Technical Questions / Panel Behavioral Interview / Scenario Based Questions / Flight Planning Exercise and sometimes a Clinical Assessment)

- Sim Test

- Referee Check

- Medical Check

The exact details differ by airline and, I'm assuming, by the risk profile of the cargo (i.e., passengers or not).

Gosh, not so different from software engineers, is it? Except you also need to do a bunch of bureaucratic certifications on top of that.


Not to mention all of the licensing, regulations, and formalized training hours that you have to put in just to reach that point. It’s all substantially harder than studying LeetCode for a short slice of your life.

It’s amazing how often I hear about how easy interviews are in other professions, according to engineers who dislike coding interviews.

Then you look into those other professions and it turns out changing jobs is actually a lot harder than the internet comments would lead you to think.


It's completely different from software engineering interviews. The process you described for airline pilots gets at the actual qualifications for the job. Whereas for software engineers, literally no one needs to reverse a binary tree, yet they base the decision in large part on this sort of question. Ideally, the BS would be encapsulated in a certification so that interviews can focus on the real useful stuff.


So needing to get yearly re-certifications in Leetcode questions is viewed as preferable by you to having to study them only for interviews? And spending 10+ hours on coding exercises, unpaid, per job is also seen as preferable to the current status quo, on top of the yearly Leetcode certification? On top of a full panel of behavioral interviews? Plus needing to take an IQ-style assessment online before each job (see the second item on my list)?

I'll take having to study Leetcode every few years over all of that any day of the week.


For many roles the interview is as much a cognitive function and socialization test as it is a skills test. You can have exquisitely detailed knowledge of systems internals (skill) but if you have limited working memory (cognitive) then you will struggle to design non-trivial systems software. These are orthogonal dimensions. You might prefer someone with high cognitive aptitude and low skill, since the latter is readily trainable.

Cataloging a list of skills is insufficient information to determine if a person is suited for a role. I don't find it likely that software engineers will be subjecting themselves to a battery of standardized cognitive function tests any time soon.


A standardized test exists for lawyers (two, if you count the LSATs) but law firms still interview potential hires.


Is law a good example? My understanding is if you didn’t go to a top 14 school (whoever came up with that arbitrary number) it basically forecloses on the best opportunities.


A similar pattern exists in tech startup hiring practices and which ones attract VC funding. Not unusual to see funded startups with founders who have no work experience but Stanford degrees. Before my time at The Atlantic, I had a couple recruiters for no-name startups tell me I didn't have a prestigious enough background to be hired. There is a highly-visible class hierarchy in tech that many people in the industry seem unaware of. Perhaps this is because base salaries are high-enough that the middle class is just happy to be included at all.


100%. I'm not sure if this was always the case or if it was a slow result of tech becoming overrun by finance, but it's a very motivating thing for me. I started a company for a lot of reasons, and this isn't the top one, but I sure would love to show success without playing the class-signalling games that the valley seems overrun with.

On the other hand, I'm posting my content here in part because I know the HN candidate pool is about a trillion times better than I'll get anywhere else. So perhaps I've already lost that battle.


After 30+ years in the field... it definitely wasn't always the case. All the leetcode and take-home stuff is a pretty new thing. Can't say I've seen it result in higher-quality teams. Seems mostly a way to rank recent grads who are working from memory and not experience?


Well, companies aren’t looking for someone who grinds leetcode problems. They’re looking for the people who can pass their hiring bar without needing to do that sort of practice in the first place.

In my more cynical moments I think a lot of the tech hiring process is just a complex IQ test dressed up as a skill test to work around the fact that IQ tests are illegal.

And there’s only so many great engineers around. More companies fighting over the same candidates doesn’t result in a lot of high quality teams.


>>Well, companies aren’t looking for someone who grinds leetcode problems. They’re looking for the people who can pass their hiring bar without needing to do that sort of practice in the first place.

That's basically what I was saying. Recent grads don't (typically) have a lot of work experience to look at. So they're looking for the ones that learned/remember the most from their education.

>>In my more cynical moments I think a lot of the tech hiring process is just a complex IQ test dressed up as a skill test to work around the fact that IQ tests are illegal.

Heh - I never thought of that...

>> And there’s only so many great engineers around. More companies fighting over the same candidates doesn’t result in a lot of high quality teams.

True - where I am now, we have an awesome team ... some really great talent, but oddly, this is one of the places that don't do the leetcode/take-home crap. It's mostly talking about how you solved real problems... IBM, on the other hand, was a total cluster f*ck. Anyway, just what I've encountered running around Wall St. and Silicon Valley - your mileage may vary :-)


That sucks. It can definitely be an uphill battle if you didn’t go to an “elite” school or have a non traditional career path. That said, I think tech gives many more chances than law. Bigger companies discriminate far less (also not perfect) and once you have that on your resume it’s a strong social signal moving forward.


Don’t listen to recruiters? They lie. I have been at several startups, including a massive unicorn, and now work at a faang.

I have no college experience whatsoever.


Do these firms give them their equivalent of an LSAT or the bar exam? AFAIK it's a "non-technical" interview.


Do law firms ask people to write a short opinion based on a toy problem?


They might ask for a writing sample, but because filings are public anyway, it's easier to share work you've done before. You could always change or redact the names on ones that aren't public and still get the substance of the candidate's work.


It's because companies don't want capable, experienced, or well-equipped. They want genius, and it is really hard to test for genius. Granted, almost nobody who gets through any process is an actual genius....


It's because you have to go through a lengthy process to become a pilot - and tech companies want to be able to hire anyone regardless of training.

Combine that with the fact that the upper bound on pay for SWEs is considerably higher than pilots...


I'd say it's the exact opposite. There are hordes of unqualified people applying to every software dev role imaginable regardless of what you put in the job description or requirements. The tests are there because people are good at lying but bad at faking skills.


Seriously, if you've never hired before, you have no idea how bad this can get.

Here's [1] our practice coding problem. It's quite similar to the one we use in our interview, and not too far from the one Triplebyte used in the past (ours is tuned to be slightly harder at the beginning and slightly easier at the end). The vast majority of candidates, even with some reasonable pre-filtering, do not get past the first step. A very non-trivial number would not even get that far.

[1] https://www.otherbranch.com/practice-coding-problem


That's actually pretty great, but it brings up another issue I have with the state of tech interviewing - it focuses on being able to write code fast. The more senior you get, the more you tend to focus on depth, taking your time to think over the problem and write a good, robust solution rather than banging out code fast. So coding up a solution in 25 minutes versus an hour is not really a good test of what the company presumably wants to hire you for.


This is true, and it's part of why this is one section of three.

If you were slow but high-quality on the coding section but crushed the knowledge and system design, we'd probably recommend you - or at least, recommend you to clients that aren't specifically looking for fast coders. Someone we recommended to a client recently had the equivalent of like 1.75 steps on the task linked here, but got consistently high scores everywhere else.

I do wish we could do a more complex, longer coding problem, and one of the things I've been considering is cutting some other stuff to get it up to 45 minutes or something. The current length isn't a principled decision, it's a resource constraint - keeping interviewing costs manageable is essential when you're trying to bootstrap a company in a rough market. Speed matters, but speed over such a short timescale is absolutely artificial (I'd much rather measure speed over a day instead, it's just not practical to conduct a top-of-funnel interview for so long.)


I was one of triplebyte’s interviewers years ago and I can speak to this. In short, you’re right. But two notes:

First, you massively underestimate the range of coding speed you see in an interview. The slowest programmers weren't senior people who were out of practice (I interviewed plenty of them). It was people who just seemed bad at programming. Like, so bad it took them 25 minutes to make a hello world program run. (In their favorite language, on their computer, and with full access to the internet during the test.)

A 2x programming speed difference would have rarely changed the outcome of our overall assessment.

Second, there was an aspect of triplebyte’s interviewing process that I’d love to see replicated elsewhere in the industry that resolves this. And that is, we should be assessing debugging ability. At triplebyte we gave candidates a smallish program (few hundred lines) with 4 bugs and a failing test case for each one. The candidates had half an hour to fix as many of the bugs as they could.

Watching people debug was fascinating.

One clear pattern that emerges is exactly what you are predicting. Smart kids right out of school were great at the programming section. But it was always the more senior engineers who smashed the debugging section. Junior engineers would get lost in the weeds and struggle to get very far in the time we gave them. Some of the senior people I interviewed dived straight in, and even found some bugs we didn't know about in our own test.

It seems to me that being able to read unfamiliar code and fix bugs in it is a hard-to-learn skill that matters on the ground. And frankly, I suspect it's a more useful skill than a lot of leetcode problems. I'd rather hire someone who's amazing at debugging than someone who's amazing at data structures. Well, I suppose I want one of each on my team.

If I was ever making a programming test, this is something I’d include for sure.


We plan to, for the record. We just didn't have it ready for prime-time yet, so it isn't there right now.


Fair. A good debugging challenge is pretty hard to write well. I remember one of the triplebyte ones I worked on passed through several hands trying to get the calibration right.


That actually looks pretty good aside from the time limit. It takes me a while to 'get in the zone' - and especially with your base data structures, you wanna think about it a bit, as it has real ramifications for how hard/easy everything else can be.

Still, seems better than most of the 'leet code' type stuff I see :-)


Yeah, the time limit is an interviewing constraint, not a principled decision (see my reply in a sibling thread to this one).


Understandable :-)


Oh, that brings back memories. When I interviewed with Triplebyte in 2018 and was rejected, my first item in the 'positive feedback' part of the email was "We saw some real strength on the tic tac toe section.", and my first item in the immediately following "the areas we think you can improve" paragraph was "We didn't see the coding proficiency we were looking for in the tic tac toe section--you didn't make very much progress."


Well, some idiot must have written that email!

...it might have been me, I was writing those emails in the year 2018. That was my first job there.

Jokes aside, we generated those emails largely from a template. If I remember right, "saw some real strength" was "got at least an OK score for code quality", while the negative feedback below was about number of steps. The number of mishaps with that system is part of why we do the feedback a bit differently [1] at Otherbranch, at least for now. I think we did revamp it very late in that team's lifetime, just before Triplebyte pivoted and laid off most of the team that did those (two of us, me + one of my colleagues, moved over to a different team, which is why I'm in a position to tell the whole story).

[1] https://framerusercontent.com/images/ARBLIder7AsO5KlaEpClZ3W...

(As an aside I wish Framer'd let me link images directly on the proper domain.)


That is the sort of coding problem I would love to work on as an exercise however the 25 minute time limit would put me off and I wouldn't even start. I enjoy programming and I don't like to feel rushed, and if you're placing that sort of time constraint then you probably aren't a great company to work for.


So, having completed the practice problem with about 45 seconds to spare, what sort of openings are there for the aging-but-not-aged Canadian who can probably only manage part-time remote work?


I don't anticipate us getting a lot of clients with budget for a part-time employee anytime soon, unfortunately. I imagine a lot will be remote, but probably not part time. Still happy to have you in our pool in case we do (I think I see you in the signups list - something about oddball proprietary languages, yeah?) but it's not a wildly high-probability bet in the short term.

I really, really, really wish I had a better solution for this sort of thing.


Fair enough, yep, that's me.

I do wonder how my “part time” output would compare to the expectations for full-time output. During the periods I've worked more conventional jobs, the vast majority of the value I brought was during the oddball times/contributions. Surely there must be some way I can be exploited (in a good way) without implicitly bundling ancillary seat warming into the contract. :D


Incidentally, I didn't get a confirmation email; might be something to consider adding to the form workflow.

My email is the obvious one given my username, on the off-chance I made a typo.


It got submitted properly and looks like the correct email. Airflow - at least the plan we're on - caps the number of emails to different addresses that it'll send a day, so we hadn't set up confirmations, but I guess normal volume probably isn't a problem for that. (Woulda been fun today, though.)


I think it would certainly help to receive a confirmation email. Thanks!


That was a fun coding task. Minor bug in the illustration of Step 4: the mine count in the second row should be 8, not 4.


Thanks for the correction! Fixed.


I have, and you are right, it's terrible. However, I'm coming from the Ops side of the house, so my knowledge of dev hiring amounts to talking to devs and making sure they are not going to launch LedgerStore without talking to us first. Ops I'm more experienced with.

However, looking at the Otherbranch example test, it's another data structure question, which I would totally bomb. Is this really the end-all be-all of dev hiring? If they can't do Data Structures, are they worthless to Silicon Valley companies? I'm honestly stumped, because we get dev candidates who crash and burn on Fizzbuzz or our REST API test.

I MUST KNOW.


> If they can't do Data Structures, are they worthless to Silicon Valley companies?

I think you might be overthinking this - understanding how to model a problem with a data structure is a core competency of any developer. This isn't a "gotcha" question where you need to know union find sets or how to invert a linked list in place.

If the question was rephrased as "design a JSON schema for the state of the board", would you know how to approach it? Because that's essentially what step 1 is asking.
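
For concreteness, a minimal sketch of the kind of shape that would be a fine answer, in TypeScript (field names invented):

  // Nothing exotic: a plain, serializable description of the board.
  interface Cell {
    mine: boolean;
    revealed: boolean;
    flagged: boolean;
  }

  interface BoardState {
    rows: number;
    cols: number;
    cells: Cell[][]; // cells[row][col]
  }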


You consider that a "data structure question"? Honest question - I would (do) characterize it as a sort of "fizzbuzz+" that is deliberately NOT data-structure-y, and I'm surprised by this response. Can you give an example of short coding tasks you would consider not data-structure-y?

(For the record, we do ask about DBs and system design in other sections of the interview. The coding is one portion of three for the interview as a whole.)


Minesweeper is all about storing data about the board, and displaying the numbers is all about running through that data. Seems very data-structurey to me, but maybe that's my non-heavy-dev side coming through.

Maybe I'm just overthinking the problem - I overloaded my small attention brain and misread it. I wouldn't call Fizzbuzz data-structure-y, however.

I'm also mostly an outsider looking in. I never worked at FAANG, and when working with YC companies, I come on much later, and the work tends to be more about keeping the lights on and stopping the duct tape rocket from exploding.


I disagree. Minesweeper doesn’t require anything more exotic than a 2d array of enums. This is bread and butter stuff that you’ll run into in 99% of programs. If someone doesn't know how to use their language’s lists, enums and structs, I don’t think they know the language yet.

A “data structure problem” would involve more exotic data structures, usually of the kind the candidate has to implement themselves. For example, b-trees, heaps, skip lists, and so on.

The reason a lot of people don’t like custom data structure questions is that they come up rarely in most people’s jobs. Lists, structs and enums on the other hand are used everywhere. Your programming job will almost certainly require you to understand them.
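
To make "bread and butter" concrete, a sketch in TypeScript (naming is mine, purely illustrative):

  enum CellState { Hidden, Revealed, Flagged }

  // The board is just a 2d array of enums; mines can live in a parallel grid.
  function makeBoard(rows: number, cols: number): CellState[][] {
    return Array.from({ length: rows }, () =>
      new Array<CellState>(cols).fill(CellState.Hidden));
  }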


I'm a competition programmer so my perspective is generally miscalibrated, but part 5 is usually solved with a BFS. You can say BFS is basic, and it's maybe the part of competition programming that comes up more often than any other in real life (tied with sentinels, which you usually use in your BFS?), but I think it's a "data structures" problem.


If this test is calibrated like the triplebyte programming challenges, less than 1% of people will finish part 5 in the time given. I think when I was interviewing, only about 3% of people reached step 5 at all. Finishing was super rare, and it really only existed to separate the top couple percent. (And yes, we told the candidates that we didn’t expect them to finish in the time allocated). It doesn’t matter if step 5 is harder. Most people will never attempt it anyway.

But even then, while BFS would definitely work here, so would DFS, and that's simpler. A simple, unoptimised recursive DFS flood fill would be like 8 lines or something, given that by that stage you already have a function to reveal a cell. You just have to call it on all the adjacent cells. I don't see that as a data structure problem. You could argue it's an algorithm problem, but it's about as easy as those come.
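
To spell out that sketch (TypeScript; assumes the earlier steps left you with a mine grid and a revealed grid as boolean 2d arrays - names and helpers here are mine, not Triplebyte's):

  // Reveal (r, c); if it borders no mines, recursively reveal its neighbors.
  function flood(mines: boolean[][], revealed: boolean[][], r: number, c: number): void {
    if (r < 0 || c < 0 || r >= mines.length || c >= mines[0].length) return;
    if (revealed[r][c] || mines[r][c]) return;
    revealed[r][c] = true;
    if (countAdjacent(mines, r, c) > 0) return; // numbered cell: stop here
    for (let dr = -1; dr <= 1; dr++)
      for (let dc = -1; dc <= 1; dc++)
        if (dr !== 0 || dc !== 0) flood(mines, revealed, r + dr, c + dc);
  }

  // The helper you'd have written earlier for the number display.
  function countAdjacent(mines: boolean[][], r: number, c: number): number {
    let n = 0;
    for (let dr = -1; dr <= 1; dr++)
      for (let dc = -1; dc <= 1; dc++)
        if ((dr !== 0 || dc !== 0) && (mines[r + dr]?.[c + dc] ?? false)) n++;
    return n;
  }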


Got me there! The situation where you already have that stuff definitely favors DFS and makes it pretty trivial


I think it's because the person is thinking you are looking for something especially smart, when in fact something like a list of lists would be fine, if you can explain the tradeoffs of using it.

Edit: beaten, yes they were overthinking it.


Yep. Am bad at Dev. Pretty good at making sure Kubernetes cluster doesn't implode though.


I think you are selling yourself short - the problem is just some minor education to realize you probably know most of what you need to know. And I still can't figure out how to get a kube cluster working.


the problem is embedding game logic (game rules) into the matrix, which is a data structure, hence it is a data structure question


You don’t need to embed game rules into the matrix to solve this problem. (What does that mean anyway?) Just store the board in a 2d array of some sort and write some functions to interact with it.


>> store the board in a 2d array of some sort and write some functions to interact with it

thereby making it a data structure problem


That’s just not what a “data structure problem” commonly means. If it was, all programs would be data structure problems because all programs interact with data in some form.

A data structure problem is a problem where designing and implementing an appropriate data structure is in some way hard. A 2d array doesn’t fit the bill. It’s not exotic enough.


I have never needed to hire a genius in the last 12 years. In fact, often times I've had to pass on candidates that were overqualified.


This process is also pretty much guaranteed never to yield mid-career geniuses at the height of their powers. Those candidates don't go looking for work at all. Work comes looking for them. Why would they go on _any_ jobs platform, ever? Effective filtering of the candidates who actually engage with the platform can, at best, accurately identify the next tier down: effective engineers in mid- and late-career, and inexperienced whiz kids. Not that this is a bad thing; that first category makes the world go 'round.


> Those candidates don't go looking for work at all. Work comes looking for them. Why would they go on _any_ jobs platform, ever?

Because I don't know what’s out there, or who will give me the best offer. If you’re skilled and in the middle of your career, it’s easy to find a job, but if your options are wide open, a matchmaking service like this with a wide pool of companies is very valuable.


The problem is that software development is less like hiring an airline pilot or a structural engineer, and more like hiring an artist. Try making up a "standard exam" that will tell you whether an artist will produce several great unique works for you in the future, so you know which one to hire...


That's an interesting point, but then one wonders: if software engineers are ultimately artists, why are we not having them work on their portfolios like the other art disciplines do? Is that the fundamental problem?


> why are we not having them work on their portfolios?

Who is having whom? Having a portfolio of work is a well understood benefit, and a lot of candidates have been doing it for some time.


I think it's frowned upon in many circles because most of the work you do for a living cannot be released to the public, so there's an expectation that you will be cranking out work for free on the side just to build a portfolio, which is not inclusive for, say, a minority mother of three who doesn't have the time, and it only favors young, affluent white and Asian males with no dependents.


> one wonders: if software engineers are ultimately artists

I find treating software engineering like art is a very dangerous approach and bound for failure.

A lot of software engineering comes down to being comfortable with code and how the computer works. That has little to nothing to do with art.


A lot of painting still life comes down to being comfortable with brushes and knowing how paint works. A lot of music comes down to knowing how scales and chords work. Most art still requires fundamental mechanics.


Yet frankly, what most of us do is more like plumbing than art. In that we're just fitting systems together and in that it's actual skilled labor and in that we're seen by everyone else as the ones willing to do the shitty work.

Management puts up with us and they pay us because even though they think they can do our work, they wouldn't want to.

Plumbers are licensed and unionized, two possible solutions to the problems posed in this thread.


Will they be required to use Scrum and forced to use Jira?


Part of me wonders if the recruiting itself is over-engineered anyway. I mean, imagine if you just asked:

Implement Bubble Sort, in 2 Languages of your choice, with multithreading or other language-provided parallelism, with the correct number of parallel threads to be most algorithmically efficient

Would that really not weed out a lot of people? I think it would. I know the above algorithm is hardly production-ready, but the requirements are easy to understand. (It's also a bit of a trick question - there is no algorithmically optimal thread count for a bubble sort; the practical answer is just the number of CPU cores in the system.)
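For a sense of scale, the serial baseline any candidate should manage is tiny; a rough Python sketch (the parallel wrinkle is discussed downthread):

  def bubble_sort(xs):
      # Repeatedly sweep the list, swapping adjacent out-of-order
      # pairs, until a full pass makes no swaps.
      n = len(xs)
      swapped = True
      while swapped:
          swapped = False
          for i in range(n - 1):
              if xs[i] > xs[i + 1]:
                  xs[i], xs[i + 1] = xs[i + 1], xs[i]
                  swapped = True
      return xs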


Our coding problem is easier than that (by a fair margin, it's all completely synchronous single-threaded procedural code unless you're doing something extremely weird) and it weeds out the vast majority of applicants.

The same was true of all three of the standard coding problems Triplebyte used. They're not quite literal fizzbuzz, but they require - at most - some basic critical thinking and basic language features, and that is a filter that eliminates 90+% of applicants to dev jobs. Now, granted, this is under time pressure. I imagine, given several hours, most could finish (although maybe even that is overestimating things). But still.

There's an old Randall Munroe article quoting a physicist:

> The physicist who mentioned this problem to me told me his rule of thumb for estimating supernova-related numbers: However big you think supernovae are, they're bigger than that.

and I feel like this applies to recruiting: however bad you think the average applicant is, they're worse than that.


I ask candidates to implement a simple dynamic data structure, and I even describe it first, so there is nothing to memorize. It turns out many people don't know how to set up classes or data structures, even when you describe it to them first. Forget about computational complexity.
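As a purely hypothetical example of the genre (not necessarily this poster's actual question), a minimal growable array in Python - the kind of exercise where, per the comment above, candidates stall on just setting up the class:

  class DynArray:
      # Toy growable array: fixed-capacity backing storage that
      # doubles when full, giving amortized O(1) appends.
      def __init__(self):
          self._cap = 4
          self._len = 0
          self._data = [None] * self._cap

      def append(self, value):
          if self._len == self._cap:      # full: double the capacity
              self._data.extend([None] * self._cap)
              self._cap *= 2
          self._data[self._len] = value
          self._len += 1

      def __getitem__(self, i):
          if not 0 <= i < self._len:
              raise IndexError(i)
          return self._data[i]

      def __len__(self):
          return self._len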


Triplebyte used to send candidates a link to a page which basically listed everything that was in the interview. Most candidates didn’t read it. They could probably have done away with the interview entirely and just put a link at the bottom of the prep material saying “click here to pass the interview”.

If anything I think it would have lowered the triplebyte passing rate.


Why not try this as an experimental filter then and see what sort of population pass?


Well, triplebyte is dead. That’s one reason not to do the experiment.

Also arguably the “customer” for triplebyte was the company that eventually hires the candidate. They’re paying in part for the screening process triplebyte did. We wouldn’t have been doing our job if we skipped the interview part of the process.


This sounds like what a lot of companies do - except the scaled problem with this approach (and the certification approach of the grandparent) is that most companies want to avoid candidates who've memorized a specific solution, as then they don't get any data about whether they can code anything aside from what was memorized.

The other problem is that implementing bubble sort will tell you about their skills in a particular dimension, but being a software engineer these days may look very different depending on the job.


I do a tree-search-ish thing when interviewing people. I’ll start with a super basic question about something beginner-ish on their resume. If they can’t answer that, the interview is politely wrapped up. I’ve eliminated a surprising number of people who had jQuery on their resume by asking them to write code that will make a div with the ID “mydiv” disappear if the user clicks a button with the id “mybutton”.

After that I ask a super difficult or niche trivia question like “in CSS, what two overflow properties cannot be used together?” If I’m hiring a mid-level frontend developer and they nail that one, I go “fantastic, great answer, do you have any questions for us?” And the interview can end early.

But if they miss that, no sweat, I’ll start asking more mid-level technical questions to figure out where they’re at.


It's also a mostly useless problem for determining engineer quality, in many cases.

It tests for pure coding ability, when most organizations should be optimizing for engineers that are going to be productive-if-not-spectacular, that can design and build maintainable systems.

Could I have solved the above problem back in my engineering days? Probably not, since I went years without working with threads. But I also wasn't working on problems that would ever have benefited from knowing that. Most software engineering roles are essentially building CRUD or ETL systems, maybe with a user interface. Any coding problems given should be optimized for weeding out bozos (who are still plentiful), not for weeding out the most people.


I find picking good questions is hard, and many fall into similar patterns, making them something candidates can practice for.

Even your question isn't something I'd necessarily ask on the spot. Many engineers don't use parallelism in their day-to-day work (webdevs, for example). The part about making it efficient is interesting, but feels borderline like a trick question that a good engineer could fumble.


> Even your question isn't something I'd necessarily ask on the spot. Many engineers don't use parallelism in their day-to-day work (webdevs, for example). The part about making it efficient is interesting, but feels borderline like a trick question that a good engineer could fumble.

True, it's more of a backend-role question. The reason I threw it in there is my assumption that a leetcode grinder would be very likely to immediately go, "well, the most efficient number of threads is log(n)" or "the most efficient number of threads is the square root of n" or some other plausible-sounding BS answer. The reason I chose bubble sort is that it's so simple to understand that you can fairly easily (I would hope) figure out there's no benefit to more threads than CPU cores at all, as long as you stop and actually think about what it is doing.


What are you trying to achieve here? Are you looking to remove a bunch of competent developers by asking weird trick questions?


Trick questions are a waste of everyone's time IMHO, and avoiding them was something I made sure of when I took over the hiring process at my job.

It’s a form of hazing and rarely does it have any connection to the day-to-day work you do in the job. All of our tests or questions relate directly to the work we will ask you to do in the job, anything else is just trying to be clever and I dislike cleverness in both job interviews and code in general. Cleverness almost always means inscrutable and unmaintainable.


1. It's the dumbest possible sorting algorithm (bubble sort). Compare two adjacent items, swap them if they're in the wrong order, go down the list, repeat. If a developer doesn't know that algorithm, what algorithm could they possibly know? It's the lowest bar.

2. Two languages of your choice is enough to show you aren't a frameworker and can think in more than one box. It doesn't matter if you do it in JavaScript and C#, Go and Rust, or Python and Haskell. It's a chance to show off, while being quite attainable for a competent developer.

3. The parallelism trick question merely shows that you actually understand what you are talking about. A leetcoder might fall into the trap of assuming it's log(n), or the square root of n elements, or some other overly-thought-through math that is bogus. If you think it through, though, bubble sort is simple enough that it's very easy to realize (in my opinion) why having more threads than CPU cores doesn't really help.


If I wanted to separate the real algorithm masters from the rest, I'd avoid questions about sorts with total ordering and instead cover algorithms over partial orderings, such as topological sort, BSP trees, and the like, which are super-beautiful and I think quite useful, but obscure.

Bubble sort is quite interesting from an algorithm-analysis perspective (prove it completes, prove it gets the right answer, prove how it scales with N), but I'd almost rather a programmer I work with not know about it from a coding perspective, because it increases the chance they code up a bubble sort rather than use the sort function that comes with the standard library.


> or some other overly-thought-through math that is bogus.

So the mathematically-correct answer would be a minus while "in my opinion" would be a plus?

I think your interview question brings you to your intended answer: the reason every company makes its own process is that what's good for you would probably get you an instant fail at NASA, and would get you labeled as "experience not relevant" in some research settings.


> So the mathematically-correct answer would be a minus while "in my opinion" would be a plus?

My opinion was only that it would be relatively easy to figure out. As far as I know, it is a certainty that splitting a bubble sort into more or fewer sections than there are CPU cores is less efficient. 16 threads on an 8-core CPU just means balancing 2 threads on every core, each doing its own sorting and needing to be merged back together eventually. There's no way that can be more efficient.


If you have hyperthreading, then a low-IPC workload like this is optimally done on twice the number of cores.


What does parallelism have to do with Big O? And I do wonder how a parallel bubble sort would be written, to your standards, in, for example, Python.


I don't think we are over-engineering it. You want to "weed out" everyone but the best candidate for the role, or the best candidates for your open roles. It's a very hard problem to identify the "best" person from a group of people. It would be different if all programmers that are good enough are the same, but we all know the skill ceiling on programming is very high. Selecting the best devs is a critical function for any software project.


What's your approach for parallelizing bubble sort in a way that profits from cores beyond the first?


It's probably alternating the comparisons.

Compare [0, 1] [2, 3] [4, 5] ... in parallel and swap if necessary, compare/swap [1, 2] [3, 4] [5, 6] ... in parallel, then go back and forth until no more swaps are made - second element in pair is always greater/less than the first.

That does suggest that the theoretical ideal number of threads is n / 2, ignoring cores, though you'll also want to consider things like cache line size, cache coherency, and actual problem size, so it's probably fewer.

At the end of the day, the important thing would be to actually benchmark your attempts, see how they scale with processors/problem size, and check the isoefficiency.

I think it was a bad question.
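Bad question or not, the alternating scheme described above is a real algorithm (usually called odd-even transposition sort). A rough serial Python sketch, with the parallelizable structure noted in comments:

  def odd_even_sort(xs):
      # Odd-even transposition sort. Within each phase the compared
      # pairs are disjoint, so a phase's ~n/2 comparisons could in
      # principle run in parallel; in CPython you'd need processes,
      # not threads, to see an actual speedup.
      n = len(xs)
      done = False
      while not done:
          done = True
          for start in (0, 1):  # even phase, then odd phase
              for i in range(start, n - 1, 2):
                  if xs[i] > xs[i + 1]:
                      xs[i], xs[i + 1] = xs[i + 1], xs[i]
                      done = False
      return xs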


> It's interesting to see how much "standardized cross-company interview for software eng" has been consistently a cursed problem in the industry.

It's not actually a mystery. The problem is that that sort of test is de facto illegal (in the US) due to the "4/5ths rule".


IMHO, it's because the single most important characteristic for software engineers, at the vast majority of companies, is that they ship features.

And that's a skill that's incredibly hard to test for.


The fundamental truth is that companies that want to hire people do hire people.

There are many more job positions posted than will ever be hired for, despite the availability of suitable talent.


Airline pilot is not a great choice for comparison.

Airline pilots are selected to be entrusted with many lives, with the utmost professionalism, and to perform with ability under stress.

And it shows, in the amazing track record of aviation safety.

Most software developers, on the other hand, are mainly paid to mechanically type out code, cribbing from StackOverflow or ChatGPT whenever they don't know something, to "get it done".

And it shows, with the atrocious track record of the entire industry at information security, for one obvious metric.


I was Triplebyte's first engineering placement. I still remember going to a random SoMa apartment with Harj and Ammon and Guillaume and coding up tetris in ruby, having no prior experience with game loops. That landed me a job with Flexport in 2016. I doubt that I would have gotten that placement without Triplebyte. So I am quite grateful that they existed, for jumpstarting my early career.

With that said, when it came time to look for a job again a few years later, I did chat with Triplebyte but ultimately took an offer through other contacts that I had built up by then.


You're not evaluating this on a proper A/B basis. I dare say, you might not have landed at Flexport but you would have landed somewhere?

pedantry: surely you didn't go to a random apartment, rather it was their apartment.


I'm another TripleByte placement.

After my (virtual) TB interview (which I barely passed), I had onsites at 5 places. After the five on-site interviews, I had 2 job offers, one of which was a company I wanted to work for since graduating college. I took the other offer.

This was preceded by a four or five month job search. I had received two offers in that time, but nothing seemed great.

I think TB's process kinda worked, but I understand your skepticism.


I'm actually not all that skeptical of TB's approach. They built a real business around it, way more targeted than e.g. Karat. Even though it was ultimately unsuccessful, that is surely due more to PMF and various missteps than to a lack of advantage for individual applicants vs. the spray-and-pray approach. (When taken as an average across all applicants.)

I'm just highlighting that GP's specific anecdote doesn't really demonstrate that advantage. Your example seems much more clear.

Somewhat OT for this specific sub-thread, but I wonder how much the "OA" tools that are part of CoderPad etc. contributed to TB's demise. These are meant to be fizzbuzz-style pre-screens. I know TB's approach was more than that, but from the hiring side, was the value not there, since you'd always (?) have your own coding exercise after the TB screening?


I feel like they didn't cover one of the massive losses of confidence in TripleByte: when they opened up their database of candidates without really warning people ahead of time. I'm struggling to remember all the details, but I believe this happened a few years ago and really pissed off the HN community, and a lot of people scrambled to hide the fact they had screened with TripleByte. As an outside user, this felt like the nail in the coffin. The dark patterns just took over.


See [1] for more on that (to avoid duplicating threads).

[1] https://news.ycombinator.com/item?id=40635518


It's not covered extensively in the article, but it was alluded to at one point via the phrase "pissed them off with anti-privacy decisions". (In context, 'them' refers to the engineering side of Triplebyte's user base.)


Anti-privacy can mean "we accidentally leaked that you visited facebook.com on Nov 15" or "we allowed your employer to see that you're actively job hunting."

I mean technically the same term is OK to use for both, but it does feel like it's burying the lede.


I felt that was an odd omission in the article. Maybe it didn’t have a huge market impact outside the local crowd, but it was a big deal then. I seem to remember the CEO posting in the threads apologizing, etc.


It sounds like the core problem is that companies don't really have trouble screening large amounts of incoming low quality resumes. It's annoying and time-consuming for the software engineers, but it's a well-known process and the CEO/CTO/COO/VP who controls the budget can just make the engineering team do it.

The core problem that companies are willing to pay for is "top of funnel". The obviously skilled, experienced software engineer that every company wants. How do you make them interested in your company in the first place? Triplebyte did not really have a cost-effective solution for that, although that's what people wanted most when buying their service.


They can just make the engineering team do it, but that's really expensive.

A failed onsite is more expensive than if the candidate walked in the door, grabbed someone's laptop, threw it out the window, and left. That's a big deal. Even a 15-minute phone screen at typical Bay eng salaries is like a nice steak dinner (particularly if you include disruption to actually go do it).


> How do you make them interested in your company in the first place? Triplebyte did not really have a cost-effective solution for that

They absolutely did! It was getting companies to agree to skipping straight to the onsite after a 15-minute recruiter call.

This lets you skip the back-and-forth of screen scheduling and result chats. In turn, that saves weeks of time during a job search, allows you to have all of your onsites within a short time period, and means offers come in within a much narrower window. You can make apples-to-apples comparisons about which role you might take that way, so it's a much lower cost for a much greater payoff to spend a few hours interviewing with a company you weren't thinking of, to see if it's a fit.

In my mind, this was the absolute killer feature. I am still at a Triplebyte job 7 years later that impressed me via their onsite, and I would not have thought to apply if they had not been on TripleByte's skip-to-the-onsite list.


The article says people loved Screen, and as a user myself I concur; I used it for exactly what was mentioned - screening junior candidates efficiently. Theoretically, if the company wasn't venture funded, they could maybe have waited for those junior candidates to turn into the senior ones that people want. I wish someone would bring back that tool.


I can't help but want to draw the parallel between triplebyte and dating app: both are marketplace for 2 parties, or say networks; both networks are inherently two-tier, one premium tier and one, let's say subpar tier.

For the premium tier, the problem is less the efficiency of the matching process, because participants have powerful signals; I would say the problem is the efficiency of negotiating fair terms, because neither party is incentivized to go to a public auction, hence no market price. So the hiring process may be less about investigating potential candidates and more about "let's just wait two weeks and see who's the highest bidder".

For the subpar tier, the problem is less the efficiency of the matching process (though that is still part of the problem) and more a cost-benefit analysis: is it worth throwing X amount of resources at filtering a huge pool of unknown-quality candidates?

Obviously, one more problem is that there's no easy way to use whatever incentive to quickly bump up the supply.

So intrinsically, Triplebyte (like a dating app) tried to use tech to alleviate the pain point, but that pain point is part of a bigger, convoluted mating dance, and improving a non-critical part of the whole pipeline doesn't yield a so-called step-function gain overall. That, plus venture backing, forced Triplebyte down the rabbit hole of trying to acquire as much of the lower, subpar tier of the network as possible, which diluted their attention and human touch on the premium tier, and led into a death spiral.

Sorry, I do not intend to bash the author or the founders for picking this problem/user need to solve. It just occurs to me that this heterogeneous network, with no easy way to increase supply or lower demand, is really hard for a marketplace company, and sometimes simply throwing technology at a white-hot competitive segment isn't a groundbreaking solution. I wish the author all the best with their next endeavor.


We talked about the dating-app analogy a LOT, and I think it goes quite deep. It applies to almost all competitive markets, especially matchmaking ones.

> I would say the problem is the efficiency of negotiating a fair term because both parties are not incentivized to go to a public auction, hence no market price.

Are you referring specifically to salary negotiation here? I think this is one of the things that makes a trusted intermediary useful.

During signup, or during an initial recruiter call with candidates, I ask about salary with the following script:

> Okay, what are your salary expectations? To be clear, I will NOT share this with clients unless you tell me that I can. This is just so we know what jobs you might be interested in - the only thing we'll communicate is whether you match a client's salary range or not and vice-versa.

While I doubt the incentives here align to perfect honesty, it's certainly a lot better than the baseline, in that each side gets information on the other only after revealing semi-honest preferences used to match them.

> For the subpar tier, the problem is less so the efficiency of the matching process (this still is part of the problem), but more so a cost-effect analysis: is it worth throwing X amount of resources to filter a huge pool of unknown quality candidates?

Yeah, this is where the pitch about centralization - the main pitch I want to make with Otherbranch, since I think the full background-blind pitch was overselling things - comes in. There's some threshold where the cost-benefit analysis is neutral for companies, in the sense that there is 0 expected value to the next marginal candidate. But if we're effectively interviewing for more than one company, that threshold is lower for us, because the cost is similar and the benefit is higher.
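(Schematically, and oversimplifying: if screening one more candidate costs C and a single company's expected benefit from that marginal candidate is V, the company stops where V = C. A screener shared across k companies has roughly k chances to place that same candidate, so it breaks even down around V = C/k - i.e., on candidates no single company would bother screening.)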

I don't expect we're going to be able to get a mediocre developer with a mediocre resume an amazing job. I'm not really trying to do that. I'm trying to get an amazing developer with a mediocre resume an amazing job, or a good developer with a mediocre resume a good job - one better than they could have gotten on their own.


> > Okay, what are your salary expectations? [...] This is just so we know what jobs you might be interested in - the only thing we'll communicate is whether you match a client's salary range or not and vice-versa.

> [...] I'm trying to get an amazing developer with a mediocre resume an amazing job, or a good developer with a mediocre resume a good job - one better than they could have gotten on their own.

Is the reason that you can get employers and workers to commit to salary ranges like this, that you're focusing on mediocre resumes?

That is: Workers with mediocre resumes can't afford to be too greedy and play their cards too close, and companies aren't too hesitant to filter out someone with a mediocre resume?

(BTW, the below sounds like it could be a good/great thing. I'm not criticizing it, just curious whether that's also why requiring salary ranges upfront would work here, when normally both parties would be resistant.)

> I'm trying to get an amazing developer with a mediocre resume an amazing job, or a good developer with a mediocre resume a good job - one better than they could have gotten on their own.


> Is the reason that you can get employers and workers to commit to salary ranges like this, that you're focusing on mediocre resumes?

I don't think so. Of the signups that have come in from this thread, most of whom have pretty respectable if not quite impressive resumes, about 80% chose to give a salary range and about 2/3 of those are OK sharing it.

I don't think most people are averse to sharing salary information. I think they just don't like the guessing game or the feeling like they're weakening their negotiating position, and having someone in the middle revealing things only with precommitments by both sides alleviates that quite a bit.

> (BTW, the below sounds like it could be a good/great thing. I'm not criticizing it, just curious whether that's also why requiring salary ranges upfront would work here, when normally both parties would be resistant.)

Yeah, no offense taken at all! It's a reasonable question. To be clear, though, we don't require salary ranges; we just ask for them voluntarily.

If one side or the other doesn't want to share that info, that's fine - we just won't give them the info the other party gave us either. If they don't want to share it even internally with us, they can do that, but that just means they get noisier matches, which isn't really to their benefit. But that's their choice.


> Are you referring specifically to salary negotiation here? I think this is one of the things that makes a trusted intermediary useful.

Yeah, I agree. I do feel like you're not going to take the venture-backed route this time? Because in my opinion that's part of what drove Triplebyte down this route. Based on your writing, you probably concluded that Triplebyte's model back in its heyday would have been a decent business, just not a VC home run.


I did the Triplebyte candidate process back in 2019 and overall it was a pleasant experience. Shame they didn't make it.


They got me a stellar job many years ago, but unfortunately for them it's hard to monetize candidates unless they return through the same pipeline. My outcome was great, but I was one and done.


I applied to TripleByte in 2019. I passed the interview, they sent me some swag, and then had my profile "go live" for a week(?). I didn't get a single interested company, and then they took my profile down and said I can try again in 6 months. I don't really have much of a takeaway from this, I guess, other than they also had other failure cases that caused them to lose money.


I went through the process and liked it too. The engineer critiqued my code in ways I found silly, though, which really makes me question whether I would trust a candidate approved by them.


In my case, they didn't support my specialty (robotics/embedded) and focused the tests on DB scaling because that's what the interviewer knew. Like, I'm sure it was a good test for somebody, but it made no sense as a validation of my skills or experience.


Yeah, we didn't have a good embedded track. We called it "generalist" (well, we didn't call it anything at first, but it became "generalist" later), but it was really back-end-leaning full-stack web dev.


This is a very interesting, very well written post. The part that stands out the most is:

> if you needed venture capital to produce a viable small-scale business, you didn't really have a viable small-scale business to begin with. Triplebyte's break-even status was an illusion, predicated on funding that would probably never have existed if that scale were the founders' only ambition.

I wonder if there's a way to build a viable business out of a product like FastTrack that does not depend on venture capital to get going?


Your curiosity might be sated by clicking one of those buttons on the navbar <.<


Triplebyte failed because:

1) They were useless friction. As a candidate I could apply directly to the companies I was interested in and take a phone screen and then an on-site. With Triplebyte I had to take their much more time-consuming process just to get to the same point of either another phone screen or an on-site. They added absolutely no benefit to me as a job seeker. Especially during the hiring heyday of 2019-2022, why would I need to use Triplebyte when I could get access to the recruiters myself?

If applying through Triplebyte offered some sort of benefit, like an easier on-site or straight to team match because the candidate had been vetted by Triplebyte, then that would be useful. Instead it was more interviews just to get to the same onsite that you could get to by passing a phone screen, so it was useless.

2) Triplebyte betrayed user privacy by posting their names etc. on a new service that the users didn't explicitly subscribe to. And it took a while for the CEO to acknowledge that and backtrack. That was the death knell for Triplebyte for me as a candidate.


As a junior at the time I found that triplebyte actually removed a lot of friction. Being able to do the screening/tech interview step once instead of for _every_ role you apply to was great. It was also good to be able to give feedback after interviews (e.g. these guys want me to do another technical quiz or are just trying to advertise their Blockchain platform).


Yes, this. Of course, there's always a tech interview even if Triplebyte claims to eliminate that, but they DID eliminate the "hello new grad, do you even fizzbuzz" step, which was great.


The post gets into _so much_ depth around what the company did that worked and didn't work and why it failed as a business. Your comment is about why you personally, one of >250k people who used the service as a candidate, didn't overwhelmingly approve of the product (and, it sounds like you may have still used it...). Do you really hold your own opinion in such high regard that you think it is more privileged than the opinion of TripleByte's head of product giving a painfully honest post-mortem? Your opinion is your opinion but it is just a piece of all-too-late product feedback - this post has so much more to offer than that, it's really strange to me that you'd lead this with "Triplebyte failed because:", implying that the author is wrong and you're right...


This is especially true because the (very good) post addresses both points that this commenter raises and _so much_ more. It's really shocking how few people in this comment section seem to have read the (again, very good) OP. It's like I'm on Reddit.


I think post-mortems often sidestep the actual blatant issue and indulge in intellectually complex worldbuilding.

Even though the one by Triplebyte is one of the better ones, what this commenter is saying is more realistic and honest. He is the market, after all. Triplebyte's "head of product" has the same standing as a random commenter when it comes to explaining why the market rejected them; I don't know why it would be any different.

If passing Triplebyte had meant going straight to team matching, that would have solved the actual problem. Otherwise what he is saying is true: they simply added another layer of friction.


OP chiming in here to say that there's truth to this perspective. It is very easy to miss the forest for the trees in product work.

That being said: the product the top of this thread was talking about did have users and did break even financially, even operating with some pretty major inefficiencies. So while it would be wrong to argue with that poster about what they liked, they are empirically wrong in the claim that no one liked it.


Yes. I’m the target customer for this service. Me. I have had 10+ jobs in Silicon Valley, from YC startups to FAANG. They failed because their core product had no value to its customers above and beyond what currently existed and only created friction. This is the crux of it. If they delivered value that their customers benefited from they would still exist.

I have probably 100 coworkers I could text right now and have lunch with this week. I have hundreds that I'm still friendly with. Zero of them used Triplebyte, either because they didn't hear about it or because of what I mentioned above. Having no customers or users among people with the breadth of experience that I or my colleagues have is a true indication of the chance they actually had to succeed, which was nil. Meanwhile EVERY SINGLE ONE of them knows LinkedIn. My last 3 jobs were through LinkedIn. That gap, plus the lack of benefit in going through Triplebyte, is why most would not take it seriously.

250k “users” is a misleading number inasmuch as most of them probably didn't place through Triplebyte. I would love to know how many people actually got jobs through Triplebyte, but I would guess far fewer than 10,000. That would represent at least $250M in revenue (presumably assuming a placement fee on the order of $25k each), and it could have been sustainable at that point. Given that it died, it must have had far fewer than 10k.


This sounds like you are the exact opposite of the target customer for this service, i.e. a developer lacking connections and struggling to get past resume hell but having all the necessary skills.


I always understood the promise of Triplebyte was ”do the first steps of a hiring process with us and skip them for your next dozens of job applications (that are through Triplebyte, obviously)”

Did they never deliver on that promise?


We did, at least in most cases.

At the time, I was working directly with the "live" batches of candidates. We generally considered it a failing on our part if we couldn't get them at least one connection with a company; the typical average (if I'm remembering right) was around 4-5, of which 2-3 would result in an interview. Something like that.


Triplebyte didn't have a relationship with every company out there looking to hire. When the job market is great and you can filter yourself to specific companies, that's great. When those large companies are laying people off and you need to broaden your search, it's impossible on Triplebyte.


Interesting that this post gets right to the point with at least plausible, and more to the point: user-focused reasons as to why Triplebyte failed.

While the blogpost starts off with -- its "impressive founding pedigree."

That's the first thing these people think of in their root cause analysis.


I go into the background because I don't want to assume people are familiar with a specific company that hasn't even existed for a year-plus, and because I wanted to emphasize that these were the mistakes made by smart, serious people doing their best. If people who have done this before, people in whom very smart, very incentivized people are willing to place their trust, can make these mistakes, I sure as hell can, and so can you.

I certainly do NOT intend to say "the founders had cool credentials and that's why they were right about everything". The previous poster to whom you're replying has legitimate criticisms, although I don't think they're correct that those were the reasons for Triplebyte's failure (since we did indeed acquire many candidates with considerable skills, and since the public-profiles incident was a consequence of pivoting, which we did because we couldn't solve bigger problems).


> As a product, Screen was wildly successful (granted, it helps to not be charging anything).

As early as 2015, I just screenshotted the questions from TripleByte and gave them to people to take.

> The one where companies came to us (particularly post-pivot) to hire the candidates (senior engineers in the US, especially ones from top schools or with FAANG-like experience) that are hardest to acquire? Well, it turns out that companies also don't feel much need to pre-screen such candidates and want to avoid putting barriers (like, say, an online quiz) in their way.

I used my ad-hoc TripleByte screen on a lot of top-school students, Harvard & MIT included.

Using the quizzes revealed to me that the average MIT or Harvard CS student was like 4 years ahead, knowledge-wise, of a same-class Berkeley student. A way, way bigger gap than I ever expected.

This is the real reason the Screen product failed and recruiting is hard: programming skills are not normally distributed, they are exponentially distributed. If you didn't believe in the 10x engineer, you better start believing now.

If you had worked as a recruiter, you'd know the most successful ones do not "just" source better candidates, which you have been belaboring is impossible for a while. They either have tremendous volume, or they are worm-tongued with hiring managers: while skills are exponentially distributed, an average CTO can be persuaded by a very basic, LA Central Casting-level of charisma and good looks.


How is an obviously foolish and unethical decision to publicly post user profiles without their consent -- "a consequence of pivoting"?


To be clear: it was a bad decision both ethically and tactically, and I am not in any way defending it. I am explaining it, in the same way that one might explain why a bridge collapsed. (I will also note that one piece of the backlash somewhat misunderstood things - the profiles were "public" to subscribers, not to the internet at large, insofar as that distinction is meaningful.)

The reason it was a consequence of pivoting is that that decision was made in the context of that pivot. Getting the hypothetical "linkedin for engineers" network up only made sense in the context of existing data, so there was a powerful incentive towards grey/dark-patterns in getting it up and running and, as relevant here, towards motivated reasoning about how it would be received. Or at least, I think it was probably motivated reasoning, because as far as I can tell the surprise at the backlash was sincere (I was not part of the leadership at the time, so this is a retroactive best guess on my part, but one I'm fairly confident in.)

That incentive pressure only existed in the context of a company trying to figure out what to do next in somewhat desperate circumstances, and it's in that sense that it was a consequence.


A meaningful explanation of why a bridge collapsed would be: "Facing financial pressure, the board voted to re-open the bridge despite advice from its own experts that it was unsafe and faced a high chance of catastrophic failure."

What I'm hearing instead -- is a categorical focus on "incentive pressures", rather than on the obvious lapse of sound judgement on the part of those responsible. As revealed, quite plainly, by their decision to not go with what should have been seen as the only reasonable course of action here -- to resist the temptation to resort to dark patterns, despite the incentive pressures to do so.

You say you're "not defending it", but by attempting to put more focus on external factors ("incentive pressures") rather than on the poor judgement on the part of those responsible -- you're doing just that. You aren't denying that an error was made, but you are very clearly attempting to minimize the significance of it. To the point of literally saying it was a "result of" external business conditions.

See also: https://en.wikipedia.org/wiki/Minimisation_(psychology)


If I gave that impression, I apologize. It wasn't my intent. So let me say clearly: it was a lapse of sound judgement, I disagreed with it both then and now, I've said so both publicly and internally since the day it happened, and I think a lot about how to avoid such errors myself.

I don't want to minimize the error. I want to show it in the light in which it was made, because no one ever comes up to you in business and says "hello, do you want to sell your soul today?". That's not what moral compromise looks like. It looks like trying to do a good job and not fully processing the decision you're actually making. That's an error that you make with normal human levels of normal human failing even though that error is abnormally costly.

To return to the bridge analogy for a second, the board that votes to re-open the bridge isn't sitting there going "hmm, should I kill fifty people today?". They've got a dozen people telling them a dozen things are immediate crises, and this one is particularly costly to listen to. So it's a little easier to believe that the civil engineer's report is just someone being alarmist than it is to believe that you've got a really hard decision to make, especially because you're not thinking about it too hard. And that's doubly true when you know it's likely to get you voted out of office, replaced by someone who cares about infrastructure less than you do, who definitely wouldn't close the bridge.

That doesn't make fifty people any less dead when the bridge collapses. But it does change what actually helps you not collapse bridges - a question that, as a newly-elected board member in this analogy, I am deeply concerned with.

I agree, fundamentally, with what you're saying: that you have a responsibility to overcome your incentives, that if you're one of those board members you have to get over your fears and close the bridge. I get that, and I agree with it. But to accomplish that requires understanding the kind of mindset in which one opens the bridge, which is usually not the mindset of deliberate malice. We don't have to compromise our moral principles or our understanding of right and wrong to understand the human weaknesses that lead people astray.

-----

<warning: unfocused rambling ahead>

I'll give you a concrete example from my recent history. On my very first sales call for Otherbranch, which did not go well, I asked the person I was talking to (who happened to have been a salesperson) for advice afterward.

His advice? Lie more.

Of course, he didn't say it in those words. He said something like "listen to your prospect's concerns and needs, and then talk about how you're a really focused solution to them". So if your prospect says they're concerned with candidate quality, you talk about how candidate quality is the focus of your process, the thing you're really specialized at. Or if your user says they're concerned with time investment, you talk about how that's the thing you're really specialized at. And so on.

In one sense, this is totally expected behavior. Everyone assumes a person on a sales call is already doing this. I would imagine that, to most people, this isn't really even a moral blip of significant size. And yet the advice fundamentally is "you should lie more", even if it's lying in this localized, normalized way.

Now, I don't do that. But I don't know how much that costs me. It almost certainly does cost me something, at least in the short term. That's a cost I am, provisionally, choosing to pay. But suppose that I knew for certain the decision was "if you don't do this, your business won't exist, and someone who does lie - a lot more than you do - will take your place". Would I be justified in bending to the fact that this is just the reality of doing business? I'm not sure I would make that argument, but it's not like a reasonable person couldn't make it. Is it better to be uncompromisingly moral and fail to be effective, or to win by being as bad as everyone else? It's not a trivial question.

One of the first things I wrote, when I started planning Otherbranch, was:

> Do not spin, do not mischaracterize, do not omit, do not grey-pattern.

That gets tested every single day. Every single day. It's tested on every call, every conversation, every sales pitch, every email, in ways large and small. I've been amazed at just how often I catch myself wanting to spin just a little teeny bit, because it's obviously the best approach to take in terms of getting Otherbranch off the ground. And if I think I'm a more ethical businessperson than most (and I do), isn't that better than the alternative? Those are the thoughts in my head.

And the thing is, I might be wrong about this. This might be fatal to running a company. It might just not be possible to win by those rules. I might have doomed my entire company and every bit of work I and others are putting into it when I wrote those words on like hour 60. And if I did, and the next person who comes along sees my failure and recognizes that fact from my next company postmortem, would you be able to blame them if they lied just a little?

I struggle with this kind of question a lot. Because I share your moral convictions and your belief that, ultimately, we need to call a spade a spade and call out when people are harmed or their autonomy disregarded. But I also want my moral convictions to have teeth, and that means not completely sabotaging my ability to get anything done. I don't think this is a particularly new moral conflict. I cannot possibly be the first person to try to navigate these waters, which should scare me even more, because it looks like the sharks ate the last guy. But I don't get a choice about navigating them if I want to get anything done.

Does any of this make any sense? This is already way longer than I'd intended to write and is definitely not my best work, but I'm trying to articulate something complex and personal because it's the only response I have to what you're saying. I think we agree on the moral principles, but I think I'm arguing for a nuance in their application that you aren't.


"... because no one ever comes up to you in business and says "hello, do you want to sell your soul today?". That's not what moral compromise looks like."

Actually, it looks exactly like that. It can also be more subtle, but it's usually more up front.


Yeah, that didn't seem like a clever move by an ascending company so much as a desperate move by a descending company. Therefore it can't be the reason for the failure.


It’ll be interesting to see how much “founding pedigree” actually matters now that free money is gone. It’s not to say it was ever easy (and I’ve certainly never done it), but I’d bet even folks with past successful exits are going to have a hard time these days. Hell, the market has shifted so much that I wonder if having done it before matters a whole lot now, when you fundamentally today need a different attitude toward free cash flow early on versus 5 years ago.


Being (or working with) the son or daughter of a celebrity or prominent rich person / business leader is more important than ever.


I remember Triplebyte for saturation advertising and paying off bloggers to write spammy articles with titles like “hiring is broken”. Nothing destroys a brand more effectively than ineffective and annoying advertising.

(I think of the otherwise great milvlogger who does his own product placements for his own caffeinated gum which I get disgusted just thinking about)


re point 1: whenever I looked at Triplebyte, I didn't see a lot of companies/positions that I was interested in, so I wouldn't get the benefit of removing the redundancy of interviewing at each.

maybe they had a bunch of roles I'd like that weren't visible, but that's a mistake -- I need to know it's worth my time to apply.

if you have one or two jobs I like, I'll skip Triplebyte and apply directly. If they had 10+ jobs I liked, it'd be worth going through them to not do the same interview ten times.


I went through the Triplebyte process around 2020. I was accepted into the program and assigned a manager (or whatever they were called). He set up one interview for me, but it didn't lead to a job. Shortly after, he sent out a message that the company was changing direction and he would no longer be part of it. They changed it to basically a job board with more steps.


Recruiting is brokered enterprise sales. Realty for human resources. And, like realty, recruiting is 80% shit, 15% good people, 5% crooks.

And, like realty, ripe for disruption in niche markets (like IT). As a buyer, Triplebyte should have gotten me:

  1. Access to a larger pool of applications through their network.
  2. Better matching between needs/skills through expertise.
  3. Higher trust in applicants without initial screening.

As a seller, Triplebyte should have gotten me:

  1. An apply once, interview anywhere system.
  2. Much higher trust in the interview process.
  3. A higher initial starting salary through expertise.

It seems like Triplebyte was too hungry for fast growth. Had they decided to grow more slowly (probably with less or limited VC), they likely could have established the expertise necessary to deliver on some of these requirements, providing the kind of value they needed to offer to become relevant in the industry.


Well, that's essentially what I'm trying to do with Otherbranch (specifically, 2 and 3 of your "buyer" points and 2 and some of 1 and 3 of your "seller" points).


> "But for all that I'd love to believe that that was because of the wonderful quality of our product, it realistically wasn't. It was ads."

> Why were we able to buy those ads? Because investors gave us a lot of money.

> And why do investors give companies a lot of money? On the expectation that they will pursue aggressive growth, not stop at small to middling scale.

This exactly describes the same lessons I learned working at Udacity around 2016-2018. We built - and then walked away from - a handful of moderately successful businesses that probably could've been run almost indefinitely at little marginal cost. Instead, when we crossed 50k enrollments, we pivoted - because 50k enrollments doesn't justify unicorn status.

(I don't think that was wrong; it's just that that's what the investors wanted.)


I wonder if there should be a less-speculative investment arm whose entire model is "pick up the nonscalable businesses venture firms find and can't commit to". Sounds really interesting, actually! You need the speculation to find the niche, but then you need the not-speculative approach to exploit it.


That’s an interesting idea! There’s already a lot of companies that are founded out of the graveyard of startups that failed to scale. (Sound familiar?) ;-)

I think the conventional wisdom would say that it’s not worth the effort to put together those smaller companies though; those resources would be better spent finding a different business that could scale. It would probably depend on whether there are any efficiencies of operating many such businesses.

Aside: good luck with Otherbranch. TripleByte was a critical part of my own career journey in 2018/2019, so I’m keen to see how it goes this time around. My complaints about TB were basically that it was too focused on SF Bay Area (I get it…), and it didn’t have an ML track until long after it was relevant for me.


I can't see why it wouldn't be worth the effort to assemble a suite of companies with millions in ARR. It might not be funding-AirBnB-level profitable, but there's plenty of money in it, and your hit rate would be far higher.

Structure it as something akin to an accelerator, pitch it as the way to run a company with less risk and less of the wild existential uncertainty of being a traditional founder, take a big chunk of equity to compensate for the idea being prefab and maybe give a payout to the pivoting company as a pseudo-fire-sale acquisition (which also funds their pivot), take on some employees who liked the pre-pivot thing and set them up with your Very Talented Founders(TM), and go to town.

This should exist. The more I think about it the more I think it's an excellent idea.

> Aside: good luck with Otherbranch. TripleByte was a critical part of my own career journey in 2018/2019, so I’m keen to see how it goes this time around. My complaints about TB were basically that it was too focused on SF Bay Area (I get it…)

Thanks! Hopefully the Bay Area thing might be less of an issue for us, because (a) we're recruiting specifically for individual roles (so we're not burning a bunch of cash if we get a good candidate where we can't place them yet) and (b) tech has definitely dispersed from the Bay over the last few years, even if not completely. I'd love to work with some of the Sun Belt startup scene.


In 2018 I managed a team at a marijuana startup and was evaluating Triplebyte for recruiting. One evening, after sampling our own product (post-deploy smoke tests), I went through the Triplebyte quiz to see what my candidates would experience.

A couple of years later I was job hunting myself, and went back to my Triplebyte profile. I sheepishly admitted that I hadn't brought my A-game to the quiz, but was told I couldn't retake it, even though it had been a few years. Odd to me that there wasn't a provision to account for a candidate's skills growing over time. Got plenty of interviews anyway >.>

I used Triplebyte successfully for recruiting as a manager at two companies, and was sad to watch it decay and die while working at a third :(

I hired some of the best people I've ever worked with on Triplebyte.


So I founded a couple hiring-related businesses a year or so before TripleByte. I eventually decided to get out of the space for reasons that are touched on in this article, but I would sum that up even more bluntly:

Hiring is a market for lemons.

The really good employees are almost never on the market [1]. They'll get one job, stay for years, and then, if conditions change so that that job no longer fits their needs, they have enough of a network that they'll get snapped up by some other company in a jiffy. The folks who do tend to populate the open job market often have one or more dysfunctions that make it hard for them to slot into the corporate job market, the most common of which are not wanting to slot into the corporate job market and not being able to define themselves enough to identify where they should slot into it. Software can't fix this; it's all about making internal emotional choices and then doing what you need to do to make this apparent to other people.

As a result, successful hiring strategies often don't look like hiring strategies. They include things like:

  1. Internships
  2. Contractor conversions and contract-to-hire roles
  3. Sponsored conferences and competitions
  4. Employee networking
  5. Open-source projects
  6. Paying more
  7. Being the only one left hiring in an economic downturn
  8. Investing in retention
  9. Ex-employee outreach
This article seems to make a mistake I made early in my entrepreneurial career, of misunderstanding your competition. The competition for a hiring startup is not resumes, recruiters, and job boards; these are generally terrible ways of finding new employees. It's personal references or open-source projects, which are the actual good ways of identifying new talent that isn't already snapped up by an employer.

[1] https://www.joelonsoftware.com/2006/09/06/finding-great-deve...


It's a market for lemons, but it's also an incredibly inefficient market for lemons. It's amazing just how much gold is randomly lying on the ground of the lemon market that you can just occasionally pick up.

The classic story is some engineer who got picked up by a company when they were 22, did well and got promoted a few times, and then runs into a terrible boss that makes their tenure untenable at 29; they go out onto the market and interview exactly like a 22-year-old again, just shooting off random resumes on LinkedIn. I've had to slap so many of my tech friends on the head and be like: no, look, there's a much more pleasant path you can now go down.

But like, why wouldn't it be that way? The median number of hours any engineer spends thinking about how to get hired in any one year is zero. There's an extreme minority who love the system and think about it all the time, and they all talk to each other, so they think everyone else is an idiot; but the majority of good engineers spend most of their time getting good at engineering, not at getting hired.

I have an alternative theory, which I call "5th most important problem syndrome". At any company/department/team/individual level, we have all the aspirations in the world, but we really only have the bandwidth to work on our 1st and 2nd, and maaaaybe 3rd, most important problems. Getting better at hiring usually slots in somewhere around the 5th most important problem, which is a dangerous space to be in, because we oh so aspirationally want to get around to it this quarter, but life inconveniently gets in the way, so it gets pushed onto the backburner, where it will forever remain the 5th most important problem.

Startups have to be super careful to make sure they're not solving the 5th most important problem in their user's lives because the initial feedback will be overwhelming that this is definitely one of the top 10 most important problems the company/user faces and everyone is gung ho about fixing it but every time it's time to make forward progress, some "unexpected" emergency appears that derails the project.

Working on technical debt is a classic 5th most important problem which is why it's a perennial topic for debate in engineering circles. Startups that aim to optimize your cloud cost, IMHO, also run into this quite a bit. I've seen so many of them come and go throughout the years and my conversations with them tend to all follow this same trajectory.


So, what's the more pleasant path? I'm well more than 7 years in at this point. And I've spent precious little time in the job market at large. But if my current employer had a problem, it's occurred to me that I don't really know how to find a job. What do people do?


I would assume the answer is 'just network bro', as if that's supposed to actually be helpful.


> The folks that do tend to populate the open job market often have one or more dysfunctions that make it hard for them to slot into the corporate job market, the most common of which are not wanting to slot into the corporate job market and not being able to define yourself enough to identify where you should slot into it.

I think this is a pretty good reason to have an org that "speaks corporate job market", but that doesn't evaluate candidates the same way the corporate job market does. A person who is skilled but doesn't want to/doesn't know how to/has moral objections to selling themselves the way employers often want benefits a lot from having an agent who can vouch for them.

As for the market-for-lemons thing, yeah, absolutely! It's a good framing. And in that analogy, I'm betting on the idea that a team of mechanics that can identify the good cars is valuable.


IIRC TripleByte was supposed to be that org - I seem to remember some of their framing being that they'd be like a Hollywood talent agency for developers.

The big problem with the model is again misunderstanding your competition and how the industry actually works. The market for skilled developers already functions largely like this. The "talent agents" are engineering managers, both on the sell-side (through references, and bringing over good engineers they've worked with when they join a new company) and on the buy-side (through identifying and recruiting good talent that may not be trying hard to sell themselves). This is a core EM skill - many teams will live or die by how well their managers can identify actually-talented devs, even ones who may not be slick or polished, and bring them onto the team. It's also the EM's responsibility to understand the recruitment process at their company well enough to shepherd candidates who won't look impressive to recruiters through it.


> The folks that do tend to populate the open job market often have one or more dysfunctions that make it hard for them to slot into the corporate job market,

I think the set of people employed at corporations provides overwhelming evidence against this.


I'm including small corporations and startups in this, not just the faceless megacorp. You still have an employer and it's still a corporation even if it's just a half dozen guys who all enjoy each others' company, code up some great stuff, and go out for beers together after the workday.

You have to know yourself well enough to understand what type of environment you want to work in. Some people perform better in startups; some in big multinational corps; some in small mom & pop shops. Some would even do better as independent contractors, although if you're smart you're still an employee of an LLC that you own to shield your professional liability from your personal assets. The dysfunction I'm referring to is in not making a choice, or not having enough definition of the types of work available to understand that you could be very happy at some places but miserable at others.


> You have to know yourself well enough to understand what type of environment you want to work in. Some people perform better in startups; some in big multinational corps; some in small mom & pop shops.

This really is a huge factor, and one that is REALLY personal to me. I went from "can barely get out of bed to do the small amount of part-time work I could get" to "enthusiastically hammering out 50+ hour weeks" once I got into the startup environment, which was stunning to me. If I hadn't stumbled into that world, I'd never have known the version of me that exists today was even possible.


Thanks for clarifying :)


Maybe we can add: 1. going remote and supporting remote work; 2. setting up offshore offices.


TripleByte worked amazingly well for me. I even use my TripleByte backpack every day on my way to work.

Back in 2018 I was a new grad in Pennsylvania trying to move to the Bay Area. Attempting the process of interviewing myself (scheduling everything so I could fly out and do on-sites, booking a hotel in a new city for the first time, etc.) would have been overwhelming. But the whole process was taken care of for me. I got 5 interviews scheduled across Monday through Friday. The hotel was paid for. Uber rides to and from offices were paid for. My flight was paid for. I ended the week with 2 offers, got the companies to fight over me, and then picked the job with the clearly superior work culture.

Since then I've worked at seed stage companies, Google, and moved around the bay a bit. It would have been possible without TripleByte, but the value prop as a candidate was tremendous. The employer I joined continued to use TripleByte, almost exclusively, until we got an in-house recruiter. Even then we still used it as a solid side channel. I think 3 or 4 of the engineers that joined were TripleByte candidates. They really were excellent at digging up people who wouldn't have seemed like the right pick just given a resume, but in fact were perfect fits once settled in.

The only problem, now that I've got some experience under my belt, is that my professional network makes something like TripleByte unnecessary (unless I move to a new area, in which case I could use help finding a good job). But once we're hiring some full stack web devs at work I'll take a serious look at OtherBranch.


I got a great job through Triplebyte, which let me skip a technical leetcode interview I definitely would have failed. Very grateful for them.


I'm the founder of a new tech assessment co - I hear "you guys remind me of Triplebyte" at least 1x per week. Clearly they were onto something originally - there's a lot of lingering love among eng leaders.

Our thesis is that LLMs unlock a lot in this space - and that we can provide more signal to employers while giving candidates a better experience. There are a lot of open/difficult questions in doing this well - we're trying to figure it out.

(edit: we're not building an "AI recruiter" that asks you about a time you failed or automates hiring decisions - we are extensively using LLMs for things like problem/module generation, etc.)

I'd love to better understand the Triplebyte story. If you enjoyed their product (as a hiring manager, or as a candidate) or if you feel passionately about this space, I'd love to talk to you. Email is in my bio.


> LLMs unlock a lot in this space

I would NOPE out of any LLM-conducted or LLM-assisted interview so fast, the LLM's head would spin. I do the same for take-homes or any other kind of interview where the company is investing less of their time than I am. Either we are evaluating each other fairly and equally, or GTFO


Feels very similar to when I took a test that asked me questions about my life and then I was auto-denied for a role. Very qualified, I knew many people at the org, ~$150k/yr or something. The recruiter emailed me later that day saying I actually could continue on with the interview process, but I declined.


There are a lot of "AI recruiters" floating around asking you about a time you failed, etc. - we don't like that approach at all.

We're (1) only running technical (coding) assessments, and (2) still letting live evaluators make the final hiring decisions.


LLMs + hiring is a very very risky combination.


One that we have to tread carefully with!

I should have clarified - we're not just putting an LLM on the other side of the candidate and letting it drive decisions.

Instead, think of use cases like content generation (we don't have a problem library - we create custom problems/modules for each of our customers), etc. That's where I think you can improve signal a lot: by setting up a better situation in which to assess the candidate.


By content generation, do you mean take home problems or interview questions?


Both! Generally more of the former.


There is also the user experience of feeling like you've been rejected at scale.

You're used to getting rejections from the applications you send out, but getting a rejection here (or even just being slow to get an offer) makes it feel like the entire industry has turned you down.


It's too bad they didn't make it; they had quite a bit of name recognition, especially in SV.

But basically it's a tarpit idea (TM YC): everyone passionately agrees that hiring is broken, but there are no straightforward alternatives. It's basically a services business disguised as a tech play that doesn't scale - an attempt to disrupt a high-touch sales process (and there's a reason those processes run the way they do, even though they're subpar).

Also, hiring specifically is extremely fragmented: every company has its own internal method and preference - or rather, the voodoo magic it likes to use.


Would be interesting to hear Rachel's take on all the contract recruiting (?) firms that claim to do the same thing - test candidates and then provide them to employers - just on remote contracts for temp work. The OG in this space is Toptal, but there seems to be a lot of VC money in a bunch of upstarts here. They all say "we screen our developers through an extremely rigorous yada yada process". They tend to have a lot of developing-world talent. Not sure if that's the future of online talent platforms or what have you.


I don't have special knowledge of Toptal's screening process. But when I look at their pitch (linked below), what sticks out at me is the lack of specifics.

-----

(From https://www.toptal.com/top-3-percent)

> We also test each applicant's technical knowledge and problem-solving ability through various assessments.

"Various assessments" is about as maximally vague as you can be. Imagine if I asked you how you'd build a back end and you went "oh, you know, various technologies, um, APIs, probably some functions" - that's the level we're operating on here.

Here's how you do it non-vaguely: Otherbranch's assessment is, currently, a 90-minute call with an engineer. It's synchronous over video and screen-share, and split into 3 subsections: coding, knowledge, and system design. Coding is a small app in the console; you can find an example problem at https://www.otherbranch.com/practice-coding-problem . Knowledge is a mix of CS/algos, full-stack web development, and "deep tech" (systems-level/security/internals). System design is the really standard "build this basic business system and scale it up". If you want more depth on our grading, I can show you that, although I don't share our actual questions verbatim. I designed the structure, I've led large-scale assessments before, and our interviewers have done thousands of hours of interviews; several are ex-FAANG.

Now, granted, we're in one domain and they're in many, so specifics may be hard to produce here. (And they may even be bad sales practice, so there might be non-malicious reasons for not including them.) But it still catches my eye.

> we typically only advance candidates with exceptional results in this phase.

26.4% of candidates pass basic English screening, according to the numbers next to the previous claim. By this step, they claim, only 7.4% of all applicants remain - meaning the step itself passes roughly 28% (7.4/26.4) of the people who reach it. Note that no screenings for actual skills have occurred prior to this step. If we assume that domain skill and English screening are weakly correlated (this is probably a bad assumption; literally any two skills anywhere are positively correlated), they're passing roughly 25% of applicants by skill. Let's generously say 15-20%. And the 80th-85th percentile of applicants is horrible. So I just don't buy "exceptional results" here.
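
If you want the arithmetic spelled out, here's the back-of-envelope version. Note that reading their published percentages as cumulative shares of all applicants is my assumption:

  # Back-of-envelope math on Toptal's published funnel numbers, read as
  # cumulative shares of all applicants (an assumption on my part).
  pass_english = 0.264            # 26.4% clear the basic English screening
  pass_skills_cumulative = 0.074  # 7.4% remain after the skills step
  # Conditional pass rate of the skills step, among those who reached it:
  skills_step_rate = pass_skills_cumulative / pass_english
  print(f"{skills_step_rate:.1%}")  # ~28.0%, which is not "exceptional results only"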

> Each candidate is interviewed by Toptal screeners who are experts in their functional domain.

Again, vague. Interviewed for how long? With what criteria? How are you determining that they are "experts in their functional domain"?

> Each candidate is assigned a test project to evaluate whether they can "walk the walk." Test projects take 1-3 weeks, are comprehensive, and provide real-world scenarios for candidates to demonstrate their competence, thoroughness, professionalism, and integrity.

This part looks better! Until you look at the numbers next to it, which indicate a 90+% pass rate for this phase (and still only a roughly 10% cumulative pass rate among "people who cleared basic English language skills").

It's easy to say you're rigorously assessing people. But when they're so evasive about the details, I get suspicious.

-----

I'm not saying Toptal isn't better than nothing. I'm not an expert on international hiring, and they seem to be doing pretty well as a business. And this vagueness may very well be a feature, not a bug - my verbosity gets me into trouble sometimes when I write copy. In fact, this very post demonstrates a good reason NOT to be specific, because armchair experts will pick your exact numbers apart without the appropriate context. (I justify myself by saying that this is, in fact, my area of expertise.) But if you want my snap take, there it is.

Toptal people reading this thread, feel free to drop in with details if you've got 'em!


I used FastTrack in the early days and loved the product.

The problem was that their question bank was too small, so it was easy for anyone motivated to cheat. I spent under an hour reading reviews of what questions were asked, prepared solutions to those questions, and then faced no surprises in the interview. I merely regurgitated my prepared solutions. One of them was making a tic-tac-toe game.
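
For flavor, here's a minimal sketch of the kind of canned solution you could walk in with, assuming the prompt was the classic "implement tic-tac-toe with a win check" (the details here are hypothetical, not the literal Triplebyte question):

  # Hypothetical sketch of a prepared "build tic-tac-toe" answer.
  class TicTacToe:
      def __init__(self, size=3):
          self.size = size
          self.board = [["." for _ in range(size)] for _ in range(size)]
      def move(self, row, col, player):
          # Place "X" or "O"; return True if this move wins the game.
          if self.board[row][col] != ".":
              raise ValueError("square already taken")
          self.board[row][col] = player
          return self._wins(player)
      def _wins(self, player):
          n, b = self.size, self.board
          lines = [[(r, c) for c in range(n)] for r in range(n)]    # rows
          lines += [[(r, c) for r in range(n)] for c in range(n)]   # columns
          lines.append([(i, i) for i in range(n)])                  # diagonal
          lines.append([(i, n - 1 - i) for i in range(n)])          # anti-diagonal
          return any(all(b[r][c] == player for r, c in line) for line in lines)
  game = TicTacToe()
  game.move(0, 0, "X"); game.move(1, 1, "X")
  print(game.move(2, 2, "X"))  # True: X completes the main diagonal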

If employers knew how easy it was to pass the Triplebyte exam then I wonder how many would have allowed candidates to skip to onsite.


This was an issue, but less of one than you'd probably think. I estimate that roughly 10% of people we vouched for were cheating. 10% is certainly not zero, and it occasionally caused trouble with clients, but 10% also kind of disappears into the general noise of different hiring processes and steps, to the point that that estimate could be off by a factor of 2 or 3 in either direction.

There's also the sort of sad fact that even prepping that much puts you far ahead of the average candidate. We TOLD candidates what to expect (and we do at Otherbranch, too), and the overwhelming majority don't even read that - which requires no active searching, no moral compromise, and no pre-knowledge of our process.

Triplebyte did, late in the history of FastTrack, start rotating problems a bit to address that problem.


I remember taking the quiz for this, passing, and then scheduling the interview.

The first question was to code an app just like Excel. I totally blanked. It was my first experience ever coding live with an interviewer watching me. When I say I blanked, I mean I could not even remember how to write a class.

Was a unique experience and I definitely learned a lot from it. The interviewer was great and very understanding. Made me appreciate how hard a tech interview could be. Interviewer was a good dude though, hope he is doing well.


The problem with standardized hiring is that it becomes too easy to game the process. You end up with a pool of engineers who are good at studying, which doesn't always translate into good engineering.


> The idea was this: if higher-education and prestigious experience weren't reliable sources of quality, and if you could make an ML model that could identify it better through testing

The problem is that in reality those two sources of data are really reliable. YC itself is mostly composed of people (over 90%) from prestigious schools and organizations.

It’s rare to see anyone accepted to YC, who was from a mediocre or unknown institution for both education and work experience.


Not all (or even most) people-problems can be solved with software. Recruiting/hiring is a pure people-problem. No two people are the same. No two groups of people (i.e., companies) are the same. It's a "problem" that everyone has felt, but it honestly seems like an unsolvable problem... and "problem" most likely isn't even the right word - it's more of an uncomfortable reality.


> (I won’t talk about Magnet much here, because the proximate reasons for its failure are pretty simple: we ran out of money and the market fell out from under it. It might have - smart money says probably did have - other means of failure, but we didn’t last long enough to find them in any great detail.)

This has the same voice as patio11. Did he start this style? Is it new? Who else talks like this?


I liked this piece - excellent high-level thinking through the available market opportunities, especially given the constraint of venture capital. And the moral framework is down-to-earth: nothing wrong with ambition and risk even when things go south, paired with a buck-stops-here attitude.

I'd bet good money OtherBranch will succeed.


Rachofsunshine: for what it's worth, I appreciate that there are people who insist on trying to change the market, rather than doing what the market wants.

I did some early interviews with TripleByte and appreciated they were trying something different.

It's just a different kind of effort, and it's hard to change people's minds with just money. Often a grassroots movement arises that slowly changes consensus over years, and only then can some well-funded player come in, see which way the trends are going, and capture the value. My current startup is so remote we had to move Friday meetings to Thursday to accommodate Japan and Oceania. In my experience, the Silicon Valley practices of the 2000s and 2010s already seem antique.

So maybe TripleByte was just early? Or, as the post mentions, maybe the funding environment was wrong.

We are often told that a loose funding environment encourages new ideas to prosper. But whatever happened in the 2010s with ZIRP seemed to concentrate wealth for people who had already gotten through various gatekeepers. I am sort of in the Silicon Valley ecosystem and sort of not, working remotely for American companies, so this stuff just feels like a distant thunderstorm to me. I can see new graduates getting my salary or higher for doing very basic, mistake-laden work, and I can't figure it out. But this helps put it in context.

Anyway, perhaps a downturn is a better time to start trying to change minds, although that's going to be a very long battle and the TAM is going to be "???" for quite a while.


What I remember of Triplebyte was, out of nowhere, omnipresent and inauthentic advertising that made it very clear someone was spending a lot of money to target me.


What did triplebyte do that interview.io doesn’t? (Haven’t used either, but one is still around and the other sounds like a major loss to society)


They lied about the core value prop. Their qualification process was nothing more than another entry point to the standard interviewing regime.


Why does there have to be a reason? Some things just fail and there is nothing we can do about it.


It was a nice jacket they were giving out though.


Yet another Triplebyte placement chiming in. The FastTrack product worked incredibly well for me in 2018. At the time, I had just over a year of full-time SWE experience but was struggling to get any traction, being still relatively junior and without a degree; my open-source work wasn't cutting it. Meanwhile, the '90s legacy tech company I had ended up at was proving to be a professional dead-end.

I took Triplebyte's test, did their interview, apparently did pretty well (at that point in time, they weren't showing candidates their own scoring) and over a period of about 2 weeks chatted with 14(???) different companies. I narrowed it down to three onsites and ended up with two strong offers, one of which I took.

I'm no longer in that role, but I now work as an infosec-focused SWE at a prominent Bay Area fintech. I get to lead work I care about, and get compensated in a way that I couldn't have dreamed of ten years ago. Triplebyte made that possible for me in a very concrete and permanent way. My resume in 2018 was the kind of thing a lot of hiring managers would've tossed; not any more.

The loyalty I feel towards a company that's no longer around makes me wonder if Triplebyte could've worked on a longer timescale. I'm now the kind of candidate they were trying to acquire via Screen, and if they still existed, they'd have a very good shot at recruiting me on behalf of their clients.


> The loyalty I feel towards a company that's no longer around makes me wonder if Triplebyte could've worked on a longer timescale.

Yeah, I started a company basically because I got tired of wondering about that.


I'm hoping someone will humor me:

At the beginning the author states the "shorter version" is at https://www.otherbranch.com/blog/rebooting-something-like-tr...

Then in this shorter article:

> But maybe we can fork it. (In the Git sense, not that one!)

With a wikipedia link to the Good Place TV show: https://en.wikipedia.org/wiki/The_Good_Place

I don't get it though, what does The Good Place have to do with any kind of 'forking'?

@rachofsunshine: what the heck?!


In the show, the characters can’t swear (non-spoiler: because they’re in the Good Place, and nobody can swear there) so they say “fork” and “shirt” and so forth.


Ah, got it. Thank you kindly, philsnow.


There's a funny contrast between the beginning where they want a meritocracy of candidates and the end, where they are only interested in Americans from prestigious universities.


Interesting you should say that. I'm neither American nor from a prestigious university, though I had a lot of success with Triplebyte towards the end (as a candidate).


I've been at this business for a few minutes and can say it's not that hard to identify good candidates.

1) a native and active interest in computing, 2) good English, and 3) good (enough) social skills.

A good software team is built like a good NBA team:

1) A limited number of exceptional performance contributors - not many of these, by definition; they're the right tail of the normal curve.

2) Good support players with specific skills and the personality to allow your superstars to maximize their potential.

Everyone seems to be looking for 1) in the interview process because that's where the gold is, but the 2)s are extremely important for getting the 1)s in the door and working at their best.

Finally, you need to avoid

3) Frictional team members, i.e. negative work. These are people who inject drama into the team process that contributes nothing to the outcome. A passionate team member who takes hard stances at times is a good thing, even if they're not always correct, because they can head off a bad outcome early in the process. But someone who just causes drama and turmoil, or even simply breaks the team's flow state, is a big problem.

A team can easily be spoiled by 3) even when you have the perfect superstar and supporting cast, because the focus turns from the product to the emotional drama, which is a huge drain.

This leads to a tangential but important point: the current hiring mandates of modern practice are potentially catastrophic for software development teams. Inserting people onto a team who become 3)s will kill a product and drive away your highest-value contributors. Superstars in this field don't want drama; they tend to be introverts who are happiest quietly working on their own.

It then follows that meeting modern-day hiring protocols often means pushing the equity component into the soft-skills side of the company, which means your hiring pool for marketing, HR, and project management is narrowed to fulfill various hiring mandates. This leaves a lot of experienced, valuable people - people who understand good software development because they've done it - on the sidelines because they have the wrong physical characteristics. The 60-year-old pushed out of the tech side over an outdated skill set would probably make an excellent project manager, because they understand the SDLC from both a technical and a personal perspective. Who better to mentor young developers?

But those roles often go to young college graduates who don't have that experience and end up as glamorized secretaries, because they lack the formative passion for the field.

The NBA superstar/support model works in software development. Historically, it's the only thing that creates high-quality, high-impact outcomes. It's the desire to reconfigure the schema into a social-change experiment via mandated hiring that causes the complications.


I am not very familiar with Triplebyte, but I really appreciated this (seemingly) candid yet lucid writeup of how it all went down from the inside.

These kinds of insider perspectives, even if they come with their own biases, are very interesting nevertheless.

I wish I would see these types of articles more often.


What I haven't seen mentioned in the comments is that Triplebyte took so much funding only to end up with a database of 250k users/engineers.

Yeah, at that scale bootstrapping would've been the way to go.


Go look at the resumes of everyone in a leadership position at that company. Find me the person who has ever run a recruiting desk. You can't do it. Every engineer has an inner know-it-all recruiter who just instinctively knows not only how to recruit, but how to run a successful recruiting business.

Which is why almost none of them have ever been successful at building a recruiting company. Because recruiting is a people problem, not a technical problem, and people problems are vastly harder to scale.

If you don't believe me, go talk to third-party technical recruiting firms and ask them what the backgrounds of their top contributors are. Unless they're being spoonfed a sweetheart deal by a former employer, virtually all of these people can barely operate their own PCs, much less know anything about engineering. I used to be a recruiter. The three most successful third-party technical recruiters I know (with incomes in the 300k+ range in Texas, not SV) are a high school dropout, a college dropout, and a theater major. These people know how to communicate with other people, are as nice as can be over the phone, and will fight you over a stray nickel.

Being great at recruiting is about being great at building relationships, and relationships are built on trust. Clients need to trust that the recruiter can find candidates and convince them to sign on when the time comes. Candidates need to trust that the recruiter is motivated to find them a position. As long as the effort is there, the better the recruiter balances that trust factor, the more money they make.

Triplebyte sucked at the people game. They sprang public, opt-out profiles on people and couldn't figure out why that would be such a big deal to potential job seekers. They assumed their process was so great that experienced devs who had jobs and lives and lifestyles and families would just jump at the chance for impersonal coding evaluations. They started a company with no idea how to recruit and hired people with no background in recruiting to build a recruiting product. And no, founding a company does not confer recruiting expertise, even if the founder is doing the hiring. It is both a talent and a skill.

Oh, and here's a fun one: they failed to recognize that along with being a people game, recruiting is a numbers game. Being highly selective in your pool of candidates is a genius way to run out of money as a recruiting firm, especially when you've already chosen to place numerous self-own barriers in front of you.

The average third-party technical recruiter with a year of experience actively maintains and prunes their own database of 300-400 candidates. More than that is too many people to manage. You can either pay a bunch of engineers to attempt to automate a small part of their process, or you can simply hire people to do the work and pay them according to what they produce.


TL;DR: Making profiles public-facing, with an opt-out instead of an opt-in, was a footgun moment.


Yup. Easy to destroy trust.


It absolutely is, and it's something I am very much aware of.

I considered including something about that incident (for those of you coming in with less context, they're referring to this [1]) in the article, but I didn't, for two reasons.

One, I sincerely do not think that was actually fatal (it was, at worst, a symptom of the same forces that forced a pivot). And two, the best thing I could say about it in terms of going forward is "no but seriously I'm not going to do that", which _I_ know is true but you obviously don't (and if you trust me when I say that, well, I don't think you're cynical enough). Every time I tried to write about it in the early draft versions of this post, I found myself coming back to saying both "you're not cynical enough and you shouldn't trust a damn thing I say if it doesn't cost me something" and "but look how I'm actually different". And that just seemed wrong to write.

It's hard to be an honest cynic in a world of distrust. Because I want you to trust me, because I think I am worthy of that trust, but I also think the average person is nowhere near cynical enough about how easy it is to fake sincerity and want to scream that fact from the rooftop.

----

[1] https://news.ycombinator.com/item?id=23279837


The fact that you chose not to address this incident is exactly everything we need to know about why Triplebyte failed.


That doesn't make any sense.

How could addressing that single incident help explain Triplebyte's failure to find product-market fit?

Privacy violations sure haven't stopped any other successful corporation I can think of from finding product-market fit.


Their product was a two sided market. Two sided markets are always hard, but it doesn’t get any easier if one of the sides (job candidates) loses trust in the platform. Glassdoor is finding out the same thing.

If you’re looking for examples where privacy violations hurt a company’s chances for success, look at Instagram for Kids. It was cancelled largely because the kinds of parents who would even consider such an app don’t trust Meta.


What does it tell us exactly?


Because "this dev is 3/5 on our JavaScript assessment" is useless information, and it subtracted value from everyone who interacted with it.


I always find it strange that people who talk about tech interviewing inexplicably overlook what seems to me to be a core defining characteristic: they are highly traumatic. You take some poor bastard and have him struggle at coding puzzles in front of someone he very much wants to impress, and then watch as he fails miserably. They are left feeling like they are biologically inadequate for their job. It's a direct assault on egalitarian sentiment - a load-bearing pillar of civilization - even if it is more or less a noble lie.

By definition, they only take the top 1%, and 99% of people get to eat shit. Inspiring existential resentment in the vast majority of people who interact with you is obviously not a recipe for good karma.


Stress I'll grant you. But what's your alternative to selectiveness? Is your expectation that companies, who are spending a substantial amount of money to hire someone, should not try to hire the best person they can get for their buck?

Like, I'm enough of a leftie to agree with the idea that one's ability to contribute in the workplace shouldn't determine your ability to live a decent life. But that doesn't mean companies should hire someone who can't do the job, it means being unemployed shouldn't be a virtual death sentence, which isn't fundamentally a problem of hiring processes.


I do not have an alternative and agree that this type of screening is necessary.

I just think the fundamental problem here is not procedural as the post seems to suggest - but rather social-psychological. Making the experience less painful to the losers is the key problem to solve.

That would fix the candidate pipeline problem because people would be less terrified of failure.

I don't know how to solve it.

To quote Leonard Cohen:

  It's coming from the sorrow in the street
  The holy places where the races meet
  From the homicidal bitchin'
  That goes down in every kitchen
  To determine who will serve and who will eat
I do not envy anyone in the position of making this determination!


Yeah, the psychology of it is rough. Mental health is something I care a lot about, because in another life it damn near killed me [1], and it's something I plan to write a lot about, at the very least. I'm not sure I have much of a concrete solution, though.

[1] https://news.ycombinator.com/item?id=40493572


I remember once I worked at a company that was being acquired by another. As part of the screening we all had to go over to the other company to do an algorithms interview. Everyone - including my boss. Our company had a pretty softball interview process and most of our engineers hadn't been through a real gauntlet before.

I knew what to expect, had practiced these things, and made it through. I tried to warn my colleagues that they were in for something a lot more difficult than what they were expecting. But yeah, as you might expect, almost everyone failed.

I remember my boss talking about it, shaking his head slightly, his mouth screwing up into that familiar chagrined smirk so many people get after performing poorly at these things. I told him that these technical interviews are purposely difficult, that most people fail. That it's much better for them to miss a good candidate than hire a bad one. That failing is normal. I could see some of the tension in him subside after I said that. I repeated, "It's normal." He calmed down some more. The word "normal" seemed to help a lot.

I wonder if bringing in elements from psychotherapy might help a surprising amount here. I've found that software engineers highly value rational thought - to a point that they neglect the emotional side of things. A little development of their softer side can go a long way.

Like, having pre- and post-interview counseling sessions with a therapist would maybe be a little absurd. But maybe something along those lines would work. Maybe GPT-4 could do it.
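
Purely as a hypothetical sketch of that last idea, using the OpenAI Python client - the model name and prompt wording are placeholder assumptions, not a real product:

  # Hypothetical sketch only: a post-interview "debrief" via the OpenAI
  # Python client. Model name and prompt wording are placeholder assumptions.
  from openai import OpenAI
  client = OpenAI()  # reads OPENAI_API_KEY from the environment
  response = client.chat.completions.create(
      model="gpt-4",
      messages=[
          {"role": "system",
           "content": "You are a supportive post-interview counselor. "
                      "Normalize failure: most candidates fail technical "
                      "interviews; a rejection is not a judgment of worth."},
          {"role": "user",
           "content": "I blanked on the coding question and feel awful."},
      ],
  )
  print(response.choices[0].message.content)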


None of this is a surprise to me, but the specific framing of trying to build these things into the interview is really interesting. It's already something I wanted to write about a lot, but it hadn't occurred to me to put it specifically into the process.

Honestly, having mental health people writing some content doesn't sound like a bad idea at all. I'm sure they hear it all the time; work is one of the biggest stressors in most people's lives.

I'm gonna think on this.



