
Byteboard assesses for on-the-job engineering skills - ikarandeep
https://www.blog.google/technology/area-120/byteboard-interview-measures-for-essential-engineering-skills/
======
tptacek
This replaces "pre-onsite interviews"; in other words, it looks to fit into
the same spot as HackerRank --- a hurdle you clear for the privilege of
enduring a company's interview gauntlet.

The serious, playing-to-win version of this approach replaces _most of the on-
site interviews_ , and generates results that are more trustworthy than those
interviews. It takes less time from candidates, reduces inconvenience, and yet
is cheaper and faster to run for the company. I'm still amazed more people
haven't figured this out and run with it.

~~~
opportune
I don't think most companies want to relinquish that kind of control to a
third party, due to the potential for abuse.

~~~
komali2
I'm not sure it's abuse so much as that, generally, a hiring manager (i.e. a
team lead or senior engineer) can't confidently say "I know how to interview
people well."

I say this because I think the skillset required to "choose a good engineer
among thousands of applicants" is quite different from "lead a team of
engineers to develop a product."

Some are better than others, but point being, if one can't say "based on
research, this is a good way to interview," why would you risk changing what
seems to work? Why would you risk offloading your "good enough" strategy onto
another company that, to you, probably is just guessing as much as you are?

~~~
stcredzero
_a hiring manager, i.e. a team lead or senior engineer, can't confidently say
"I know how to interview people well."_

Isn't data a potential solution to this? One issue is that the best data would
have to be gathered over many years, not just a few.

~~~
yomly
Call me old fashioned, but I'm unconvinced about data's role in defining "good
candidate," although if anyone could do it, Google is the perfect storm of data
know-how and applicant pool scale.

Fundamentally, it is a human evaluating another human. Process and rubric can
assist in eliminating bias, but since human interaction is complex, a bit of
pattern matching and woolly experience goes a long way in these domains.

Let's also not forget that things that interpret data are subject to bias too,
as they are built by humans - see Amazon's case, where resume filtering was
accidentally sexist.

~~~
JamesBarney
There are mountains of data on interviewing and hiring (see industrial
psychology), and none of it shows that human evaluation is the gold standard.
In fact, the research shows that pretty much anything you do to reduce the "je
ne sais quoi" of hiring is going to improve results.

------
civicsquid
The post seems to say a lot about how Byteboard will provide a more
insightful, less biased experience for recruiters through a project-based
interview process. But I don't see any concrete examples of what that looks
like in the post. I looked at the website linked
([https://byteboard.dev](https://byteboard.dev)) and didn't find anything too
specific there either.

Does anyone have an example of what a Byteboard interview looks like, maybe a
sample project? I feel like there are some details missing here about what
makes Byteboard as helpful as the article says it is.

~~~
dang
Yes, the post includes remarkably little detail. TechCrunch has marginally
more: [https://techcrunch.com/2019/07/17/googles-area-120-launches-byteboard-to-improve-technical-interviews/](https://techcrunch.com/2019/07/17/googles-area-120-launches-byteboard-to-improve-technical-interviews/)

~~~
civicsquid
Nice find, at least there are some screenshots that give us an idea of what
this looks like.

From what I can see, it's a lot like the type of content I've experienced
smaller companies asking, which I'm for because it does require some amount of
open-ended engineering that fits into what applicants would be expected to do
every day. There's a made-up feature that's required, and you discuss how to
implement it (and then actually implement parts of it), that sort of thing.

What I think is still a bit disappointing is that this is being pitched as a
fix for screening, not for the actual on-site interviews, where the
memorization questions usually come into play anyway. It feels a bit
disingenuous and contradictory.

------
singron
> Byteboard evaluators—software engineers with up to 15+ years of experience

So evaluators have literally any level of experience? Can we, as a society,
please agree to stop using this actually meaningless "up to X or more"
language?

I get that they want us to have the impression that the evaluators have
experience, but it seems like they weren't confident enough to commit to any
real statement to the effect. E.g. most evaluators have at least 10 years
experience. Even "at least 1 evaluator has 15 years of experience" would be a
stronger statement, although it would be more obviously empty.

~~~
stronglikedan
Hell, a person's first day on the job nets them up to 15+ years of experience!

------
abvr
As always, they're anchoring the already broken interview system by introducing
even more wacky gimmicks into the process. Let's hope this one turns out to be
a refreshing sign that they are looking for ways to improve both the process
and their own standing in the industry when it comes to recruitment, since
giants like them seem to lead the way on this.

However, until they actually test it out for themselves, neither they nor the
talent pool will have a clear idea of whether such practices improve the
hiring process and help source the right people.

But then, maybe interviewing and recruitment are bound to produce some false
positives and negatives. That is fine, given how enormous the pool is, and a
candidate may in the future fill in their knowledge gaps and establish
themselves. Companies should provide adequate support and morale boosts for
such employees, the ones who strive for better knowledge and skills, given
enough time and backing.

~~~
rvz
Well, after looking at yet another attempt at solving the 'broken hiring
process', it seems like this one is going to end up in the same situation as
the relevant xkcd comic: [https://xkcd.com/927/](https://xkcd.com/927/)

Solving the hiring problem is probably tougher than the leetcode puzzles the
candidates get. Surprisingly, not even Google can solve it.

------
tylerlh
I don't follow. This is developed and being run by Google, but it doesn't seem
Google is attempting to use this to improve their own interviewing processes.
Is it implied in this blog post somewhere? I would be really excited to see
this in practice.

~~~
opportune
I think they are A/B testing it internally.

~~~
tylerlh
Probably worth holding out on them until the "productized" version then. :)

------
heyheyhey
Looks like this mainly replaces the pre-onsite or phone screen interviews?

Sure, it makes things easier for the interviewer since they'll get better
candidates but does this change anything for the interviewee?

They'll still have "to find time to comb through my college computer science
books, practice coding theory problems like implementing linked lists or
traversing a graph, and be prepared to showcase this knowledge on a
whiteboard." All of that is still fair game during the brutal onsite interviews.

~~~
voodootrucker
The really frustrating part is that linked lists are actually bad practice in
most instances nowadays [0]. And skiplists are better than trees for many in-
memory uses [1].

Not that you would usually implement such a thing yourself anyway. So these
leetcode-style interviews are actively selecting for people who practice
40-year-old CS solutions that don't work well on modern hardware, and selecting
against folks with real-world experience.

[0] [https://www.youtube.com/watch?v=YQs6IC-vgmo](https://www.youtube.com/watch?v=YQs6IC-vgmo)

[1] [https://ticki.github.io/blog/skip-lists-done-right/](https://ticki.github.io/blog/skip-lists-done-right/)
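The performance claim is easy to demo, even in Python. A hedged sketch: CPython lists are arrays of pointers, so this mostly shows pointer-chasing and attribute-access overhead rather than pure cache effects, but the direction of the gap is the same one the linked videos describe:

```python
import timeit

class Node:
    """Minimal singly linked list node."""
    __slots__ = ("value", "next")

    def __init__(self, value, next=None):
        self.value = value
        self.next = next

N = 100_000
array = list(range(N))            # contiguous storage

head = None
for v in reversed(array):         # build the linked list 0 .. N-1
    head = Node(v, head)

def sum_array():
    return sum(array)

def sum_linked():
    total, node = 0, head
    while node is not None:       # chase pointers node by node
        total += node.value
        node = node.next
    return total

assert sum_array() == sum_linked()
print("array :", timeit.timeit(sum_array, number=20))
print("linked:", timeit.timeit(sum_linked, number=20))
```

The array traversal is typically much faster here; in a systems language the gap comes from cache misses on scattered nodes rather than interpreter overhead.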

~~~
ben509
> The really frustrating part is that linked lists are actually bad practices
> in most instances nowadays

"Fancy algorithms are slow when n is small, and n is usually small." -- Rob
Pike

Cons-based linked lists continue to be the container workhorse of MLs. In
imperative languages, arrays are preferred because allocating a chunk of
memory is even simpler.
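To make the contrast concrete, here is a minimal sketch, with Python tuples standing in for ML/Lisp cons cells: prepending to a cons list is O(1) and shares the tail, which is why MLs lean on it, while imperative code gets its simplicity from one contiguous allocation instead.

```python
# nil is None; a cons cell is a (head, tail) pair
def cons(x, xs):
    return (x, xs)

def from_iterable(items):
    """Build a cons list with O(1) prepends (so we iterate in reverse)."""
    xs = None
    for x in reversed(items):
        xs = cons(x, xs)
    return xs

def to_pylist(xs):
    """Walk the cells back into a plain Python list."""
    out = []
    while xs is not None:
        head, xs = xs
        out.append(head)
    return out

xs = from_iterable([1, 2, 3])
ys = cons(0, xs)                  # O(1) prepend; xs is shared, not copied
print(to_pylist(ys))              # -> [0, 1, 2, 3]
```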

> So these leetcode style interviews are actively selecting for people who
> practice 40 year old CS solutions that don't work well on modern hardware

These claims are extremely common, so I'm not being snarky here, I want to
understand your reasoning.

You're doing the interview, they're gathering data, then discussing it in a
meeting:

What do you think is being measured?

Can you describe a utility function they are applying to that measurement?

> Not that you would usually implement such a thing yourself anyway.

But engineering isn't like shift work. The complexity of the tasks you
encounter is likely Pareto distributed. Thus something like 20% of the tasks
you work on will justify 80% of your pay.

~~~
voodootrucker
> What do you think is being measured?

In a leetcode style interview? I think they are attempting to measure aptitude
at solving algorithmic-style problems on a white board. I think they are
indirectly measuring ability to perform under pressure, ability to vocalize
while thinking, ability to function in unnatural environments, and either time
since graduation from a CS major, or time spent practicing / gaming the system
on leetcode (the premium feature even lets you see which questions your
employer will likely ask).

I think they should be measuring things like: ability to select good
frameworks for a solution, ability to reason about how such frameworks are
implemented at an algorithmic level, when and where to add an index or avoid a
shuffle, how to debug, ability to write readable code, ability to coach more
junior devs, ability to learn from more senior devs, etc.

> Can you describe a utility function they are applying to that measurement?

I'm not sure I can. I see this as a very multi-dimensional problem, so I think
the utility of given skills would have to be considered in regards to the
existing skills within the team.

Honestly, I feel like "software engineer" has been split in two: R&D is done
within the FAANGs (and maybe a few universities) and the results are open-
sourced. Outside the FAANGs, something more like "software development"
happens: FAANG libraries are glued together. _Very_ occasionally, you get a
smaller company that has a legitimate business reason to write their own
library or database or whatever, and an engineer is lucky enough to convince
them of that.

I'm also not trying to be snarky - I've been stuck on the latter side for most
of my career, because I prefer to work in smaller companies, and as such have
not had as much opportunity to practice "interesting code" as I would have
liked. Perhaps this is due to my market (Denver), but I've also worked with
companies in the bay, and I've seen a tremendous amount of over-engineering
when off the shelf parts would have worked.

So I'm legitimately curious:

1. Am I interpreting your questions correctly in that you think the DS & algo
whiteboard interview selects well for desired attributes?

2. Do you solve those kinds of problems on a day-to-day basis?

3. If so, do you work at a FAANG?

I don't mean to pry, so feel free to disregard any of those questions if you
are uncomfortable answering them on a public forum, or message me privately.
Thanks!

------
vinceve
For software engineers, we ask them to bring a piece of code with them that
they wrote. It can be anything. At the interview they first show what it does,
and then we go through the code together. We ask some questions about why they
implemented it a certain way. When doing that, you can already see how people
think and whether they wrote it themselves. It's also pretty interesting to
see what keeps other people busy. It removes part of the anxiety because they
have written it themselves.

~~~
vector3-dot-no
> It’s pretty interesting also to see what keeps other people busy.

So do I bring my dog and wife and books and video games and...? Cuz between
the job and that stuff I don't have time (or energy) to make anything at home.

I mean, I've been wanting to learn Rust, and ML, and containers/cloudy
things... I have some ideas for side projects... I just don't have time and
usually want to do the opposite of "think real hard to make good code" in my
off time.

------
gigatexal
Maybe I am just too junior of a programmer (after all these years...) to admit
it to myself, but I perform so terribly in technical interviews. Seemingly
simple things I blank on. The stress is palpable and I can't function. I think
with time and practice I'll get better. Hopefully.

------
wgerard
So many feelings on this. My co-founder and I worked on this problem for
a while ([https://www.headlightlabs.com](https://www.headlightlabs.com)) before
pivoting to something else.

Truthfully, I came away a bit jaded about the whole culture around technical
interviews and whether most companies actually have a genuine desire to change
their interview process.

Still, we both really want to see something like this succeed. My current bet
is still on [https://www.woventeams.com/](https://www.woventeams.com/), but
I'm very interested to follow this as well.

~~~
jordanmorgan10
I guess my hurdle with these kinds of things, whether they are "outsourced" to
a third party or done in-house, is that they tend to still focus on the
"trick", quiz, or algorithms parts of engineering. It's a personal bias, as
I've never been particularly strong in that area, but I can build products.
So, for me, I find these kinds of interviews hard. I really do enjoy "take
home" interviews, and it seems that Byteboard is at least trying to key in on
that in some way: "focusing on engineering skills that are actually used on
the job, Byteboard allows candidates to confidently show off their
role-related skills in an environment that is less performative and more
similar to how they typically work as engineers."

~~~
wgerard
> is that they tend to still focus on the "trick", quiz or algorithms parts of
> engineering.

FWIW, this was exactly what we were _against_ at Headlight, as is Woven and
(presumably) Byteboard as well.

There are plenty of practical interview questions to ask candidates, but it
requires a lot more thought and preparation than I think most companies want
to do.

The reason (imo) that quiz/trick/etc. type questions are popular is that
there's a clear "right answer" most of the time. No nuance required.
Introducing nuance requires humans to reason about things, and that takes up
time (and thus money).

~~~
jordanmorgan10
For sure, there is definitely a valid perspective behind both approaches. To
be fair, at the FAANG companies the computer science is probably more
important than at other places.

The irony for me is that when I did encounter these questions, I already knew
the answers from research (i.e. the egg drop question, how you would build an
M&M, etc.).

------
notafrog
Personally, my best interview experience has been with Automattic. Having a
text-based interview on Slack, where I could take the time to collect my
thoughts, removed my anxiety almost completely.

~~~
overthemoon
God, that's exactly my speed. Did you end up getting the job? I have terrible
interview anxiety and that would be ideal for me.

~~~
notafrog
Yes, I did. It's a bit lengthy, but most definitely the best process I've been
part of. You will find lots of posts from other people regarding their
interview processes. Those helped me a lot during mine.

------
wufufufu
This is coming from a company with one of the worst interview/recruiting
experiences. I had to do two separate onsites just to determine what level I
was at, then >5 team-matching interviews. I finally went with a different
company.

~~~
decebalus1
Did you at least know if the compensation was ok before going through all of
that crap? I have a friend who went on two separate onsites only to get an
offer which was less than what he was making at his company.

~~~
wufufufu
Google's compensation would be about the same or higher adjusted for COL. I
was honestly willing to accept lower compensation to work at Google. I was
just so frustrated by the end of repeatedly being turned down by EMs that I
went with a company and manager that seemed to value me more.

~~~
papln
What's an EM? Engineering Manager? What was "turned down"? You mean during
team-matching?

You described one (long) interview cycle, with apparently an offer (since you
seem to know the compensation) that you declined.

------
hysan
It's interesting how there are so many different solutions and even entire
companies focused on "fixing" parts of the hiring funnel. Byteboard looks to
be targeting the technical validation part of the funnel which means they
overlap with companies like Triplebyte. However, the approach is vastly
different and seeks to tackle different pain points.

What I wonder though is whose pain points are being alleviated: job seekers or
employers?

While it sounds like this is better for job seekers than technical interviews,
it actually sounds like it will compound the pain. Specifically because this
is yet another method of validation one must pass to even be considered for a
job. Take the various hurdles a typical job seeker must overcome:

1. _Get through the initial filter._

Referrals, networking, and recruitment companies are potential solutions.
Byteboard just tacks on another item to this list that must be checked off. It
does not replace having to do any of these other things.

2. _Pass the technical screen._

Crack open those college textbooks, cram leetcode, shore up your portfolio,
and now, also pass the Byteboard test. Unless _every_ company uses Byteboard
_or_ Byteboard guarantees eventually getting you a job, this again doesn't
replace anything.

3. _Pass the onsite._

Culture fit is culture fit. That one is unavoidable and reasonable. However,
everyone knows that onsites still contain a technical portion. Sometimes a
significant one. Byteboard only replaces the pre-onsite screening portion of
the interview process. That means GOTO Step 2. You still need to study and
pass the onsite technical interview.

The copy reads well initially. Digging deeper though, it starts skewing much
more heavily towards employers:

* employers get a better technical screen (filters better)

* employers save time (no resume filtering + technical screening)

* employers save a lot of time (fewer, higher quality onsites means fewer hours spent by the team interviewing candidates)

Which all leaves me wondering, what is the value proposition to a job seeker?

~~~
jarsin
> what is the value proposition to a job seeker?

The current interview questions from companies are based on tricks (like two
pointers, start at both ends, etc.) that you need to know, memorize, and be
very good at quickly figuring out which one to use.

As a job seeker, I've learned there are tons of tricks and it's impossible to
memorize them all. I for one would welcome anything that tries to end that
shit once and for all, which this claims to be trying to do.
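For anyone who hasn't run into the jargon, the "start at both ends" trick is the two-pointer scan over a sorted array. A minimal sketch on the classic pair-sum question:

```python
def pair_with_sum(sorted_nums, target):
    """Two-pointer scan: O(n) time, O(1) space, vs. the naive O(n^2) pair loop.

    Assumes `sorted_nums` is sorted ascending. Returns a pair summing to
    `target`, or None if no such pair exists.
    """
    lo, hi = 0, len(sorted_nums) - 1
    while lo < hi:
        s = sorted_nums[lo] + sorted_nums[hi]
        if s == target:
            return sorted_nums[lo], sorted_nums[hi]
        if s < target:
            lo += 1          # sum too small: move the left pointer right
        else:
            hi -= 1          # sum too large: move the right pointer left
    return None

print(pair_with_sum([1, 3, 4, 7, 11], 14))   # -> (3, 11)
print(pair_with_sum([1, 2, 3], 10))          # -> None
```

The trick itself is two lines of insight; the failure mode described above is not knowing which of dozens of such tricks a given puzzle secretly expects.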

~~~
hysan
Byteboard doesn't attempt to remove this, though. They claim to try to remove
quizzes and tricks while not actually removing them. There are two things that
must happen for what you say to become true:

1. all companies use Byteboard

2. Byteboard replaces the onsite technical portion as well

If you look at Byteboard's description of their own value proposition
([https://byteboard.dev/how-it-works](https://byteboard.dev/how-it-works)):

> _How does Byteboard fit into our hiring process?_

> The Byteboard interview is designed to replace one or all of your pre-onsite
> technical interviews. It provides you with a strong understanding of your
> candidate across 20+ essential software engineering skills, so you can make
> decisions with confidence about which candidates to bring onsite.

You'll notice that they are only attempting to solve #1. Hence why I spoke
about where in the funnel Byteboard is trying to insert itself. They are not
attempting to solve #2, which means job seekers still have to do everything
you listed. Only now job seekers also have to mentally prepare themselves for
potentially going up against Byteboard's black-box rubric:

> review each anonymized interview for the presence of 20+ essential software
> engineering skills, which are converted into a skills profile for each
> candidate using clear and well-defined rubrics.

Oh and since one of their value propositions is tailored interviews:

> _end-to-end service_ that includes the development of unique questions, an
> interview platform, candidate support, interview evaluation, and skills
> reports

You don't get the benefit of something like Triplebyte (not that I'm sold on
them either) where you only need to be vetted once.

The top comment by tptacek sums up what needs to happen pretty well. As I see
it now, Byteboard is very clearly marketed towards employers and maybe that is
the best first step towards changing the status quo. However, right now, I do
not see any value add.

~~~
jarsin
All I know is that Google etc. basically started this mess, and they need to
be the ones to fix it since everyone follows them.

Sadly, it's not going to happen overnight. I am just glad they are trying, and
honestly I don't see how they could come right out and replace companies'
entire interview processes overnight.

They need to prove to more and more companies that if people can do this
real-world stuff then you should hire them, rather than focusing on trick
questions.

I am already thinking I will prefer companies using this new system, as long
as they stay true to their word and keep it real-world.

------
ronilan
I’ll put it here again specifically because this thing comes from Mountain
View.

 _Employers should be required, by law, to compensate interviewees for their
time at a rate equivalent to the job for which they are interviewing.

(McDonald’s, Goldman Sachs, line cook, VP legal, same law, everyone,
everywhere)_

~~~
astrea
I truly felt this way after sinking a month into a company: multiple phone
interviews, a HackerRank challenge, a full on-site day with more interviews
(all an in-person repeat of the phone ones), and a case study. All for a no
after the in-person day.

------
suyash
Interesting. The day Google starts using this for their own engineering
hiring, I will believe it's worth pursuing.

~~~
oarabbus_
I was going to say, Google is a poster child for problematic technical
interviews.

------
sleepysysadmin
When I did interviews, it had nothing to do with the person. I designed a
standard test: for Windows admins, I would have them run dcpromo, set up DNS,
set up an IP address, and install WSUS on the alternate port. I only had 5
steps on the sheet. I was even upfront that they could ask for any help or
google things, because I wanted to see their troubleshooting process.

The reality is that far more steps are actually required than that. I wanted
to see whether they would go through all the details, ask what all the details
were, etc.

But even more importantly, the entire time I'm talking about video games and
TV and movies and cars. I'm trying to distract them from actually doing the
steps. Even better, as you go, the steps get more and more difficult, until
you make it to WSUS and you literally can't install WSUS successfully. Ask
Microsoft, not me, why WSUS is so terrible.

The fun thing about this interview process is that it's not trivia bullshit,
like "explain the role of a Windows server" or "what is a Windows domain" or
"how do you back up Active Directory" - those are all ridiculous questions
that don't gauge whether someone knows anything about anything.

------
domrdy
Super interesting product. I wonder how they will tackle the plagiarism
problem that a lot of project-based products in this space have.

I also think there is a lot of unconscious bias in the technical interview in
most companies and not a lot of people like to admit it. We have a similar
take on the problem in our own product: [https://medium.com/codesubmit/were-all-a-little-biased-a474969e7de8](https://medium.com/codesubmit/were-all-a-little-biased-a474969e7de8)

The trend is definitely going toward project-based assessments, at least here
in Europe. I think most candidates would prefer that kind of assessment if
companies started to invest time in developing them. Most of the time,
assessments are either a nonsensical mix of brain-teasers or just take way too
long to complete.

------
supratims
I would personally prefer project-based/code-based assessments any day.
Algo-based tests are indeed pointless, and I'm glad other people also see the
point.

------
jorblumesea
I think AMZN and many others already do this. Pre-onsite
leetcode/hackerrank/algo grind.

If I'm understanding this correctly, this is hardly anything new in the space.

------
eli_gottlieb
How is this different from TripleByte, except maybe that it actually requires
more of your time?

------
nfRfqX5n
Amazon seems to be changing things too with "shortened hiring events". Haven't
tried it yet, but it seems to address the same issues.

------
jarsin
Any plans to have practice ByteBoards?

------
TrinaryWorksToo
Does this compete with Triplebyte?

