Hacker News

> if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

This is why I've always been so confused. Why is the software engineering interview wildly different from the traditional engineering interview, where seniors sit down with candidates and discuss how to solve a proxy for a problem the team is currently working through? (That approach has the side benefit of making the interview potentially fruitful even if you don't go with that candidate, though it can be, and sometimes is, abused.) I mean... we all speak the same language, and that isn't standard English... right?




> Why is the software engineering interview wildly different from the traditional engineering interview

I have my personal theory.

1) Top companies receive way more applications than they have positions open, so they standardised around very technical interviews as a way to eliminate false positives. I think these companies know this method produces plenty of false negatives, but the gap between the two (screening out candidates who wouldn't make it versus missing out on great ones) is wide enough that it's fine. It does lead to some absurd results, though, such as someone interviewing to maintain a library being deemed unqualified despite being its author.

2) Most of these top companies grew so fast, and hired so aggressively from top colleges, that eventually the interview process was being built by fairly fresh grads for other fresh grads.

3) Many companies thought that replicating the brilliant results of these unicorns meant copying them. So you get OKR nonsense, and interviews like these.


Yup. And 3) is particularly interesting. Lots of companies actually need to hire people who can get things done and build user-friendly software, yet they thought they needed to hire people who could turn any O(N^2) algorithm into O(N) or O(N log N).
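For what it's worth, the canonical version of that transformation is tiny. A hedged sketch in Python (the two-sum problem here is my illustrative pick, not something from the thread):

```python
def has_pair_sum_quadratic(nums, target):
    # O(N^2): check every pair.
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return True
    return False

def has_pair_sum_linear(nums, target):
    # O(N): remember values already seen in a hash set,
    # and check whether the needed complement has appeared.
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False
```

Whether that three-line hash-set trick predicts someone can ship user-friendly software is, of course, the whole question.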

And even for Google, leetcode has become noise because people simply cram it. When Microsoft started using leetcode-style interviews, there were no interview-prep sites, and later there was at most Cracking the Coding Interview. So people who aced the interview were either naturally talented or so geeky that they devoured math and puzzle books. Unfortunately, we have lost that signal nowadays.


> yet they thought they needed to hire people who could turn any O(N^2) algorithms into O(N) or O(Nlog(N))

And the great irony is that most software is slow as shit and resource intensive. Knowing worst-case performance is good, but what about the mean? Or what you expect users to actually be doing? These can completely change which algorithm you want.
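To make the worst-case-versus-typical-case point concrete: insertion sort is O(N^2) in the worst case, yet close to linear on nearly sorted input, which is one reason hybrid sorts like Timsort fall back to it for short runs. A minimal sketch:

```python
def insertion_sort(xs):
    # Worst case O(N^2), but roughly O(N) when the input is
    # already nearly sorted: the inner while loop exits almost
    # immediately for elements already in place.
    xs = list(xs)  # work on a copy
    for i in range(1, len(xs)):
        x = xs[i]
        j = i - 1
        while j >= 0 and xs[j] > x:
            xs[j + 1] = xs[j]
            j -= 1
        xs[j + 1] = x
    return xs
```

Judging it purely by its worst case would have you never use it; judging it by what inputs actually look like is what real engineering does.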

But there's the long-running joke: "10 years of hardware advancements have been completely undone by 10 years of advancements in software."

Because people now rely on the hardware to do the heavy lifting rather than trying to make software more optimal. It amazes me that even gaming companies do this! The root of the issue is pushing things out quickly, so a lot of software is really just a Lovecraftian monster made of spaghetti and duct tape. And for what?

Like, Apple released the M4 today, and who's going to use that power? Why did it take years for Apple to develop a fucking PDF reader I can edit documents in? Why is it still a pain to open a PDF on my MacBook and edit it on my iPad? It constantly fails and is unreliable, disconnecting despite the devices being <2 ft from one another. Why can't I use an iPad Pro as my glorified SSH machine? Fuck, man, that's why I have a laptop: so I can log in to another machine and code there. The only other things I need are LaTeX, Word, and a browser.

I know I'm ranting a bit, but I just feel like we in computer science have really lost the hacker mentality that made the field so great in the first place (and that brought about so many innovations). It feels like there's too much momentum now and no one is __allowed__ to innovate.

To bring it back to interviewing signals, I do think the rant relates. This same degradation makes it harder for the in-group to recognize its own when there's so much pressure to perform like a textbook. But I guess this is why so many ML enthusiasts compare LLMs to humans: because we want humans to be machines.


Many software programs fail to achieve ultimate efficiency either because the software engineers are unable to do so, or because external factors prevent them from achieving it. I believe that in most cases, it is the latter.


I'd like to think it's the latter because it makes us look better, but I've seen how a lot of people code... I mean, GPT doesn't produce shit code just because it can't reason; it'll only ever be as good as the data it was trained on. I teach, and boy, can I tell you that people do not sit down and take the time to learn. I guess this is inevitable when there's so much money involved. But it makes the interview easy, since passion is clear. I can take someone passionate and make them better than me, but I can't make someone who's in it for the money even okay. You're hiring someone long term, so I'd rather have someone who's always going to grow than someone who will stay static, even if the former is initially worse.

IME the most underrated optimization tool is the delete key. People don't realize it's something you should do frequently: delete a function, a file, or even a whole code base. Some things just need to be rewritten. Hell, most things I write get written several times. You do it for an essay or any other writing; why is code different?

Yeah, we have "move fast and break things" but we also have "clean up, everybody do their share." If your manager is pushing you around, ignore them. Manage your manager. You clean your room don't you? If most people's code was a house it'd be infested with termites and mold. It's not healthy. It wants to die. Stop trying to resuscitate it and let it die. Give birth to something new and more beautiful.

In part I think managers are to blame because they don't have a good understanding, but engineers are also to blame for enabling the behavior and not managing their managers (you need each other, but they need you more).

I'll even note that we jump into huge code bases all the time, especially when starting out. Rewriting is a great way to learn that code! (Be careful pushing upstream, though, and make sure you communicate!!!) Even if you never push, it's often faster in the long run. Sure, you can duct-tape shit together, but patchwork is patchwork, not a long-term solution (or even a medium-term one).

And dear God, open source developers, take your issues seriously. I know there are a lot of dumb ones, but a lot of people are trying to help and want to contribute. An issue isn't a mark of failure; it's a mark of success, because people are using your work. If they're having a hard time understanding the documentation, that's okay; your docs can be improved. If they want to do something your program can't, that's okay too, and you can admit it and even ask for help (don't fucking tell them it already does and move on; no one's code is perfect, and your ego is getting in the way of itself: you think you're so smart that you're preventing yourself from proving how smart you are, or from getting smarter!). Close stale, likely-resolved issues (with a message like "reopen if you still have issues"), but dear God, don't just respond and close an issue right away. Your users aren't door-to-door salesmen or Jehovah's Witnesses. A little kindness goes a long way.


> And the great irony is that most software is slow as shit and resource intensive

You really need those 100x faster algorithms when everything is a web or Electron app.


I’d add another factor to #1: this feels objective and unbiased. That’s at least partially true compared with other approaches like the nebulous “culture fit”, but that impression is also partly a blind spot: the people working there are almost certainly the type who do well with that style, and it can be hard to recognize that other people are uncomfortable with something you find natural.


I would say that it makes the interview process more consistent and documented, and less subject to individual bias. However there's definitely going to be some bias at the institutional level considering that some people are just not good at certain types of interview questions. Algorithm and data structures questions favor people who recently graduated or are good at studying. Behavioral interviews favor people who are good at telling stories. Etc.


Yes, to be clear I’m not saying it’s terrible - only that it’s not as objective as people who like it tend to think. In addition to the bias you mentioned, the big skew is that it selects for people who do well on that kind of question in an unusual environment under stress, which is rarely representative of the actual job. That’s survivable for Google – although their lost decade suggests they shouldn’t be happy with it – but it can be really bad for smaller companies without their inertia.


Yeah I buy this theory.

The problem I have with it is that for this to be a reasonably effective strategy, you should change the arbitrary metric every few years, because otherwise it is likely to be hacked and can turn into a negative signal rather than a positive one. Essentially, your false positives can come to dominate by "studying to the test" rather than studying.

I'd say the same is true for college admissions too... because let's be honest, I highly doubt a randomly selected high school student is going to be significantly more or less successful than one chosen by the current process. I'd imagine the simple act of applying is a strong enough natural filter to make this hypothesis hold up in practice (but see my prior argument).

People (and machines) are just fucking good at metric hacking. We're all familiar with Goodhart's Law, right?


I think (but cannot prove) that along the way, it was decided to explicitly measure ability to 'study to the test'. My theory goes that certain trendsetting companies decided that ability to 'grind at arbitrary technical thing' measures on-job adaptability. And then many other companies followed suit as a cargo cult thing.

If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation? Surely programming ability a) varies over an employee's tenure at a firm and b) is a strong predictor of near-term impact. So I surmise that such companies don't believe this, and that LeetCode therefore serves some other purpose, in some semi-deliberate way.


I do code interviews because most candidates cannot declare a class or variable in a programming language of their choice.

I give a very basic business problem with no connection to any useful algorithm, and explicitly state that there are no gotchas: we know all the inputs, and here’s what they are.

Almost everyone fails this interview, because somehow there are a lot of smooth tech talkers who couldn’t program to save their lives.


I think I have a much lazier explanation. Leetcode-style questions were a good way to test expertise in the past, but by the time everyone starts to follow suit, the test becomes ineffective. What's the saying? When everyone is talking about a stock, it's time to sell. Same thing.


> If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation?

Probably recent job performance is a stronger predictor of near future job performance.


so, having done interviews: just because the latter may be more common does not mean the hordes of people throwing made-up spaghetti resumes at the wall have gone away. our industry has a great strength in that you don't need official credentialing to show that you can do something. at the same time, it is hard to verify what people say in their resumes; they might be lying in the worst case, but sometimes they legitimately think they are at the level they are interviewing for. it was bad before the interest rate hikes; i cannot imagine the situation now that hiring has significantly slowed and a lot more people are fighting for fewer jobs.

i did interviews for senior engineer and had people fail to find the second biggest number in a list, in a programming language of their own choosing. it had a depressingly high failure rate.
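for reference, the question really is a single loop. a minimal sketch in Python (how to treat duplicates is exactly the kind of thing you'd hope a senior candidate asks about):

```python
def second_largest(nums):
    # One pass, tracking the two largest values seen so far.
    # Treats duplicates as separate entries: [5, 5, 1] -> 5.
    # Returns -inf if the list has fewer than two elements.
    largest = second = float("-inf")
    for x in nums:
        if x > largest:
            largest, second = x, largest
        elif x > second:
            second = x
    return second
```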


I had a candidate claiming over ten years of experience who couldn’t sum an array of ints in any language of his choosing.

This wasn’t an off-by-one error or a failure to handle overflow; he couldn’t get started at all.


Ten years of experience at one of those places where every keystroke outside powerpoint is offshored. Why would they know how to sum ints? Some people do start their careers as what could best be described as software architecture assistants. They never touched a brick in their lives, to go with the architecture image.


I have junior and senior students who struggle with fizzbuzz... But damn, are they not allowed to even do a lazy, inefficient `sorted(mylist)[-2]` if they forgot about for loops? That's the most efficient in terms of number of characters, right? Haha.
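And yes, the lazy answer works, with the caveat that in Python it has to be `sorted(mylist)[-2]`, since `list.sort()` sorts in place and returns `None`. FizzBuzz itself, for comparison, is only a few lines:

```python
def fizzbuzz(n):
    # Classic screening question: multiples of 3 -> "Fizz",
    # multiples of 5 -> "Buzz", multiples of both -> "FizzBuzz",
    # everything else -> the number itself as a string.
    out = []
    for i in range(1, n + 1):
        s = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
        out.append(s or str(i))
    return out
```

If a candidate can't produce something in this neighborhood in any language, no whiteboard algorithm question is needed to detect it.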

But I still think you can reasonably weed these people out without these whiteboard problems. For exactly the same reasons engineers and scientists can. And let's be honest, for the most part, your resume should really be GitHub. I know so much more about a person by walking through their GitHub than by their resume.


Using GitHub is discriminatory against people who don’t code on the weekends outside of their jobs, and most people’s job related code would be under NDA and not postable on Github.

To be a capital E Engineer you have to pass a licensing exam. This filter obviously is not going to catch everything but it does raise the bar a little bit.

---

As far as the root question goes, they are allowed to propose that, and then i can try and tease out of them why they think that is the best and if something is better. But you would be surprised at the creative ways people manage to not iterate through a full loop once.


You're right. But a lot of people that are good coders code for fun. But you're also right that not all those people push their code into public repositories. The same is true for mechanical engineers. They're almost always makers. Fixing stuff at home or doing projects for fun. Not always, but there's a strong correlation.

But getting people to explain projects they did and challenges they faced can still be done. We do it with people who have worked on classified stuff all the time. If you're an expert it's hard for people to bullshit you about your domain expertise. Leet code is no different. It doesn't test if you really know the stuff, it tests how well you can memorize and do work that is marginally beneficial in order to make your boss happy. Maybe that's what you want. But it won't get you the best engineers.


Leet code, in the interviews that I do, is not the only thing I do.

But when I am asked to do a one hour part of an interview for four interview loops a week, all the preps and debriefings, and also do all my normal day-to-day deliverables, we need some fast filters for the obvious bullshitters. The interviewing volume is very high and there is a lot of noise.


Hiring people who code at all is discrimination against people who played video games instead of learning to code.


“Codes in their spare time” is not part of the job description, but “codes at all” is.

There are plenty of reasons not to code on spare time. If anything the people who are most likely to do that are often also the people who coding interviews are supposed to be privileging, fresh single college grads.

I don’t know how people would square the statements “take-home assignments are unpaid labor and unfair to people with time commitments” and then do a 180 and say “people should have an up-to-date fresh github that they work on in their spare time.”


If it would take the candidate "spending every waking moment of their lives coding" to have one or two small coding projects after a half decade plus in the field, that's a signal.

If you went to college but never made anything, that's a signal.

If you didn't go to college, and never made anything, just shut up and make something.


In a half decade plus some people pick up other commitments that are not side projects, like a pet, a child, sports, hiking, etc.

At the end of the day, it isn’t really relevant to the employer what is done in spare time off the job when they get hired, so it’s not like I should privilege their spare time projects over people who don’t do that, particularly if people don’t want to code off the clock. There are plenty of good engineers who do not code off the clock, and there are plenty of bad engineers who do.

Also, more often than not, coding off the clock for the sake of having a portfolio is not really indicative of anything. There aren’t, for example, review processes or anything like that in a one person side project, and unless I spend significantly more hours background checking who’s to say the side project is not plagiarized? People already lie on their resumes today.


In the time you took writing this comment you could've gotten the repo created and the title and description filled out. Writing text in a public readme.md would serve you better than sending it to me.


I have side projects, but I don't expect every candidate to, nor do I expect every candidate to be a religious reader of every HN comment thread.


I'm not saying it should be mandatory, but they would have to show mastery some other way. Whiteboard? Live coding? Project?

I think a side project opens up the opportunity to skip that for a project presentation. This is a lot more in line with real life as you would typically code and later present that work to others. You would defend it to some degree, why you made choice A vs choice B. If you created it, you'll be able to do that.

Doesn't need to be a huge thing. Just show you can do anything at all really at the junior level. Intermediates can show mastery in a framework with something with slightly more complexity than a "hello world".


> I'm not saying it should be mandatory, but they would have to show mastery some other way. Whiteboard? Live coding? Project?

godelski said your resume should really be GitHub. You could have said this instead of sarcasm.


Typically the people without side projects make excuses to avoid those too.

If I had a company I'd offer looking over an existing project or a project where you create a side project of your choice without any further direction.

So not mandatory but the easiest way to go probably. Once you apply to my company you'll have one to show for next time at least.

(If you want to write the project out on the whiteboard instead I guess feel free, that seems hard though.)


Many people do not have side projects. Few people working as software engineers were not tested in some way.

I think it's more useful and more fair to give candidates some direction when I request something. What scope is enough? What tests are enough? We must define "side project" differently, or you would expect much more time from candidates than I do.


> What scope is enough? What tests are enough?

How each candidate answers each question themselves tells you about them.


I used to think so. But real tasks have acceptance criteria. Seeing how candidates work with loose criteria has told me more than telling them in effect to read my mind.


It's so clear that people on this site have no friends, family, or hobbies. I also don't think you realize how ableist your comment is.

Some people have more trouble completing tasks for reasons that have nothing to do with talent or intelligence.


this means no. 1) is the right answer


> Why is the software engineering interview wildly different from the traditional engineering interview

One angle is that SWE is one of the very few professions where you don't need a formal degree to have a career. It's also a common hobby among a sizable population.

I think this is truly great. A holdout breathing hole where people can have lucrative careers without convincing and paying off a ton of gatekeepers!

But I also think that when you hire in other industries, you can get much more mileage from looking at the candidate's formal degrees and certifications.

In our industry, you kinda have to start from scratch with every person.


> But I also think that when you hire in other industries, you can get much more mileage from looking at the candidate's formal degrees and certifications.

> In our industry, you kinda have to start from scratch with every person.

Not really - in software people leave a bigger and more easily trackable track record than any other engineering field. From previous work projects/experience to open source projects/experience, from personal projects to the communities a person belongs to. A lot of stuff is directly visible on the Internet. In other engineering fields, you have to trust what the applicant says in his or her resume and maybe at most you can call the previous companies he worked at for reference. In software, a lot of the trail is online or easy to tell, and you can still call.

Even for totally new graduates, it is better in software: it's much easier for a software undergrad to work part-time, build a hobby project, or contribute to open source and produce something before he or she graduates, so you can assess their skills. It's much harder for a mechanical or civil engineer to do that, so there you have to rely solely on the university and the candidate's grades.


> Not really - in software people leave a bigger and more easily trackable track record than any other engineering field. From previous work projects/experience to open source projects/experience, from personal projects to the communities a person belongs to.

That only applies to software people who either (a) are getting paid to work on open source or (b) have enough spare time to work on open source as a hobby after hours. Option (b), in particular, usually implies having no children or other familial responsibilities.


And (b) also implies that programming is a hobby. For a lot of people it is just their work. We can't filter on that; not many would be left to hire.


Nnno. You start from their experience and give the benefit of the doubt. As someone who’s been in software for 12 years, I don’t want to talk about writing algorithms. I want to talk about how to motivate 150 engineers to actually write code correctly or inspire a technical initiative.


To have good comparison/calibration between candidates, you should be asking the same question each time, so it can't be about the "problem the team is currently undergoing", because that's going to be something different every week/month.

In general however, of course, there is/should be a round of interview that covers architecture/system design. It's just that the coding interview is a different interview type, which gives a different kind of signal, which is still important. It doesn't replace architecture interview, it complements it.


> because that's going to be something different every week/month.

Why's that a problem? What you're going to be doing on the job is going to change at the exact same rate. But people also tend to talk about recent problems and those may be even a month old. Honestly, the questions are about seeing how the person would approach it. It is not about solving them, because you're going to be doing things you don't know the answers to beforehand anyways.

> It's just that the coding interview is a different interview type

For what reason? "Because"?


> Why's that a problem?

The first half of the sentence you're responding to answers this question already. Because you can't compare candidates fairly if you ask everyone a different question. Is a candidate who aced an easy question better or worse than a candidate who struggled with a difficult question?

> For what reason? "Because"?

What are you asking? Why is an interview where you ask about high level design different from an interview where you ask to write code? Isn't that like asking why an apple is different from an orange? They just are, by definition.


Mechanical engineering interviews seem to do the same as software: "Engineers always ask about beam bending, stress strain curves, and conservation of work. Know the theory and any technical questions are easy."

Basically an equivalent of simple algorithmic questions. Not "real" because it's impossible to share enough context of a real problem in an interview to make it practical. Short, testing principles, but most importantly basic thinking and problem solving facilities.


> Mechanical engineering interviews seem to do the same as software:

I've been an engineer in the past (physics undergrad -> aerospace job -> grad school/ml). I have never seen or heard of an engineer being expected to solve math equations on a whiteboard during an interview. It is expected that you already know these things. Honestly, it is expected that you have a reference to these equations and you'll have memorized what you do most.

As an example, I got a call for a job from Raytheon when I was finishing my undergrad. I was supposedly the only undergrad being interviewed, and the first round was a phone interview. I got asked an optics question and said to the interviewer, "You mind if I grab my book? I have it right next to me, and I bookmarked that equation thinking you might ask; I'm blanking on the coefficients" (explaining the form of the equation while opening the book). He was super cool with that, and at the end of the interview he said I was on his short list.

I see no problem with this method. We live in the age of the internet. You shouldn't be memorizing a bunch of stuff purposefully, you should be memorizing by accident (aka through routine usage). You should know the abstractions and core concepts but the details are not worth knowing off the top of your head (obviously you should have known at some point) unless you are actively using them.


I've had a coding interview (screen share, not whiteboard) fail where the main criticism was that one routine detail I took a while to get right could have been googled faster. In hindsight I still doubt that, given all the semi-related tangents you end up following from Google, but that was their expectation: look up the right piece of example code and recognize the missing bit (or get it right immediately).

For a proper engineering question (as in, not software), I'd expect the expected answer to be naming the reference book where you'd look up the formula. The last thing you want is someone overconfident in their from-memory version of physics.


> Last thing you want is someone overconfident in their from memory version of physics.

Honestly, having been in both worlds, there's not too much of a difference. Physics is harder but coding you got more things to juggle in your brain. So I really do not think it is an issue to offload infrequent "equations"[0] to a book/google/whatever.

[0] And equations could be taken out of quotes considering that math and code are the same thing.


I had a senior engineer chastise me once for NOT using the lookup tables.

"How do you know your memory was infallible at that moment? Would you stake other people's lives on that memory?"

So what you did on that phone interview was probably the biggest green-flag they'd seen all day.


We live in the age of ChatGPT. It might actually be time to assess how candidates use it during interviews. What prompts they write, how they refine their prompts, how they use the answers, whether they take them at face value, etc.


Sure, and we live in the age of calculators. Just because we have calculators doesn't mean we should ban them on math tests. It means you adapt and test for the more important stuff. You remove the rote mundane aspect and focus on the abstract and nuance.

You still can't get GPT to understand and give nuanced responses without significant prompt engineering (usually requiring someone who understands the nuance of the specific problem). So... I'm not concerned. If GPT is passing your interviews, then you should change your interviews. LLMs are useful tools, but compression machines aren't nuanced thinking machines, even if they can masquerade as such in fun examples.

Essentially ask yourself this: why in my example was the engineer not only okay with me grabbing my book but happy? Understand that and you'll understand my point.

Edit: I see you're the founder of Archipelago AI. I happen to be an ML researcher. We both know there's a lot of snake oil in this field. Are you telling me you can't frequently sniff it out? Rabbit? Devin? Humane Pin? I have receipts for calling several of these out at launch. (I haven't looked past your profile; should I look at your company?)


I'm actually not talking about interviewees (ab)using ChatGPT to pass interviews and interviewers trying to catch that or work around that. I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

> I see you're the founder of Archipelago AI.

I don't know where you got that from, but I'm not.


> I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

The same way? I guess I'm confused why this is any different. You ask them? Assuming you have expertise in this, then you do that. If you don't, you ask them to maybe demonstrate it and try to listen while they explain. I'll give you a strong hint here: people that know their shit talk about nuance. They might be shy and not give it to you right away or might think they're "showing off" or something else, but it is not too hard to get experts to excitedly talk about things they're experts in. Look for that.

> I don't know where you got that from, but I'm not.

Oops, somehow I clicked esafak's profile instead. My bad.


You might as well ask how they use book libraries and web search.


I'm a chemist by education, so all my college friends are chemists.

Being asked a theoretical chemistry question at a job interview would be...odd.

You can be asked about your proficiency with some lab equipment, your experience with various procedures and what not.

But the very thought of being asked theoretical questions is beyond ridiculous.


Why, don't they get imposters? You sure run into people who can't code in coding interviews.


Because to be a chemist you need to graduate in chemistry.

What would be the point of asking theoretical questions?

There's just no way in hell people can remember even 10% of what they studied in college. Book knowledge isn't really the goal; the point is teaching you how to learn and master the topics.


Because to actually have those kinds of conversations you have to have legitimate experience. To be a bit flippant, here's a relevant xkcd[0]. To be less so: "in-groups" are pretty good at detecting others in their group. I mean, can't you talk to another <insert anything where you have domain expertise, including hobbies> person and figure out who's also a domain expert? It's because people in the in-group understand the nuance of the subject matter.

[0] https://xkcd.com/451/


Doesn’t that comic more closely hew to the idea that some fields are complete bullshit?


That's one interpretation. But that interpretation is still dependent upon intra-group recognition. The joke relies on the intra-group recognition __being__ the act of bullshitting.


Hmm… I have a twist on this. Chemistry is a really big field.

My degree is in computational/theoretical chemistry. Even before I went into software engineering, it would have been really odd for me to be asked questions about wet chemistry.

Admittedly it would have been odd to be quizzed on theory out of the blue as well.

What would not have been odd was to give a job talk and be asked questions based on that talk; in my case this would have included aspects of theory relevant to the simulation work and analysis I presented.


And software and computing isn’t a big field? Ever heard of EE?


Half a dozen years ago in a conference talk, Joel Spolsky claimed credit for inventing these sorts of whiteboard interviews (with his Guerilla Guide to Interviewing), and that it had broken software engineering hiring.

https://thenewstack.io/joel-spolsky-on-stack-overflow-inclus...


FTA:

“I think you need a better system, and I think it’s probably going to be more like an apprenticeship or an internship, where you bring people on with a much easier filter at the beginning. And you hire them kind of on an experimental basis or on a training basis, and then you have to sort of see what they can do in the first month or two.”

Well, if he fucked it up, I don’t see any reason why his ideas can’t also fix it.


Unfortunately this only works for interns and new grads. Nobody experienced wants to take a job on an experimental basis.


Fortunately, people with experience have resumes, and it's easier to tell whether they're BSing them.

Fuck man, people do this with engineers who work on classified projects. You all are overthinking it. You're trying to hyper-optimize a function that is incredibly noisy and has no optimal solution.


And how would it scale to a number of candidates greater than one? A classroom full of competing peers? That's what talent shows are for.


Aren't probationary periods pretty standard, in many/all industries and countries too not just software?


Yes, and such a system makes hiring much easier because mistakes cost much less. But the US ties things like healthcare to employment, so a company with a reputation for firing people shortly after hiring them (however legitimate) would probably be one people avoid. In Sweden, for example, I've found interviews so much more reasonable. Then again, I had healthcare there regardless of employment.


Oh, I see. I'm in the UK, so more like Sweden; no experience of having healthcare tied to employment (other than optional private healthcare as a perk).


> Why is the software engineering interview wildly different from the traditional engineering interview

I am guessing here, but wouldn't a candidate for a traditional engineering role normally hold a college degree in a relevant field, so that part of quality assurance is expected to have been done by the college?


Candidates for software engineering roles normally hold a degree in a relevant field.


> if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Being able to evaluate a person is a difficult soft skill to learn. An interviewer cannot learn or improve it overnight, or even over months or years. It basically comes down to being good at reading people, and it's highly subjective and prone to bias.

If an interviewer isn't good at this, the solution is still to supplement the evaluation with a coding interview.


I've only ever had a single whiteboard interview in my career, and it was a single interviewer who preferred them (I accepted the job), but I have also walked through the backdoor via recommendations for all but 1 of my employers in ~20 years in the industry. From embedded in radio and television broadcasting, to medical robotics, to AAA games, with some excursions into web development. Every other interview at a company I accepted an offer for was a conversation with engineers about my experience and some hypotheticals.


If you talk shop with a mechanic, they're going to know pretty quickly if you actually know what you're talking about. In my experience, the same applies in our field.



