Hacker News | meneer_oke's comments

Just this weekend I had the perfect problem for SQLite; unfortunately, above 200MB it became unwieldy.


I’d like to hear more about this


There is a big difference between acting as if you know something and actually knowing it. You're describing one thing, while the comment you are replying to is describing something else.

Let's consider a developer who is interested in programming and has a lot of experience. If the team is going down a path that the developer knows will lead to issues, or if the developer tries to help because the team is not as strict and it's leading to outages, that developer may be perceived negatively.

However, the developer is able to find security issues and potential bugs during code review, objectively providing significant value to the team.

This developer could be labeled as having a "rock star" attitude, while in reality, they are trying their best to ensure there are no security, performance, or outage issues.


> The most brilliant devs I've ever met were people nobody would call "rock stars"

That sparks my interest. Could you elaborate on what constitutes a brilliant developer from your perspective?


A developer who knows how to treat the other people on the team, knows how to be professional, works on what they agreed to during sprint planning, doesn't want to rewrite the project in a new technology, and doesn't want to create their own cool project, because they know the ultimate goal is to ship to users. They don't react badly to criticism, are not unreasonably stubborn, have good hygiene, don't take code reviews personally, and accept that management necessarily has a different point of view from pure developers.


A brilliant dev is one who consistently produces solid and feasible solutions to difficult problems, in a timely way.

A more enlightening question is "what is a rock star" in my mind. A rock star is a dev who truly believes that they are the smartest person in the room, who looks down their nose at any work they didn't produce, who is only really interested in things that they think make themselves look better rather than things that will actually help produce the best end product possible, who insists on using whatever shiny new thing is in fashion over less shiny but more appropriate approaches, and so forth.


The Rock Star will say, “I will do these 10 things by next Friday.” And then next Friday rolls around and 8 of them are done. This repeats week after week, but I can never tell which 8 will get done. They’re boastful and proud that they got a lot done but the rest of the team can never count on them to get any specific thing done.

The other person will say “I will do these 5 things by next Friday.” And then next Friday all 5 are done. And they repeat this week after week after week.

Give me the second person every time.


I agree with your scenario, but I wouldn't consider someone who consistently under-delivers a "rock star."

This is why defining terms is important. For me, a rockstar is a developer who is just further along, who likes to program in their own time, who reads Hacker News, and builds projects on the weekends. Within 5 years, that person will be much further ahead than the rest and will spot pitfalls, security, and performance issues simply because they have encountered them before. However, sharing that information could lead to that person being called a "rockstar" and problematic, hampering "progress".


It doesn't merely *seem* faster; "seem" would imply that it isn't actually the case. It *is* faster on that setup.

But since the Python runtime is written in C, the issue can't be Python vs. C.


It's obviously not Python vs. C -- the time difference turns out to be in kernel code (a system call), not user code at all, and the post explicitly constructs a C program that doesn't have the slowdown by adding a memory offset. It just turns up by default in a comparison of Python vs. C code because Python reads have a memory offset by default (for completely unrelated reasons) and analogous C reads don't. In principle you could also construct Python code that does see this slowdown; it would just be much less likely to show up at random. So the Python vs. C comparison is a total red herring here; it just happened to be what the author noticed and used as a hook to understand the problem.


C is a very wide target. There are plenty of things that one can do "in C" that no human would ever write -- for instance, the C code generated by languages like Nim and Zig, which essentially use C as a sort of IR.


That is true. With C, a lot is possible.

> However, python by default has a small offset when reading memories while lower level language (rust and c)

Yet if the runtime is written in C, then that statement is incorrect.


By that line of thought, you could also argue that the slow version in C and Rust is actually implemented in C, as memcpy is in glibc. Hence, Python being faster than Rust would also mean, in this case, that Python is faster than C.

The point is not that one language is faster than another. The point is that the default way to implement something in a language ended up being surprisingly faster when compared to other languages in this specific scenario due to a performance issue in the hardware.

In other words: on this specific hardware, the default way to do this in Python is faster than the default way to do this in C and Rust. That can be true, as Python does not use C in the default way, it adds an offset! You can change your implementation in any of those languages to make it faster, in this case by just adding an offset, so it doesn't mean that "Python is faster than C or Rust in general".
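The "just add an offset" idea can be sketched directly. A minimal Python illustration (Unix-only, and everything here is an assumption for demonstration: the file path, the buffer sizes, and the 64-byte shift are arbitrary choices, and whether the two timings actually diverge depends on the specific hardware the original post describes):

```python
import mmap
import os
import time

PAGE = mmap.PAGESIZE
SIZE = PAGE * 64  # small scratch file for the demo

# Hypothetical scratch file used only for this sketch.
PATH = "/tmp/offset_read_demo.bin"
with open(PATH, "wb") as f:
    f.write(os.urandom(SIZE))

def timed_read(offset):
    """Read the whole file into a destination buffer shifted by `offset` bytes."""
    # An anonymous mmap is page-aligned, so slicing it by `offset`
    # controls exactly how the destination buffer is aligned.
    buf = mmap.mmap(-1, SIZE + PAGE)
    dest = memoryview(buf)[offset:offset + SIZE]
    fd = os.open(PATH, os.O_RDONLY)
    t0 = time.perf_counter()
    n = os.readv(fd, [dest])  # plain read() into our chosen destination
    elapsed = time.perf_counter() - t0
    os.close(fd)
    return elapsed, n

# Page-aligned destination (the C/Rust default) vs. a slightly shifted one
# (roughly what CPython's allocator happens to produce).
t_aligned, _ = timed_read(0)
t_offset, _ = timed_read(64)
print(f"aligned: {t_aligned:.6f}s  offset: {t_offset:.6f}s")
```

On hardware without the quirk the two numbers should be close; the point of the thread is that on the affected setup the offset destination was measurably faster, in any language.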


The bug is the other way around :)


The degradation of ChatGPT-4 from being called AGI into what it is now...


Would it be possible to explain why it's a big deal?


I don't remember a time when online certificates from MIT, Berkeley, etc. would have been anti-signals.

https://www.edx.org/learn/computer-programming/massachusetts...


Nobody will fault you for learning programming through an online course. They will, however, mark you down for listing that on your resume as your sole programming experience.


Do you know the course that I refer to?

People within companies are all different, so we can't say. Some hiring managers might, others actually prefer no degree, and others might focus more on what was "shipped". It depends.


To be clear I'm speaking generally, not on an individual basis. Everyone has their own preferences–I personally don't discount it. But in my experience the average hiring manager does, because certifications are often seen as a thing you just pay money for to get a piece of paper. Of course college degrees are like that too, but the quality of "big state school CS degree" is usually somewhat regulated and "big state school 6 month CS certification" is often not.


Please take a look at the course.

The courses from Berkeley, Harvard, Stanford, and MIT are quite good. There is no payment required, just knowledge.


No, I know about the courses. They're pretty good. I'm just saying that there's baggage attached to certificates from random courses, even if some of them are good.


I would agree, but the topic started with the claim that they are anti-signals.


You're still not getting it.

Taking the course is not a negative signal.

Putting the course on your resume, as if that conveys useful information, is.


I think it's clear that you are not getting it; that is just your opinion.

Are you a hiring manager? Are you the hiring manager for the whole IT industry?

Even in this thread there are many who have successfully landed great jobs with certificates on their resume.


OK, good luck.


Honestly, a full genuine undergrad CS degree from Harvard is nearly an anti-signal...


It seems even in this thread there are many without any certificate or formal education. Would it be possible to explain how that is happening?


I have a formal CS education, although I've hired a bunch of people without. I'm just saying, a Harvard CS degree tells me either you're extremely bad with money, or you (or your parents) are more interested in social signalling than a thorough education.

You also seem to be missing that the content associated with the certificate can be fine, but the validation process broken enough that the certificate per se (and therefore listing it on a resume) is worthless.

(I prefer to hire based primarily on work samples or a take-home project.)


Well, the context is online courses from Harvard and MIT, such as the CS50 course referenced by many in this thread. Those courses are free; there is an option to get a certificate for around 150 dollars.

People actually learn the basics of CS and foundational knowledge in those courses, and how to use git and GitHub.

I never stated that those certs or courses alone would be enough. This comment thread started because someone called it an anti-signal. Listing anything on a resume is never pointless; it tells us something about the potential hire.

Courses like the one mentioned are open, which means if you wanted to test that knowledge you could, easily.

That could tell you so much about the potential hire:

* Did they actually do the course?

* Did they retain the information?

* Did they understand the material?

* How did they use the information in their own projects or for clients?

From those answers you would even be able to learn more about their seniority level.

The point is, "anti-signal" is just not universally the case. It might be for some, sure. But how many of those would like to see a formal CS education instead?

It seems that even you lean more toward actual code examples. And in order to create great code, a lot of deep knowledge is needed, which those courses provide.


> Listing anything on a Resume is never pointless

Sorry, but you have absolutely wrong ideas about resumes, how they are read, and what information they convey. Both descriptively and normatively.

> Courses like the one mentioned are open, which means if you wanted to test that knowledge you could, easily.

If I have to test anything, I'm going to test the actual job skills. But actually the point of a degree/certification/whatever on a resume is to show me that someone else already did the testing so I don't have to spend time doing it. Conversely, if I have to test it again, it's pointless to put on a resume.


I've been hiring devs at multiple companies.

It seems you are misunderstanding me again. If someone puts something on a resume, it's a signal that they think it's important. And in order to assess candidates on their strengths, we use the resume.

If you are worried that they are missing the information from the course, I explained how it's actually a great tool to assess a candidate.

You also seem to think that you can, in some sense, outsource the question of whether a developer will work out at the workplace you are hiring for. This is a mistake.

Instead of making blanket statements and dismissing people over frivolous things, like going to an Ivy League university or having a certificate, it's better to test the person to get to know their strengths and weaknesses.

A resume is just a window into the candidate's thinking process.

Edit: It's clear from your comments that you're not aware of the great work at Harvard with the CS50 course.


Are you saying a Harvard CS degree isn’t a thorough education? Or are you assuming that everyone at Harvard pays full freight?

I don’t get the elitism or why you think someone with a degree from a specific school is inherently stupid. That’s as weird as refusing to hire non-formally educated people.


I think you can get a better formal CS education than Harvard for a fraction of the price at two dozen places, probably including at least one state school you have access to.

Harvard is good for signalling, traditional liberal arts, and pre-law (but, I repeat myself). It's not a very good teaching university overall, and especially not in CS.


If you don’t have a high regard for the Harvard CS curriculum, it’s reasonable enough to disqualify people on that basis.

That said, I think you’re completely out of touch with how 18 year olds choose universities. It can be as simple as a friend of theirs is also attending, or the school is close to a parent’s home, or they were actually in a liberal arts program before they switched into CS.

Sometimes as you implied, it’s their parents making that decision for them.

There’s no point in holding the cost of a student’s education over them, especially as you don’t know even how much they paid.


> If you don’t have a high regard for the Harvard CS curriculum, it’s reasonable enough to disqualify people on that basis.

It's not reasonable; statements like those are a liability for you and the company you represent. Hiring laws are quite strict about discrimination.


Is that discrimination? To prefer candidates from one university over another?


Assuming you mean strictly in the hiring/legal sense, it is absolutely not.


Yeah, sure, spare a tear for all the kids who accidentally ended up at Harvard.


The point of my comment wasn’t “Harvard kids are discriminated against”, the point is that your hiring logic doesn’t make sense.

