Hacker News | wsintra2022's comments

Should we be giving our children’s education over to poison?


Press On Vinyl are able to make their own masters. Can you clarify what you mean?


I'm kind of vague on the whole lacquer -> father -> mother -> stamper -> record process, and I don't remember which of these steps has supposedly dwindled down to just a single company due to the fire. Googling isn't helping.

Normally "masters" are lacquers, though, so a company that makes their own "masters" still probably outsources the steps after the lacquer and before stamping finished records.


I can confirm, Press On Vinyl makes their own stampers. Originally they sent the masters to London and got stampers back, but in the last year or so they upgraded the factory and now do it all in house, including a mastering studio. If anyone wants to press vinyl, please check out kushtybuckrecords.com or pressonvinyl.com (I am the developer of kushtybuckrecords).


What fire?



There's some detail in the Apollo Masters Wikipedia page: https://en.wikipedia.org/wiki/Apollo_Masters_Corporation_fir...


Thank you! I clearly didn't have a very good grasp of the situation, and this is definitely what I was thinking of, and it's much less serious than I had thought.

> With the destruction of Apollo Masters, producing lacquer discs fell solely to a small Japanese company, MDC. This caused a significant strain on the industry, as Apollo Masters was responsible for 70-85% of lacquer production.[4][5] As a result of the strain, orders became backlogged and delayed, (...)

> With a decrease in lacquer production, some vinyl pressing plants turned to alternate means of producing records, including direct metal mastering, a costly method of producing records with copper instead of lacquer.


Swan's Death Records could make their own masters too (https://en.wikipedia.org/wiki/Phantom_of_the_Paradise, sorry for being kinda off-topic)


Like USB vs Lightning? People are compassionate about people or they're not.


I’m very confused, is lightning compassionate?


Haha I’m trying to understand the analogy too. I think usb is compassionate though, since it’s the more universally adopted of the two standards.


Oh wow, love it, a testacle in a teacup, dudes just made my day!


> Oh wow, love it, a testacle in a teacup, dudes just made my day!

It's an eggcup.


Well now we're just splitting pubes.


No, that's the science project being worked on at CERN, the large hardon collider.


Flubber


Social media posts are not really writing, though. You don't write down detailed thoughts on social media; it's more often brain farts than anything.


Valid point; I wonder why it was downvoted. I was asking myself the same question: yes, write down your thoughts in a journal, digital or analog, and you can always refer back to that time and place. But what benefit do you get by giving them away for all to ignore or consume?


Is it cheating if I can solve the problem using the tools of AI, or is it just solving the problem?


Interviews aren’t about solving problems. The interviewer isn’t interested in a problem’s solution, they’re interested in seeing how you get to the answer. They’re about trying to find out if you’ll be a good hire, which notably includes whether you’re willing and interested in spending effort learning. They already know how to use AI, they don’t need you for that. They want to know that you’ll contribute to the team. Wanting to use AI probably sends the wrong message, and is more likely to get you left out of the next round of interviews than it is to get you called back.

Imagine you need to hire some people, and think about what you’d want. That’ll answer your question. Do you want people who don’t know but think AI will solve the problems, or do you want people who are capable of thinking through it and coming up with new solutions, or of knowing when and why the AI answer won’t work?


> They’re about trying to find out if you’ll be a good hire, which notably includes whether you’re willing and interested in spending effort learning

I admire this worldview, and wish for it to be true, but I can't help but see it in conflict with much of what floats around these parts.

There's a recent thread on Aider where the authors proudly proclaim that ~80% of its code is written by Aider itself.

I've no idea what to make of the general state of the programming profession at all at the moment, but I can't help but feel learning various programming trivia has a lower return on investment than ever.

I get learning the business and domain and etc, but it seems like we're in a fast race to the bottom where the focus is on making programmers' skills as redundant as possible as soon as possible.


>I admire this worldview, and wish for it to be true, but I can't help but see it in conflict with much of what floats around these parts.

Honest interviewers may not realize how dishonest other interviewers have become in recent times (the last 2-3 years). Interviewing today compared to COVID times is night and day, let alone compared to the 2010s gold rush.

The respect is long gone.


> Interviews aren’t about solving problems.

Eh, I wish more people felt that way. I have failed so many interviews because I didn't solve the coding problem in time.

The feedback has always been something along the lines of "great at communicating your thoughts, discussing trade-offs, having a good back and forth" but "yeah, ultimately really wanted to see if you could pass all the unit tests."

Even in interview panels I've personally been a part of, one of the things we evaluate (heavily) is whether the candidate solved the problem.


Isn't one of the ways of solving the problem using all the tools at your disposal? At the end of the day, isn't having working code the fundamental goal? I guess you could argue that the code needs to be efficient, stable, and secure. But if you could use "AI" to get part way there, then use smarts to finish it off, isn't that reasonable? (Devil's advocate.) The other big question is the legality of using code from an AI in a final commercial product.


Yes that’s a fair question. Some companies do allow LLMs in interviews and on the job. But again the solution isn’t what the interviewer wants, so relying on an LLM gives them no signal about your intrinsic capabilities.

Keep in mind that the amount of time you spend in a real job solving clear and easy interview style problems that an LLM can answer is tiny to none. Jobs are most often about juggling priorities and working with other people and under changing conditions, stuff Claude and ChatGPT can’t really help you with. Your personality is way more important to your job success than your GPT skills, and that’s what interviewers want to see… your personality & behavior when you don’t know the right answer, not ChatGPT’s personality.


Yeah, everyone says they are interested in how you got there, but in my experience that isn't true in reality. Your bias inevitably judges candidates on the solution, because you have many other candidates who got the correct solution.


You’re right, interviewers will still care about whether you come up with a solution, and they care about the quality of the solution. The part you might be missing is that what I said and what you said aren’t mutually exclusive; they are both true. Interviewers do have to compare you to other candidates, and they are looking for the candidates that stand out. They want more than a binary yes/no signal, if at all possible.

What I was trying to say is that the interviewer doesn’t need the solution to the problem they ask you to solve; what they need is to see how well you can solve it. I hope that’s stating the obvious, but it’s worth really letting it sink in. It’s super common for early-career programmers to be afraid of interviews and complain about them. Things change once you start doing the interviewing and see how the process works.


If you've been given the problem of "without using AI, answer this question", and you use an AI, you haven't solved the problem.

The ultimate question that an interview is trying to answer is not "can this person solve this equation I gave them?", it's usually something along the lines of "does this person exhibit characteristics of a trustworthy and effective employee?". Using AI when you've been asked not to is an automatic failure of trust.

This isn't new or unique to AI, either. Before AI people would sometimes try to look up answers on Google. People will write research papers by looking up information on Wikipedia. And none of those things are wrong, as long as they're done honestly and up front.


If you are pretending to have knowledge and skills you don't have you are cheating. And if you have the required knowledge and skill AI is a hindrance, not a help. You can solve the problem easily without it. So "is using ai cheating"? IDK, but logically you wouldn't use AI unless you were cheating.


Knowledge and skill are two different things. Sometimes interviewers test that you know how to do something, when in practice it's irrelevant if you A) know how to retrieve that knowledge and B) know when to retrieve it.


There is foundational knowledge you must have memorized through a combination of education and experience to be a software developer. The standard must be higher than "can use google and cut and paste." The answer can't always be - "I don't need to be able to recall that on command, I can google/chatgpt that when I end up needing it." Would you go to a surgeon who says "I don't need to know exactly where the spleen is, I can simply google it during surgery."


For the goal of the interview - showing your knowledge and skills - you are failing miserably. People know what LLMs can do, the interview is about you.


I guess it's more a question of whether you can solve the problem without AI.

In most interview tasks you are not solving the task "with" AI.

It's the AI that solves the task while you watch it do it.


I’m using Open WebUI; can this replace Ollama in my setup?


The article has caused a ruckus. We need to be infinitely more furious that there are children without books.

If you didn't read it: children are being captured by phones and tablets. Help them escape.

