Hacker News new | past | comments | ask | show | jobs | submit | ggorlen's comments login

I wrote a similar post on in-browser scraping: https://serpapi.com/blog/dynamic-scraping-without-libraries/

My approach is a step or two more automated (optionally using a userscript and a backend) and runs in the console on the site under automation rather than cross-origin, as shown in OP.

In addition to being simple for one-off scripts and avoiding the learning curve of Selenium, Playwright, or Puppeteer, scraping in-browser avoids a good deal of potential bot-detection issues, and is useful for constantly polling a site to wait for something to happen (for example, a specific message or article to appear).

You can still use a backend and write to file, trigger an email or SMS, etc. Just have your userscript make requests to a server you're running.
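Concretely, a userscript can keep checking the page and ping a server you control. A minimal hypothetical sketch (the localhost:8000 endpoint and the "restock" phrase are made-up examples, not from any real site):

```javascript
// Pure helper: does the page text contain the phrase we're waiting for?
function matchesTarget(text, phrase) {
  return text.toLowerCase().includes(phrase.toLowerCase());
}

// In a userscript, poll the live DOM and notify a backend you run yourself.
// (Commented out so the helper above stays runnable outside a browser.)
// setInterval(() => {
//   if (matchesTarget(document.body.innerText, "restock")) {
//     fetch("http://localhost:8000/notify", { method: "POST", body: location.href });
//   }
// }, 5000);
```

The backend can then write to a file, send the email/SMS, or whatever else you need.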


As is often the case, the title is unfortunately overloaded. I initially read this as writing code in the Scratch programming language[1] that compiles to assembly.

[1]: https://en.wikipedia.org/wiki/Scratch_(programming_language)


Given that we've seen things like C compilers written in Bash[1] and TLS libraries in Visual Basic[2], I think it's only a matter of time before someone with the right skills and motivation actually does it (or writes a compiler for $language in Scratch).

[1] https://github.com/otakuto/bashcc

[2] https://news.ycombinator.com/item?id=35882985


Yes, the capitalisation of "Scratch" was incredibly misleading in this case. It's easily checkable but slightly annoying. Hard to avoid, though!


Even though the S in Scratch is uppercase, I still read it correctly.


Ironically the word "overloaded" is overloaded too.


Me also. The actual thing is probably pretty cool, but now that my hopes were up I'm finding it insufficiently whimsical.


I also misunderstood it in that manner.


CCSF alum here (AA, completed over 20 CS courses). I attended a similar CCSF job fair in 2017 and there were only 2-3 companies involved even then. If I recall, it was Lawrence Livermore National Lab and Mission Bit, maybe a third that must not have been interesting enough to stay in my memory. There were ~10 students at the fair.

The LLNL internship seemed too IT-oriented to interest me. Mission Bit involved teaching high school kids to code after school, which I did for over a year, earning course credit at CCSF.

So it's not that surprising to me that they went from ~2 companies to 0, even disregarding the major drop in attendance at CCSF since 2017 [1]. I didn't attend any other fairs so maybe I missed something, but I never got the impression that they were especially "bumping".

[1]: https://sfstandard.com/2022/01/27/by-the-numbers-sf-higher-e...


> Consider that taking 5-10 graduate courses and writing a master's level thesis or project will generally take all of your free time and a bunch of your savings over the course of two or more years. (I sure hope you're not thinking to take on debt for this!)

OMSCS graduate. The program certainly ate up most of my free time for 2.5 years, but on the other hand, the whole degree was about $8k for me and required no thesis or capstone project--just grinding through 10 classes' worth of assignments and exams. Also, it was 100% online, and that flexibility freed up time.

Theoretically, if you do 1-2 easy-ish classes per semester, you can minimize the free time impact. But I was less interested in the credential and more interested in the learning experience, so I took difficult classes and worked as a TA.

Caveat: I graduated in 2021 so things may have changed since then.


It's the same now. You can still take two hard/challenging courses together if you plan it right, though. Some courses release all the projects at the start of the semester so they're more self-paced; pair those with courses on a more rigid schedule that release one project at a time. If you stay ahead in the self-paced class, it's not much different than taking one at a time.


Good to know. https://www.omscentral.com/ and talking to other students helped a great deal with planning.

Two difficult classes I took together were Embedded Systems Optimization and Compilers, both taught by the same instructor and with similar concepts, so working on one helped solidify concepts in the other.

On the other hand, I took Distributed Computing during its first offering alongside Graduate Algorithms and was super overwhelmed.


Wow, I thought I was masochistic. What possessed you to take DC and GA (two of the hardest classes) together?

I took DC alone and found it manageable if you weren't a perfectionist. Compilers was the most difficult thing I've ever done, though that was mostly due to my own poor time management during phase 3 (generating the intermediate representation). I didn't complete phase 3, so most of my time during phase 4 (emitting MIPS assembly + implementing three register allocation algorithms + optimizations) was catching up.

Also, as a note, the very difficult classes like distributed computing and compilers are completely optional (though well worth it IMO). The only very difficult class that's required is graduate algorithms.

For those not familiar with OMSCS, there are some class ratings here: https://www.omscentral.com/


It was the first online cohort of DC so I didn't realize how intensely difficult it'd be. I'd taken GIOS with Ada which was gentle, so I figured DC wouldn't be significantly harder.

It was a good life experience even though I wound up sacrificing my 4.0 by a few grade points. In hindsight, had I realized how steep the curve would be for DC, I'd have pushed a bit harder to squeeze out a few more test cases, but I was pretty mentally defeated at the time and felt like I'd exhausted all of the ideas I had on the projects multiple times over.

I found DC more difficult than compilers by a wide margin because of the nondeterminism, debugging difficulty and trying to figure out what the test harness was even doing. Compilers involved writing more lines of code, but it was manageable, synchronous greenfield application design.


You don't have to take GA anymore if you choose the new specialization.

https://omscs.gatech.edu/specialization-human-computer-inter...


Neither HCI nor Interactive Intelligence requires GA. In II it's one of two courses you get to choose between. I think a lot of people select those two specializations just to avoid it. II is also close to the ML specialization, so people who have trouble with or want to skip GA can move to it pretty easily.


Hm what's your evaluation of the program then? What did you learn, and did you learn what you wanted to learn?


I graduated this year. Great program at an incredible price; I think it was less than $7k for me after they reduced the fee. I didn't have a CS undergrad (other STEM) but was working as a software engineer and wanted to bulk up on basic knowledge with the systems track. I learned a lot of what I wanted and got a clear view of some weak areas. Would highly recommend the program. The hardest part was that it's actually very rigorous if you're going for As.


>OMSCS.

From the Georgia Tech online course?


I googled and answered my own question:

https://omscs.gatech.edu/


Related Wikipedia articles:

- https://en.wikipedia.org/wiki/Kodokushi, the Japanese phenomenon of people dying alone and remaining undiscovered for a long period of time

- https://en.wikipedia.org/wiki/Joyce_Vincent, a British woman who had been dead for over two years before she was discovered


Reminds me of Boyd Rice's rotoguitar (a fan attached to a guitar) and the extratone genre (extreme techno kick drums that run so fast they turn into pitched tones).
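The "kicks turn into tones" effect is just repetition rate crossing the rough lower edge of human pitch perception (around 20 Hz). A back-of-the-envelope sketch, assuming one kick per beat (the 20 Hz threshold is an approximation, not a hard limit):

```javascript
// Kick repetition rate in Hz for a given tempo, assuming one kick per beat.
const bpmToHz = (bpm) => bpm / 60;

// Discrete clicks fuse into a pitched tone somewhere around 20 Hz,
// i.e. roughly 1200 BPM and up -- extratone territory.
```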


There wouldn't be much of an industry if everyone who trusted ChatGPT and other ways of quickly getting code working (copy-pasting Stack Overflow, "try random stuff until it works" debugging, hopping on calls with random freelancers, etc.) followed your advice.

Many programmers I've encountered in early stage tech startups (and in general) are not craftspeople--they're scrambling to get a product to market as quickly as possible and quality and process are very much secondary. Many are working in unfamiliar languages by necessity, or are relatively new or even untrained as professional programmers. I mentor such folk regularly. (Actually, these untrained hackers are often "better" at programming in many respects than senior engineers with 10 years of experience, but that's another story).

If the company survives long enough, they might pay off the tech debt later. OP's team just got unlucky following the same strategy many other startups use nowadays, and they're willing to admit it.

To be clear, I'm not excusing the mistake or endorsing the process they followed, only noting that their actions aren't out of the ordinary (other than admitting to the mistake) and empathy is due.


Warning: this page loads 234 MB of data! Images are up to 7 MB each.

https://pagespeed.web.dev/analysis/https-vickiboykis-com-202...


"Does your page design improve when you replace every image with William Howard Taft?

If so, then, maybe all those images aren’t adding a lot to your article. At the very least, leave Taft there! You just admitted it looks better."

https://idlewords.com/talks/website_obesity.htm


> Please don't complain about tangential annoyances

https://news.ycombinator.com/newsguidelines.html


I found the images distracting, and in particular the tweet-sized chunks of text separated by a picture or two made the article hard to follow.

So I think it's a legitimate criticism as this is a site for technical folks. Sure, not the topic of the post being commented on, but a good example to discuss.


Looks like it's a presentation transcription.


I find it useful. I'm on a pretty limited mobile data plan, and I imagine others might be too. I wish there were a way my browser could warn me.


Please let Dang do the moderating. I think he gets paid by the flag.


Thanks for the reminder, but anyone visiting this site on a restricted data plan may be in for a surprise. A website that downloads a quarter gigabyte of data without warning is more than an annoyance.

How do you suggest I warn others to avoid this site? I don't think it's suitable for HN in its current state.


This is not a tangential annoyance, it’s akin to a NSFW warning.


Author here. That's definitely my bad and not an intended user experience. The text was initially meant as a transcript accompanying presentation slides. I've compressed the images so it should be at least slightly better now.


It's 2024. So what :P

The article itself is a confusing jumble of fiction, religion and tech though. I don't grok it.


https://fav.farm seems like a cool service at first, but it doesn't actually do a whole lot. I'd just as soon inline the one-liner they mention (similar to https://css-tricks.com/emoji-as-a-favicon/) and avoid the external dependency for the price of 117 characters.
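For reference, a hypothetical sketch of the same idea as a small helper (the emoji is arbitrary, and the commented-out DOM line assumes an existing `<link rel="icon">` on the page):

```javascript
// Build an emoji favicon as an inline SVG data URI -- no external service needed.
function emojiFaviconHref(emoji) {
  const svg = `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100"><text y=".9em" font-size="90">${emoji}</text></svg>`;
  return "data:image/svg+xml," + encodeURIComponent(svg);
}

// In a page, either hardcode the <link rel="icon" href="..."> or set it from script:
// document.querySelector("link[rel=icon]").href = emojiFaviconHref("🦊");
```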


Zeni Geva deserves more credit; great to see them mentioned. You probably already know Neurosis and Melt Banana, but if not, those might also be up your alley.

Other noteworthy Albini-engineered bands and albums for me: Oxbow, Low, Whitehouse, Labradford, Burning Witch, Godspeed You! Black Emperor's Yanqui U.X.O. Peter Sotos' Buyer's Market is one of the most insane albums I've ever had the displeasure to listen to, and Albini apparently produced it, although I don't hear any obvious influence.


The Albini-produced Whitehouse albums seem to be underloved because they're a weird transitional period between the group's more famous "wasps fighting" and "broken African robot" sounds, but they're some of the best releases. There's a decadent and sinister competence to them that can't be heard in the earlier buzzier material.

