Hacker News: thecodemonkey's comments

There are a couple of great open-source projects[1][2][3] that try to keep up-to-date lists of domains belonging to disposable email providers.

I would probably not recommend implementing a whitelist for blocking purposes. But perhaps domains on a whitelist could get a slight scoring bump.

[1] https://github.com/disposable-email-domains/disposable-email... [2] https://github.com/disposable/disposable [3] https://github.com/unkn0w/disposable-email-domain-list
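Using one of those lists is straightforward: they ship newline-delimited domain files. A minimal sketch (function names and the sample domains are illustrative, not from the projects themselves):

```python
def load_domains(lines):
    """Parse a newline-delimited domain list (the format used by the
    disposable-email-domains project) into a lowercase lookup set."""
    return {line.strip().lower() for line in lines if line.strip()}

def is_disposable(email: str, domains: set) -> bool:
    """Check the part after the last '@' against the loaded set."""
    return email.rsplit("@", 1)[-1].lower() in domains
```

In production you would load the file from disk (or refresh it periodically from the upstream repo) rather than hardcoding it.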


As for abuse, I built myself a tool that gives me quintillions of email addresses (not using plus addressing) on gmail.com.

I use this to sign up for each service with a unique email that is basically a junk box, but each address gets its own entry in my password manager.
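The comment doesn't say how the tool works, but one well-known mechanism that yields exponentially many aliases without plus addressing is Gmail's dot-insensitivity: dots in the local part are ignored, so every dot placement is a distinct-looking address that delivers to the same inbox. A sketch:

```python
from itertools import product

def dot_variants(local: str, domain: str = "gmail.com"):
    """Yield every address Gmail treats as equivalent to local@domain
    by optionally inserting a dot in each gap between characters.
    An n-character local part yields 2**(n-1) variants."""
    gaps = len(local) - 1
    for bits in product([False, True], repeat=gaps):
        parts = [local[0]]
        for ch, dot in zip(local[1:], bits):
            if dot:
                parts.append(".")
            parts.append(ch)
        yield "".join(parts) + "@" + domain
```

A local part of around 60 characters is what it takes to reach quintillions (2**60 or so) of variants this way.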


Thanks for giving it a read!


We are mainly B2B so we don't really see signups using Apple's email relay. That said, it could be something we might have to consider blocking in the future if it becomes a problem.

For paying customers, it probably doesn't make a lot of sense to use an anonymous email address, since we ask for your name and billing address either way (we have to stay compliant with sales taxes!).


Isn't it nice to have just a little bit of an illustration instead of just text? Obviously an AI-generated image is going to spit out some nonsense text as part of the graphic, but we're not really trying to hide that it's AI generated.


I think for things that require high credibility and have a learned readership, it'd be better not to use a careless image, even at the cost of a cool one. I wouldn't mind an almost-right image in some advert for Kleenex or an intranet holiday-reminder email, but I would be very concerned if it were used as part of an EU directive.


I would love to do a more in-depth talk about this at some point, with some more concrete examples.


Not at this time. Some simple heuristics go a long way and also make the logic very easy to test and debug.


I’ve seen fraud detection used in a SaaS product, and the great thing about a weighted-rules approach is that professional services can understand it well enough to adjust it without help from engineering or data science. They can explain to customers how it produced the results it did in a particular case, and the trade-offs of adjusting the weights or thresholds, and the customers can understand it too. A machine-learning model, by contrast, is much harder to understand and adjust, so issues are much more likely to be escalated back to engineering.

(This isn’t protecting the SaaS vendor against abusive signups, it is a feature of the SaaS product to help its customers detect fraud committed against themselves within the SaaS product’s scope.)
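A weighted-rules scorer of the kind described can be sketched as follows (the rule names, weights, and lookup sets are illustrative placeholders, not taken from any real product):

```python
from dataclasses import dataclass
from typing import Callable

DISPOSABLE = {"mailinator.com"}       # placeholder denylist
FREE_PROVIDERS = {"gmail.com"}        # placeholder free-mail list

@dataclass
class Rule:
    name: str                          # human-readable, for explanations
    weight: float                      # contribution when the rule fires
    fires: Callable[[dict], bool]

RULES = [
    Rule("disposable_email_domain", 40.0,
         lambda s: s["email_domain"] in DISPOSABLE),
    Rule("country_mismatch", 25.0,
         lambda s: s["ip_country"] != s["billing_country"]),
    Rule("free_email_for_b2b", 10.0,
         lambda s: s["email_domain"] in FREE_PROVIDERS),
]

def score(signup: dict) -> tuple[float, list[str]]:
    """Return the total risk score plus the names of the rules that
    fired -- the fired-rule list is what makes the result explainable
    to non-engineers."""
    fired = [r for r in RULES if r.fires(signup)]
    return sum(r.weight for r in fired), [r.name for r in fired]
```

Because each rule is named and its weight is a plain number, a support or professional-services person can tweak a weight or threshold and predict exactly how the output changes, which is the property the comment above is pointing at.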


I once did a machine learning project at Intel. The end result was that it performed no better than simple statistics, but the statistics were easier to understand and explain.

I realized the machine learning project was a "solution in search of a problem," and left.


Career hack: skip the machine learning and implement the simple statistics, then call it machine learning and refuse to explain it.


statistical regression is also machine learning.


hack v2: call it AI


Really excited for this! Had the opportunity to take part in early access over the last few weeks and the deployment process has been super smooth and insanely fast.

I'm mostly just impressed with how polished everything feels and how easy it was to add a database, key/value store, etc.

Currently using Laravel Vapor for most of my hosting needs, but will be switching everything over to Cloud.


Why would the deployment process be faster than anything else?

Why Laravel cloud over anything else?


What is uv?


It’s a package manager for Python (among other things) that is simultaneously correct and performant. It’s getting a lot of attention these days.

I highly recommend it over all conceivable alternatives.


How likely is this to be another npm/yarn situation? I have used a number of Python tools that in the long run turned out to be less portable and less usable than plain Pip and virtualenv. I do use pyenv for a personal setup but not for any production use.


You can use uv and get the benefits today: it combines the functionality from several tools (including pyenv) and the performance is such that it is transformative.
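To make "combines the functionality from several tools" concrete, here are a few of its subcommands (a sketch; versions and flags are illustrative):

```shell
uv python install 3.12     # manage interpreters (pyenv's job)
uv venv                    # create a .venv (virtualenv's job)
uv pip install requests    # pip-compatible installer, much faster
uv run script.py           # resolve and run in one step
```

Each of these would traditionally involve a separate tool and its own setup.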


So it is in fact a yarn/npm situation: an alternative tool that does too many things and aims to replace a bunch of other tools.

I read the same sentiment about poetry a few years ago. Turns out tools that do one thing well are better in the long run.


As if it were so simple; don't forget about pnpm.

It's the same for uv/pip/poetry, except uv is so much better than the alternatives that there isn't any contest. pip is the safe default (which doesn't even work out of the box on Debian derivatives, which is half the issue) and uv is... just the default.


As a Debian user who wanted to use a CLI tool only available from npm, it was horrible trying to find sane instructions that didn't assume intimate knowledge of Node package structures.

I did eventually figure out I could just do `corepack pnpm setup` then install packages globally with pnpm.
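For anyone else hitting the same wall, the sequence described amounts to this (the package name is a placeholder; corepack ships with recent Node releases):

```shell
corepack pnpm setup        # provisions pnpm, creates its global bin dir,
                           # and adds it to your shell's PATH
# ...restart the shell, then global installs work:
pnpm add -g some-cli-tool  # placeholder package name
```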



Seems that it interprets a special comment block, then automatically creates a Python venv and gathers dependencies. As the post[1] says, it obviates "faffing" with venv and pip to run a script.

https://akrabat.com/defining-python-dependencies-at-the-top-...
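That "special comment block" is the inline script metadata format standardized in PEP 723. A minimal example (the dependency is chosen purely for illustration):

```python
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests

print(requests.get("https://example.com").status_code)
```

Running `uv run script.py` reads the `# /// script` block, creates an isolated environment with `requests` installed, and executes the script, with no manual venv or pip step.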


Our timezone and highway data are OSM-based, but that's pretty much it.

Many of our address sources come from the OpenAddresses project: https://github.com/openaddresses/openaddresses


Oh wow. This reminds me of Netlify pre-funding, all the way down to the `<input type="file" />` on the landing page.

