I quit my job to make free web tools (nslookup.io)
317 points by pul on Feb 16, 2022 | 104 comments


Someone on Reddit asked how I make a living. Might be interesting for folks here too.

Right now, we're living off our savings and my wife's salary. Both sites are already profitable, though. I've recently launched an API for NsLookup, which now generates around 100 USD / month, and both sites have ads that generate income. Not enough to replace my salary yet, but I'm confident I can increase it over time.

Ads have a bad connotation, mainly because they often track users across the web. Mine don't. For WhoIsMyISP.org, I've partnered with NordVPN, and have their ads with plain `<a href=...><img /></a>`. And for NsLookup.io, I've joined the Carbon network, which also doesn't have tracking.


Good luck, mate! If you bother to verify NsLookup.io on SaaSHub, I can get it on its weekly newsletter. That could help you a bit. (just ping me by email/twitter once done). Cheers!


Thanks for the offer :) I already submitted it a while ago: https://www.saashub.com/nslookup-io-alternatives


I hope the ad images are also hosted on your server, or they can still be used for tracking.

Not criticism. I applaud your approach. I'm just pointing out that running non-tracking ads is harder than you would believe.


Hmm, now that I've taken a second look, I noticed that I added a hidden image hosted on their servers when I implemented the ads. It's used for counting ad impressions, but it looks like it drops a cookie valid for one day as well :/
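
For anyone who wants to audit their own ad embeds: fetching the pixel and inspecting the response headers is enough to spot this. A minimal sketch in Python with the requests package (the pixel URL is a placeholder, not the real one):

    import requests

    # Placeholder URL; substitute the ad network's impression-counting image.
    PIXEL_URL = "https://ads.example.com/impression.gif"

    response = requests.get(PIXEL_URL)

    # requests folds multiple Set-Cookie headers into one comma-joined string.
    set_cookie = response.headers.get("Set-Cookie")
    print(set_cookie or "No cookies set by this pixel")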


It's a little funny to me that you added a literal tracking pixel to your website from a VPN company and then came to HN to preach that you don't do any tracking.

That said, due to third-party cookie changes, tracking pixels have very limited usefulness these days, and it's unlikely this one actually tracks people over a full one-day period.


I guess you're right... And indeed, browsers fight back: https://blog.mozilla.org/security/2021/01/26/supercookie-pro...


Safari started the trend 2-3 years ago


Good luck standing between those tricksy ad-folk and their quarry.


It's called tracking for a reason.


Are you not concerned that you are leaking exactly what domains people are looking up, in real time?

If I used a service like that, I would expect some degree of privacy that goes beyond this. It's crazy to me that you're doing this, to be honest.

https://plausible.io/nslookup.io/pages?period=realtime


There is no privacy issue here: there are no source IP addresses attached, so you can't tell who's looking these up. It's up there with Amazon's "other people bought" recommendations, for example; you can't tell who did the looking up.

They're just domain names, which are effectively public records. And even if you thought you registered a super-secret domain name (no such thing), maybe the registrar looked it up, or some bot found it through brute force (if nslookup.io is open to bots).


I do understand all of that; still, it feels weird to me. If I'm looking up a staging domain, I'm not trying to broadcast that to the world, for example.

You know the Qualys SSL test tool? That has a checkbox about making the result public or not, and that is because by default people wouldn't expect the domain to be leaked (even though it's already public information that doesn't really matter).


It's a fair point, and I've thought about this when making the stats public. There are plenty of domain enumeration tools out there. DNS data is inherently public, so I don't think one should rely on it for their security model.


I agree with you completely. I just want to add, for people who do care about domain privacy, that many certificate providers publish the generated certificates to certificate transparency logs, searchable via https://crt.sh/. I have found a few staging subdomains on that website in the past.
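
If you want to check your own domains, crt.sh has an (unofficial, best-effort) JSON endpoint. A rough sketch in Python with the requests package; example.com is a placeholder:

    import requests

    domain = "example.com"
    # The '%' wildcard matches all subdomains; requests URL-encodes it for us.
    resp = requests.get(
        "https://crt.sh/",
        params={"q": f"%.{domain}", "output": "json"},
        timeout=30,
    )
    resp.raise_for_status()

    # Each entry is one logged certificate; name_value lists the names on it.
    names = set()
    for cert in resp.json():
        names.update(cert["name_value"].splitlines())
    print(sorted(names))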


As a side note, as soon as I added some DNS A records pointing to a testing server, it started getting spammed by bots trying to hack into it (accessing all sorts of random paths that might be exposed, like .env or common PHP files). I'm not sure what the solution to this is, apart from not adding DNS records for testing servers. Maybe only allowing access to the server from specific IPs? But that makes testing harder if you send a link to someone with a dynamic IP.


I'd be surprised if it was caused by adding the DNS record, and not for example you visiting the test site. In any case, sooner or later those attacks will come, so omitting a DNS A record won't cure it.


How would me visiting the site ping the bots about its existence?


What if someone looks up an internal domain which is actually publicly exposed? Granted, they shouldn't be secret, but that doesn't mean you should broadcast them publicly.


Internal domains generally resolve to private IP ranges, so they don't do any harm when broadcast. Besides, many people use Chrome, and who knows what they do by default with browsing history.
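
A quick way to check what any hostname actually resolves to, using only Python's standard library:

    import ipaddress
    import socket

    def resolves_to_private(hostname: str) -> bool:
        """True if every IPv4 address behind hostname is in a private range."""
        try:
            infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
        except socket.gaierror:
            return False  # doesn't resolve at all
        addresses = {ipaddress.ip_address(info[4][0]) for info in infos}
        return all(addr.is_private for addr in addresses)

    print(resolves_to_private("intranet.example.com"))  # placeholder name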


Information leakage of any kind potentially gives an attacker something they can use. On its own any given piece of info may seem harmless, but when combined with other info it can start to pose a risk.

Leaking internal hostnames could allow an attacker to build a view of an internal network ahead of time, before actual penetration. Once inside a network it's often a race against time, so this can make a difference.


You’ve obviously never seen orgs with /16 IP ranges, or developers/ops accidentally screw up, for that matter.

Chrome doesn't actively publish DNS lookups on public sites.

I think you’ve missed the point.


FYI, WhoIsMyISP.org tells me the name of my city is my ISP, with the actual ISP name shown as "also known as".


I broke nslookup.io when I entered one of my emoji domains :)


Which one? Because I explicitly added support for that and tested it.


Have you considered signing up as a Brave creator? It could generate more income from those who have auto-contribute turned on.


Do enough people use that browser to make it worthwhile?


No.


I can confirm. I earn about 2-3 BAT (€1.5 - €2) each month for nslookup.io (without serving Brave ads) and whoismyisp.org (with Brave ads) combined. That's ~1M page views across all browsers. Not lucrative at all ;)


Oh, only 50+ million monthly active users as of January 2022 - who will both earn BAT and, by default, contribute back automatically...


BAT isn't turned on by default so the number will be a lot lower than 50 million. And I expect most people to turn off auto contribution to keep the money themselves or manually contribute to sites they like.


I have. In fact, you can check this for any site using nslookup.io ;) (so meta) See the TXT records at https://www.nslookup.io/dns-records/nslookup.io


Congrats on quitting your job! That's a goal of mine as well, and I really like NsLookup and how the page is structured. I have used domainbigdata.com a lot and like how they show other TLDs.

I've been trying to find tools that would show other domains that are owned by or associated with the domain that was searched. I work with a lot of sales leads, and I think having information on other domains a company owns is super powerful for mapping companies and also for understanding company acquisitions; most companies transfer ownership of domains when they are acquired.

I have a few questions and an ask:

1. There are quite a few tools online that do the same thing. Long term, what is your goal to differentiate, or where do you want to focus?

2. Other TLDs or other associated domains: is this something that you would be able to find?

3. Can you determine from a DNS search if a domain is owned by a company or by a non-corporate entity (Blog, personal)?

Also from your blog post, I don't know if you were looking for any responses but:

Plan B - Stage 2: Sell an API to visitors.

> I would purchase and use this for an email verification process. I would rather go with a smaller company than a large one.

Plan C - Stage 1: Make a one-off DNS dataset, give it away to founders, and find out how they'd use it and what they need.

> I would use this for email domain verification and email format verification. I've worked with data teams that had to find email formats & domains; if they had the domains, it would save a ton of time and cost. As an end user, the way I think about this cost is: if I find a domain and run an MX or DNS check on it, I will save the results and try not to run it on that domain again. That is why I like this plan, but the API would be super helpful as well.
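
For what it's worth, the check itself is tiny. A sketch assuming the dnspython package, with the "save the results" part as a simple in-process cache:

    import functools
    import dns.exception
    import dns.resolver  # pip install dnspython

    @functools.lru_cache(maxsize=None)  # save results; don't re-query a domain
    def has_mx(domain: str) -> bool:
        """True if the domain publishes at least one MX record."""
        try:
            answers = dns.resolver.resolve(domain, "MX")
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers, dns.exception.Timeout):
            return False
        return len(answers) > 0

    print(has_mx("example.com"))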


Thanks for your kind words :)

> Long term, what is your goal to differentiate, or where do you want to focus?

Most other tools have clunky UX, dated design, or are painfully slow. I'm differentiating in ease of use, clarity and speed. Still a lot to be done, but I think I'm already doing okay compared to other sites.

> Other TLDs or other associated domains: is this something that you would be able to find?

Since GDPR, nearly all WHOIS queries get their owner details purged. You also can't link domains based on resolving IP addresses, as many sites use a shared CDN. There might be some signals left, but definitely not enough coverage to make it useful.

> Can you determine from a DNS search if a domain is owned by a company or by a non-corporate entity (Blog, personal)?

No, for the same reason as above. Here you could scrape the site and look for 'about' sections and do some ML, but that'd be quite involved.

> I would purchase and use this for an email verification process.

What would you like to use DNS for in email verification?

> Email format verification

As in, detect 'firstname.lastname' for 'john.doe@company.com'? There's already a SaaS that does just that: https://hunter.io/


> I'm differentiating in ease of use, clarity and speed.

FWIW I've worked for a small company for the last ~15 years, and this is basically our differentiator. Most of our products weren't the first in their market, and we don't have all the bells and whistles, but we really focus on making an easy-to-use, non-frustrating user experience.


Thanks for the validation :) Crazy that lack of frustration can be a differentiator.


Builtwith.com is excellent for domain association. They have some very good common keys, like GTM container IDs, GA tracking IDs, etc.


A really great goal. I'm trying to help people with this aim: recommendations on what to do, getting your business model figured out, etc. https://cxo.industries


Hi HN!

Ruurtjan here, from NsLookup.io. I've read plenty of these kinds of blog posts, and they've been a big motivator for me to take the jump. So I thought I'd share my story here as well. Thanks everyone, for giving me the confidence to give entrepreneurship a try.

I'll hang around today to answer any questions, so ask me anything :)


Thanks and good for you! Keep going.


Welcome to the club!


> Mobile game that divides the earth in hexagons

Interesting that I hear about this twice in the space of a week or so. Neal Stephenson was talking about it on the Lex Fridman podcast too. Apparently it's not possible. You'll always need a certain number of pentagons to complete the tiling.


It wasn't until reading your comment that I actually looked at a soccer ball and worked out that they're not all hexagons. Now it feels stupid that I thought they were all the same shape. Funny how long some assumptions are carried - i.e. forever, if the situation never comes up to challenge them.

(I live in a country where other sports are more popular than soccer, hence I call it soccer not football, so my familiarity with the soccer ball itself is not as intimate as in many European countries).


If you're interested, here's how Uber solved it for their internal tooling:

https://h3geo.org/

https://www.youtube.com/watch?v=PutOhe8HVNU


Nah, you just cheat a little and the error is small enough to handwave: https://archive.bridgesmathart.org/2017/bridges2017-237.pdf


Classic engineer vs. mathematician mindset


Perfect is the enemy of good vs. wars are fought over disputed boundaries.

Depends on what's riding on it.

It'd be "funny" if whatever gaps remain in humanity's best tessellation effort were marked as no-go zones, essentially non-existent to the digital / electronic world; un-mappable, therefore illegal to inhabit; there be dragons.


Astronomers came up with a cool way to tessellate a sphere, called HEALPix. The tiles are quadrilateral, though, so they may not offer the same freedom as hexagons in the context of a game where players move to adjacent tiles.

https://healpix.jpl.nasa.gov/


Some code that might help with hexagons here: https://observablehq.com/@nrabinowitz/h3-tutorial-heatmap-re...


This does not seem to address the pentagon issue though?


Yes, but there exists an orientation in which all of the pentagons are over water (h3 uses this). It's impossible to create a global tessellation with only hexagons.
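
The pentagon count is easy to see from the H3 bindings directly. A small sketch assuming the h3 Python package (v3 API; function names changed in v4):

    import h3  # pip install h3

    # The grid starts from 122 base cells at resolution 0...
    base_cells = h3.get_res0_indexes()
    pentagons = [cell for cell in base_cells if h3.h3_is_pentagon(cell)]

    # ...and exactly 12 of them are pentagons; that never goes away.
    print(len(base_cells), len(pentagons))  # 122 12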


Check this out then: https://www.ovr.ai uses the hexagon coordinate system to cover the surface of a sphere. My co-founder told me recently that this hexagon system has been in development in the academic world for about a decade.

In our spare time, our team had been discussing an ideal way to lay some form of "grid" over Earth so we can coordinate climate actions & results between providers and buyers. We wanted a system that has been done before and that we can build on top of.


Got a more specific link that discusses the actual hexagon system? And, your cookie banner is either a horrid dark pattern or broken as heck on ff/android.


Sorry, I have no idea and I have no relationship with the website.


I'm always curious how many visitors you need to reach $500-per-month numbers from ads. I understand the author probably can't reveal exact numbers here, but if anyone knows rough but real-life numbers, I'd be curious to hear them. (I haven't ever done ad-based revenue myself, and I don't trust the CPM numbers I'd find by googling; the spread is just too broad.)


Just to compare, https://nodemailer.com had 194k pageviews (42k "active users" as reported by Google Analytics, whatever that means) in January, and the BuySellAds payout was $156.45.
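
For a sense of scale, that works out to an effective CPM (revenue per thousand pageviews) of well under a dollar:

    revenue_usd = 156.45            # January BuySellAds payout
    pageviews = 194_000
    ecpm = revenue_usd / (pageviews / 1000)  # revenue per 1,000 pageviews
    print(f"${ecpm:.2f}")           # ~$0.81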


I've used nodemailer. Thanks for building it. I have similar stats on my personal project and those earnings are discouraging. Any luck with donations? I've used ko-fi with limited success.

Thanks for putting some numbers out there.


GitHub Sponsors payout in January was $159, and one-off donations via PayPal were $15. That makes a grand total of about $330/mo. It was a regular month as well.


Very discouraging indeed. I gave ads a try on 6groups.com and quickly ran away from them. As an advertiser (because of some other online shops I run), I also know how much it costs to get your ads clicked on. The payout offered to affiliate sites by ad networks is a disgrace, and definitely not worth the space and distraction the ads add.


> I understand the author probably can't reveal exact numbers

I do :) Here they are: https://plausible.io/nslookup.io Last month was exactly 400 USD from Carbon Ads. It does depend a lot on ad placement and target demographic, so your mileage may vary.


The spread is broad because the type of visitor (market), ad network, time on page, type of ad, and number of ads all make a huge difference.


It's interesting that the novel/hobby projects all failed.

The ones that succeeded are those that can be used for marketing by other companies (whoismyisp) or that already exist (nslookup), so there is proven demand but you made a cleaner solution.

I was always into this idea of building MVPs to test markets, but the main issue is getting an audience. Often, success is 98% luck and timing and 2% hard work.


> that already exist (nslookup), so there is proven demand but you made a cleaner solution.

100% this. I've completely moved away from trying to be novel. Pick a product that is profitable but that you think you can improve. Execute it better and gain market share.


Maybe this is a "who would ever use Dropbox" question?

I'm curious: what is the use case for something like nslookup.io, since I can more conveniently get the same info from the CLI?

In any case, congratulations on finding a use case!


> conveniently

For many people it's more convenient to do DNS lookups on nslookup.io than googling the syntax and order of dig arguments, opening a terminal and entering it themselves. Plus, it can be convenient to get an overview of the popular record types, instead of doing 7 dig queries yourself.
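
For comparison, here's roughly what that overview amounts to if you script it yourself: a minimal sketch with the dnspython package (an illustration, not my actual backend):

    import dns.resolver  # pip install dnspython

    # The popular record types you'd otherwise query one dig call at a time.
    RECORD_TYPES = ["A", "AAAA", "CNAME", "MX", "NS", "TXT", "SOA"]

    domain = "nslookup.io"
    for rtype in RECORD_TYPES:
        try:
            answers = dns.resolver.resolve(domain, rtype)
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            continue
        for record in answers:
            print(f"{rtype:5} {record.to_text()}")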

> the same info

Nslookup.io provides more info than dig/nslookup CLIs: info about the A/AAAA IPs, logos for hosting providers, parsing DNS records, and more to come ;)

> In any case, congratulations on finding a use case!

Thank you!


Nice! I quit my job 8 months ago to build free data tools.

https://github.com/multiprocessio


I just took a short look at it, and it seems very polished. You did it all in 8 months? I'm beyond impressed. Great job, dude!!


Thank you!


That's really cool! I'm following you on Twitter to see how you approach it :) It's a similar problem space to what I'm doing with nslookup.io, but you've chosen a different angle. Super interested to see how it works out for you.


Your traffic is incredible. I would work to understand the business use cases better and create a B2B niche service tailored specifically to one or more of those (API, more detailed info, custom solution, etc.). I don't know that growing your traffic overall needs to be a target. Rather, catering to an existing segment and/or growing that segment may be a better use of time and resources.


I hope that you make more than your salary. How did you attract users to the website? It's very hard to get users to visit your site.

I have posted my site (https://text2db.com/) on Hacker News, Product Hunt and Reddit to get reviews, but got very few views (I guess the idea is bad?). Please recommend some tips on how to increase traffic to the site.

Thanks.


> I hope that you make more than your salary

Not yet by a large margin, but it's going up slowly :)

> How did you attract users to the website?

Currently, mainly through SEO. My stats are open [1], so you can see the percentage that comes in through search engines. In my case, there's a large search volume for 'nslookup', which is currently the largest chunk of traffic. An 'exact match domain' helped for sure. But in the end, Google will be able to gauge how well you're able to solve the problem that made people search for something. So an excellent product * the search volume * backlink profile * how well Google can understand what you're solving will approximate the number of users that visit your site.

You can use tools like Ahrefs [2] (which also has free tools) to get a sense of the search volume and competition on those keywords.

Hope that helps!

[1]: https://plausible.io/nslookup.io

[2]: https://ahrefs.com/


I used to do a lot of research on SEO and the methods behind the madness of getting decent organic traffic, and it basically always boiled down to quality content built around targeted keywords, phrases, or related topics.

The way I always imagined accomplishing this was to write, or hire someone to write, this content and present it in a blog format. It doesn't seem that you do this on these properties, so, briefly, what kind of SEO optimization can be continually done outside of that? Are you still writing content as a guest poster and generating reputable backlinks? Engaging communities that are focused around your topic and promoting through those channels in hopes of generating backlinks? Or am I just in left field with all of this and what I thought I knew about SEO?


I think you have a pretty good idea on how SEO works. I try to do multiple things, and hope some of them bear fruit.

1. This very blog post is part of the SEO effort. A temporary influx in traffic, a couple of backlinks, and a larger following on Twitter (+80 followers from this post alone).

> The way I always imagined accomplishing this was to write, or hire someone to write, this content and present it in a blog format.

2. I've got an ex-Microsoft employee who writes content on a freelance basis. We do around one knowledge base article or blog post per month.

3. Original content that's genuinely useful and brings something new to the table (like [1]) takes a lot of time, but pays off in terms of backlinks. The image, by the way, is released under Creative Commons with attribution, so I occasionally check whether websites use it and ask for a backlink.

[1] https://www.nslookup.io/learning/dns-record-types/


Awesome, thanks a lot. :)


Don't let one random person dissuade you, but my take is that while you consider your text format simple, it's one more thing I'd have to learn the structure and symbols for.

Another issue you have is that if someone already understands the concept of foreign keys, then they've probably already figured out basic CREATE TABLE statements. Lastly, learning a new tool for something that is just a blip (table creation only) doesn't seem worth it.

So your audience for something like this is people who sort of get what a database is but don't know CREATE TABLE statements, or people who struggle with tooling enough that an easier mechanism might be useful. I'd suggest pivoting to something like a spreadsheet2db tool, where filename == database name, sheet name == table name, and column name == column name. The spreadsheet approach also lets people actually provide data. You can also round-trip and have db2spreadsheet. Once you start supporting rows of data instead of just table creation, you open up a lot more use cases. Plus, a lot of people have basic spreadsheet skills.

Everything above is completely off the cuff, and I've done zero research to validate the idea. I know there are some integrations in Excel, and some tools like SQL Developer partially support this. Another option (to support data in addition to table creation) might be to look into supporting Markdown or reStructuredText tables. It might be cool to be able to prototype a database from a basic doc page: markup2db.
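
To make that off-the-cuff idea concrete, the core mapping is only a few lines. A rough sketch assuming pandas (plus openpyxl for .xlsx files), emitting DDL only, with no keys or data rows:

    import pandas as pd  # plus: pip install openpyxl for .xlsx support

    # Naive dtype -> SQL type mapping; anything unknown falls back to TEXT.
    TYPE_MAP = {"int64": "INTEGER", "float64": "REAL",
                "bool": "BOOLEAN", "datetime64[ns]": "TIMESTAMP"}

    def spreadsheet_to_ddl(path: str) -> str:
        """filename == database, sheet name == table, column name == column."""
        sheets = pd.read_excel(path, sheet_name=None)  # dict: sheet -> DataFrame
        statements = []
        for table, frame in sheets.items():
            columns = ", ".join(
                f"{name} {TYPE_MAP.get(str(dtype), 'TEXT')}"
                for name, dtype in frame.dtypes.items()
            )
            statements.append(f"CREATE TABLE {table} ({columns});")
        return "\n".join(statements)

    print(spreadsheet_to_ddl("crm.xlsx"))  # hypothetical input file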


I disagree partially. I like the idea of a simple way to get a barebones schema, and although I'd probably rather use a simplistic GUI than text, it's not bad.

My bigger problem is that the website feels unprofessional, in a way that if I found it off a random Google search I'd probably not come back:

1. Part of the website is a screenshot of text rather than actual text for seemingly no reason?

2. The examples aren't raw text and include HTML list elements which makes copy and pasting them awkward.

3. The rest of the website really needs to be run through a grammar checker and proofread.

4. The output of the tool is hard to read and poorly formatted.


Thanks, I'm glad you like it.

> Part of the website is a screenshot of text rather than actual text for seemingly no reason?

I shall fix it.

> The examples aren't raw text and include HTML list elements which makes copy and pasting them awkward.

I made it in list form so users can understand which symbols are table/column names. I can provide a copy button on the samples.

>The rest of the website really needs to be run through a grammar checker and proofread.

Sorry, English is not my native language, but I shall try to proofread it.

>The output of the tool is hard to read and poorly formatted.

I shall add spacing between lines. You can also stretch the text area, based on your requirements.


Just wow, amazing ideas you've got there. I shall look into them. Thanks a lot!


I’ll bite.

Your opening paragraph is full of bad grammar and punctuation errors, which will instantly turn many people off.

> Reduce your time to create database , write down your database name,tables and foreign key in simple text with pre defined symbols and convert it into database script. paste that script in your development environment


Thanks, I like that bite :)

I have rewritten and corrected the text. Please check it now.


It's nice to see these kinds of examples popping up every now and then. I hope to learn a lot, and I'd also like to know about the use cases for nslookup.io. One that comes to mind: if I were to build a non-targeted crawler of some sort, I could skip implementing my own DNS crawling infrastructure and just use nslookup.io, caching the results.


If you like these kinds of stories, check out indiehackers.com. I took a lot of inspiration from them, and it was part of the reason I made the jump.


Why is he saying he created Project Euler? Colin Hughes created Project Euler.


You're the second to misinterpret that, so I'll reformulate to make it clearer. I did Project Euler exercises as a learning project.


What's a good information source for this? Like, is this just something built on a domain tools API? Or is it more of a front end for dig?

Congrats though, great looking site and it's super snappy.


Right now, it's just a glorified dig wrapper. I'm slowly adding features that dig is missing, like IP information (through ipinfo.io), favicons and logos.
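
The IP information part, for instance, is one HTTP call per address. A sketch against ipinfo.io's public JSON endpoint (illustrative, not my actual code; heavy use needs an API token):

    import requests

    ip = "1.1.1.1"  # placeholder address
    info = requests.get(f"https://ipinfo.io/{ip}/json", timeout=10).json()
    # Typical fields: ip, hostname, city, region, country, org
    print(info.get("org"), info.get("city"), info.get("country"))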

> Congrats though, great looking site and it's super snappy.

Good to hear, thanks!


Great domains, and the service is straight to the point. Good luck!


Thank you :) Even though it's straight to the point, there's so much more to do to help people fix their DNS issues faster. I'll try not to compromise on the simplicity while adding more information.


Kudos to you! I wish you all the success in the world. I greatly admire people who take this path and aren't trying to be the next unicorn.


Thanks Jason, I appreciate it :)


Hey. I'm cheering for you. Wishing you a lot of success! :))) Hopefully the next post will be about how you now make 4000 a month.


Seems like the kind of thing that's a lot more feasible in a functioning welfare state like NL.


My job is to make web tools that people use for free. I can set my own hours, live wherever I like, and someone else shoulders the stress of keeping a business afloat.

It's the world we live in. Once you break free of the box, your parents' dreams, you can realise your own version of what "life is".


What are you working on? Please reach out on Twitter to join our web tool makers community :) https://twitter.com/Ruurtjan


gazelle-kopen.nl

I would have expected this to rake in some SEO traffic, though, so I'm surprised that didn't work for you. Or maybe surprised isn't the right word; maybe I just had high hopes for it when you registered it and tried it :)


Maybe I should have stuck with it for longer, or just made the site better. The idea was that the official Gazelle site listed their dealers behind a postal code search form, which couldn't be indexed by search engines. So I scraped it and presented each dealer on a separate page, with overview pages per city. I didn't manage to get a lot of traffic, but I was also super new to SEO back then.


I like the simplicity and cleanliness of the designs on both sites. Nice job.


Thank you! Nice to hear that one of my explicit design goals worked out :)

In my opinion, these kinds of tools, which you only need once in a while, need to take as little cognitive load as possible. People have something to get done; don't get in their way.


The way I read that title made me think he quit a "real job" (meatspace) to work in FAANG, not that he was already a software developer.


Is it hard making a living in the beginning?


I’m not yet making a living and it took me years to get to this point. I think I’ve got a reasonable shot to make a living off this now, though.


Please tell me, how do you make money with nslookup? (Because I have a dozen other commands I'd like to monetize :)


I run ads on nslookup.io (400 USD/month) and an API (100 USD/month). Stats are here: https://plausible.io/nslookup.io



