

Show HN (again): Web-based Link Checker with Blacklist Lookup - awulf
http://drlinkcheck.com/

======
mcknz
Looks good & works well, based on a couple of sites -- unsure about the doctor
image. It's funny, but it might prevent some people from taking the site
seriously.

\-- remove "example.com" from the search box when it gets focus?

\-- rather than blacklist certain domains, maybe you could cap the number of
link checks? With utilities like this I always try a popular site to see how
it works, before entering one of my own sites. I was unable to do that, which
was a little frustrating at first.

\-- add a quick description of service on main page? Not sure if "check your
website's links" would be obvious to everyone.

------
tlack
If you're looking for feature ideas, it might be interesting to be able to
tell how deeply buried each link is from the homepage. I have some sites with
a ton of content and I'd love to know how easily reached different pages are.

------
symkat
Looking at my logs I only see 5 requests (maybe I'm grepping for the wrong
thing?) but the UI shows many more links checked.

    
    
      grep www.drlinkcheck.com  /var/log/nginx/access.log | wc -l
      5
    

All of those have the user agent 'Mozilla/5.0 (compatible; dlcbot/0.1;
+http://www.drlinkcheck.com/)'

Also, I think it would be useful to add a section in 'About' that states the
user agent the crawler uses.
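For reference, counting the crawler's requests by user agent instead of by
hostname sidesteps the kind of log confusion above. A quick sketch (the log
path and sample lines here are made up for illustration; substitute your real
nginx access log):

```shell
# Hypothetical sample log; in practice you'd point grep at your real access log.
cat > /tmp/dlc_sample.log <<'EOF'
203.0.113.5 - - [20/May/2013:10:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; dlcbot/0.1; +http://www.drlinkcheck.com/)"
203.0.113.9 - - [20/May/2013:10:00:01 +0000] "GET / HTTP/1.1" 200 5678 "-" "Mozilla/5.0"
EOF

# Matching on the bot's UA string catches its requests regardless of
# which host or location block they were logged under:
grep -c 'dlcbot' /tmp/dlc_sample.log
```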

~~~
awulf
That's strange, your grep looks fine to me. Is it possible that you have
configured nginx to use different log files for different locations (or maybe
don't log at all)? Other than that, I don't have any good explanation.
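To illustrate what that would look like (a hypothetical sketch, not anyone's
actual config): a `location` block with its own `access_log` directive keeps
matching requests out of the server-level log entirely.

```nginx
server {
    # server-level log
    access_log /var/log/nginx/access.log;

    location /old/ {
        # requests matching this location are logged here instead,
        # so they never appear in the main access.log
        access_log /var/log/nginx/old.log;
        return 301 /new/;
    }
}
```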

~~~
symkat
Oops, I'm a dummy and hadn't had enough coffee yet. That file was only logging
the 301s; the actual target was logged elsewhere, and that log has the
expected number of requests.

Overall pretty awesome, thank you!

------
awulf
This is a repost from about two weeks ago. Back then, I had to remove the link
after a few minutes because my little server couldn't handle the traffic spike
and ran out of memory.

Over the last few days I've added more server capacity and made some major
changes to make the service more robust. I hope the work was worth it.

~~~
hsmyers
Don't much care about links that are not broken---do care about links that ARE
broken. Need some way to create a list of those and then download it.
Like the interface and the look and feel---good work, keep going!

~~~
awulf
To filter the list, just click on "Broken" in the "Link Status" box on the
right.

------
moustachio
I put together a usability review of drlinkcheck.com using moustach.io. I hope
you find it useful.

[http://moustach.io/welcome/e/reviewed/zNQi0Ddm0hFnDC4uEKvPYL...](http://moustach.io/welcome/e/reviewed/zNQi0Ddm0hFnDC4uEKvPYLt56cIyT3CkS_8doD1Ql9c/nHkFS62xP1E7L2N60B_9Sg)

Good luck.

------
Mamady
The service is a good MVP - now it's time to beef up the features.

I have a friend who is working on a very similar product - it will be launched
sometime in the next few weeks... keep an eye out for competition ;)

------
logicalmike
Works very well. And nice touch on the auto-updating pagination.

------
darwindeeds
Very useful little tool. Now find a way to monetize it :)

------
jaequery
Looks neat. But what can this be used for?

------
edbloom
nice work! I see you're running nginx as your web server. Any more details on
the dev stack you used?

~~~
awulf
It's an ASP.NET MVC 3 app running under Mono on Linux. The main components of
the stack are:

\- nginx

\- Mono

\- ASP.NET MVC 3

\- FastCGI Mono Server

\- MongoDB
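For anyone curious how these pieces typically fit together: nginx forwards
requests over FastCGI to the Mono server on a local port. A minimal sketch
(the port, paths, and server name are assumptions, not awulf's actual
config):

```nginx
server {
    listen 80;
    server_name example.com;            # placeholder

    location / {
        # hand the request to the FastCGI Mono Server
        fastcgi_pass 127.0.0.1:9000;    # must match the Mono server's socket
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```

The Mono side would then be started with something along the lines of
`fastcgi-mono-server4 /applications=/:/srv/app /socket=tcp:127.0.0.1:9000`
(again, paths are illustrative).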

~~~
edbloom
as someone who has a good bit of asp.net and nginx experience - i never
thought of using nginx as a proxy for ASP.net running on mono/linux - cool
setup dude!

