If you built this out so it handled multiple locations, browsers, pages, etc., we would pay big bucks for it.
And why is every tool on HN now called a startup?
The WebKit inspector is a great tool, but it doesn't track performance over time, which is the goal of Slowcop.
PS: running slowcop on slowcop.com yields a few areas of improvement ;-)
One issue is that I'm going to forget about your service by tomorrow. I only optimize my website when I make significant changes to the design or template, which is just a handful of times per year.
I know some Mozilla folks were working on an automated YSlow tool called Cesium, but progress seems to have stopped there, so I'm glad someone picked up the torch.
Interesting to note that a lot of the suggestions I received were about minifying external JS files. It is kind of ridiculous how much external JS every page has now.
Only complaint is that I had to click to expand the page speed problems. You might want to add a horizontal triangle or some other visual indicator that there is more to look at. Either that, or expand them all by default but allow people to close them.
Only issue was that it suggested I could minify my JS and gain a 0% reduction in a few cases.
You're right, there's a lot of noise in the results. If the size reduction is small, like a few bytes, there's no need to show that.
Something useful to add would be links to the compressed images you used to calculate the potential savings. A side-by-side comparison would be pretty useful, so I could see how the compression changes the look of the page.
Just for fun I ran it over http://duckduckgo.com which comes back with 100. Google comes back with 98. Interesting.
How about this one: make it game-ish. "Your rank of 88/100 means your site loads faster than 75% of the sites we've tested," or "Congratulations! You've unlocked the Road Runner Badge! Make the following improvements to unlock Speedy Gonzales."
Next features that would make it very useful to me are (in order):
- Display timings on the timeline (in Chromium I can't see them, maybe display them on click or hover)
- Recurring checks (of course this is your core; I imagine you're already working on this)
- Different locations in the world (including being able to slice up my reporting based on the location)
- Custom alerts on specific URLs (URL X cannot take more than Y seconds to load inside my page, beyond more classic ones like total page load time and such)
- Hot cache vs. cold cache comparisons
- In case of alert also generate a tcptraceroute and compare it to one that is collected every X minutes.
- Ability to set the Host header separately (so I can use the IP address in the site URL and the hostname for a specific virtualhost; this is useful when a site is geographically distributed and you just want to cut out the DNS lookup).
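The host-header bullet above boils down to sending the request to an IP address while naming the virtualhost yourself. A minimal sketch of the idea in Python; the IP and hostname here are placeholders, not anything real:

```python
# Sketch of "set the Host header separately": the request targets the
# server's IP directly (no DNS lookup needed), while the Host header
# names the virtualhost we actually want to measure.
# 192.0.2.10 and www.example.com are placeholder values.
ip, vhost = "192.0.2.10", "www.example.com"
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {vhost}\r\n"
    "Connection: close\r\n"
    "\r\n"
)
# A real checker would now open a TCP connection to `ip` and send `request`;
# the server routes it to the right site based on the Host header alone.
print(request)
```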
Also have a look at many of your potential competitors like Gomez.
Gomez is probably the largest competitor. They do a lot of enterprise sales, which isn't something I'm planning any time soon.
It does have the same problem as all the other speed checkers I've tried: when you have a decently optimized page, most of the errors or problems it finds have to do with external services over which you do not have much control.
For example, Facebook widgets and Google APIs (Analytics, Charts, ...).
It is always a little frustrating when you are told to change something that you cannot really influence:
In the Resource Timeline, when hovering, it would be nice to see the exact milliseconds in addition to the existing proportional colored rectangles. The absolute total time could be added to the black hover div on the left.
Thanks to the tool I discovered that the DNS time of my domains was far from perfect. Thanks! (My host is on Amazon EC2 West, but my DNS is at the French registrar Gandi.net...)
Also nice would be the performance on reload (i.e., with a hot cache instead of a cold one).
Otherwise looks like a great tool.
(FWIW I'm an avid Go programmer and filed an issue on httplib.go when it was broken by release.2011-02-15. Thanks for the quick fix!)
The main problem with using YSlow is it doesn't show performance trends over time. One of my goals for Slowcop is to give a dashboard where you can track performance across deploys.
Also, there are a bunch of other tools and features I'm planning to add, like measurements from different regions and tools that track lower-level HTTP issues.
I dislike Y!Slow, though there aren't many good alternatives. This is a good contender.
My own suggestion: make sure to reference relevant tutorials in the 'Academy' section from within the reports. I made a couple of reports before really finding the 'Academy', which could be a very valuable resource.
- On the report page, the call to action should be highlighted more.
- The call to action could also be positioned at the bottom of the page. Since I immediately want to scroll down to see my site's results, I will most probably skip the one at the top.
- The graph does not show times (at least for me), just colored bars and the legend.
- I don't know if you're heading toward the page-execution side of the problem, but if you do, when you show specific vertical bars, add a hint on the graph explaining the significance of each bar ("This is where jQuery's document ready fired," etc.). Hover popups would be much better for inexperienced folks.
- Allow me to exclude some warnings from the report and future reports. The Google Analytics script gets a caching warning, but that's a given and not something I can improve.
- Results can be collapsed by default.
but this doesn't:
See http://www.ietf.org/rfc/rfc2616.txt, section 3.2.2 for the spec.
Otherwise, looks very useful.
Just a suggestion.
Make it clear that a higher score is better for page download speed. I think this is true, but there's no real indication of it on the page.
Some kind of explanatory histogram showing relative performance vs. other websites would be useful. It might even be useful to group load time by page type and size (landing page vs. web app internal page, for example).
For the tune-up items at the bottom of the page, you've reversed the scale -- high numbers are now less significant than low ones. Also, it seems that a 100-point scale here might be overkill.
For example, under "Minify CSS", you could say: "Need help? Here's a CSS compression tool."
Is this due to HTTPS? This site really should support HTTPS...
EDIT: If I enter "http://grepular.com/" instead, it works. That URL simply redirects to the HTTPS version...
EDIT2: The only listed "problem" for my site is:
Remove the following redirect chain if possible:
If you don't have cookies configured to be HTTPS-only (you don't), an attacker can grab a user's cookies. They can also intercept your redirect and send users to another site (see sslstrip).
Me putting a redirect from HTTP to HTTPS is no less secure than not providing HTTP at all.
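For what it's worth, the cookie side of this is a one-flag change on Set-Cookie. A quick sketch with Python's stdlib (the cookie name and value here are made up):

```python
from http.cookies import SimpleCookie

# Mark a session cookie as HTTPS-only and JS-inaccessible.
# "session"/"abc123" are placeholder values for illustration.
cookie = SimpleCookie()
cookie["session"] = "abc123"
cookie["session"]["secure"] = True    # only ever sent over HTTPS
cookie["session"]["httponly"] = True  # not readable from page JavaScript
print(cookie.output())
```

With the Secure flag set, the cookie never travels over plain HTTP, so an sslstrip-style downgrade can't capture it.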
The only thing is that it's really into minifying the CSS and gives high numbers for potential savings. It'd be more helpful to display a number that takes gzip compression into account, something like "Minify this CSS to save 31% (5% after compression)".
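A compression-aware savings number like that is cheap to compute. A rough sketch (the sample stylesheet and the crude whitespace-stripping "minifier" are invented for illustration; on a tiny repetitive sample like this the post-gzip figure can even go negative):

```python
import gzip

# Toy stylesheet, repeated to give gzip something to work with.
css = "body {\n    margin: 0;\n    color: #333;\n}\n" * 50
# Crude stand-in for a real minifier: strip newlines and indentation.
minified = css.replace("\n", "").replace("    ", "")

# Savings before and after gzip compression.
raw_saving = 1 - len(minified) / len(css)
gz_saving = 1 - len(gzip.compress(minified.encode())) / len(gzip.compress(css.encode()))
print(f"Minify this CSS to save {raw_saving:.0%} ({gz_saving:.0%} after compression)")
```

The point the comment makes falls out directly: gzip already removes most of the redundancy that minification targets, so the honest number is the second one.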
Question though: are the page loading times measured by... WebKit? Gecko? A home-brewed page loader?
Maybe Rackspace is hiding something :)
I especially like the 'try it and then sign up afterwards' aspect.
Obvious UI, works fast, and presents the information in a very clean and clear format.
The blog & academy pages could do with some of the polish of the main site, but that is understandable! I also wasn't totally clear on what the NN/100 numbers on the report page represent.
1. The "generating" process didn't do anything until I hit refresh, then everything appeared.
2. The "Forward a copy of this report via email" link doesn't do anything? (I'm in Chrome 8)
3. The improvements under each header could be presented in a much easier-to-read format.
The forward link should work. The report page has a lot of content, and forwarding is done with a lightbox. Try again?
What about the Firebug Net tab, or YSlow, or Chrome's Resources panel?
I've always used Pingdom and YSlow. I would consider spelling out the differences so users know what your USP is.
Am I on Hacker News? I had to check the site header real quick.
Suggestion, try changing your URLs.
By doing this you will get a lot of link traffic.
Might I therefore suggest a hybrid?
will run the test on www.slowcop.com at the time the link is clicked, but have a way to retrieve a permalink URL to that particular test, something like:
so you can reference old tests for comparison.