
Jekyll CMS on Amazon S3 and MaxCDN - jdorfman
http://blog.netdna.com/developer/supercharge-your-site-with-jekyll-s3-and-maxcdn/
======
lacerus
After using Jekyll for two years with lots of hacks to support HAML and other
tools, I switched to Middleman, which has everything I need out of the box:
<https://github.com/middleman/middleman>

~~~
lukeholder
I have also. It really is wonderful.

------
sudonim
Current MaxCDN customer here. I'm really excited about this. I host multiple
sites on CloudFront but serve assets for a Rails app from MaxCDN. I wanted to
consolidate about a month ago. When I asked MaxCDN support why I couldn't use
www, they just said no and that I should move to NetDNA (their other brand) to
do it.

From the blog post:

"Note: MaxCDN does not automatically allow you to create a www CNAME in order
to protect their system from DDoS attacks. To enable this functionality, email
MaxCDN directly and mention this blog post."

I'm going to ask again and see how it compares against the experience of
deploying to CloudFront. One downside of CloudFront right now is that it often
takes a while to expire the cache.

Edit: Here's the response I received 5 mins ago from support after asking for
this...

"Thanks for emailing us and I do apologize for the delayed response. I will
need to escalate this to our support engineers for verification since we don't
allow the use of "www" as part of custom domains for security purpose and also
this needs higher level of access."

~~~
jdorfman
@sudonim email me your account and I will enable it for you: jdorfman at
maxcdn dot com

~~~
sudonim
Awesome. Thanks for your help.

------
pclark
Am I missing something or is Jekyll an outrageously user _un_ friendly
blogging platform?

I have to create a text file, specify the layout, make sure the timestamp and
title are in the filename, and then write HTML.

I am stunned no one has made a simple scaffold - a simple web app that lets me
create a post with a button click and then write in Markdown.
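For what it's worth, the manual steps described above (dated filename, YAML front matter, layout) are easy to script. A rough sketch in Python, assuming Jekyll's default `_posts` conventions; the `new_post` helper is my own, not part of Jekyll:

```python
import datetime
import re
from pathlib import Path

def new_post(title, posts_dir="_posts", layout="post"):
    """Scaffold a Jekyll post: dated filename plus YAML front matter."""
    date = datetime.date.today().isoformat()
    # Jekyll expects filenames like 2013-01-22-my-title.md
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    path = Path(posts_dir) / f"{date}-{slug}.md"
    front_matter = (
        "---\n"
        f"layout: {layout}\n"
        f'title: "{title}"\n'
        "---\n\n"
    )
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(front_matter)
    return path
```

From there, writing the body in Markdown is the only remaining manual step.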

~~~
sprice
You're looking for <http://prose.io>, built to work with Jekyll.

~~~
straws
OK, this project and that Substance editor are amazing.

------
kevinSuttle
1. Jekyll isn't a CMS per se. (I know this is a nitpicky thing)

2. GH-Pages actually performs a ton of server-side optimizations.

I use it, and my site currently scores 95 on YSlow. You can't use .htaccess on
GH-Pages yet; still, this is what they do for you: <http://d.pr/i/81zP> Note
that the grade E in there is my fault: I have a favicon URI, but I never
uploaded a favicon. Everything else is optimized automatically.

------
hayksaakian
How does Jekyll + S3 + CDN compare to a dynamic Rails site with russian doll
caching (cache digests) backed by something like Redis?

~~~
benmanns
You'll likely get similar performance as long as your comparison is between
Jekyll + S3 + CDN and Rails + caching + CDN. Otherwise, the worldwide edge
caching of a CDN will outperform a Rails app hosted in a single region.

------
kerno
We're very interested in maximising site speed (which we are currently
reviewing - too many plugins are slowing our Wordpress site down), but I don't
think we could easily manage to produce a new post every day using a static
site generator - am I wrong?

~~~
dbaupp
What problems do you see that would stop you from posting every day?

------
eli
Any particular reason to use MaxCDN over CloudFront?

Edit: just realized why that's a silly question. Would still be curious to
hear the answer though.

~~~
ccorda
I made this decision just this week, the biggest reason being gzip
compression. S3 will only store your file either compressed or uncompressed,
and you have to do server-side gzip detection to serve the proper version,
which isn't possible with CloudFront [1]. You also have to gzip locally and
upload both versions.

With MaxCDN, I just upload uncompressed files, and it will gzip certain text
file types (xml, js, css, etc.) upon request [2].

[1] <http://blog.kenweiner.com/2009/08/serving-gzipped-javascript-files-from.html>

[2] <http://www.cdnplanet.com/compare/cloudfront/netdna/>
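To make the CloudFront workaround concrete: since CloudFront serves exactly what S3 stores, you end up pre-compressing locally and uploading both copies with the right headers. A rough sketch of that preparation step; the file-type set and function name are my own choices, not anything MaxCDN or AWS prescribe:

```python
import gzip

# Text types worth pre-compressing (roughly what a CDN would gzip on the fly).
COMPRESSIBLE = {".html", ".css", ".js", ".xml", ".json", ".txt"}

def variants(path, data):
    """Return the (suffix, body, headers) variants to upload to S3.

    CloudFront serves whatever S3 stores as-is, so text files need both a
    plain copy and a pre-gzipped copy, the latter with Content-Encoding set.
    """
    ext = "." + path.rsplit(".", 1)[-1] if "." in path else ""
    out = [("", data, {})]
    if ext.lower() in COMPRESSIBLE:
        out.append((".gz", gzip.compress(data), {"Content-Encoding": "gzip"}))
    return out
```

The server-side piece ccorda mentions is then picking between the two keys per request based on Accept-Encoding, which is exactly the part CloudFront couldn't do for you.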

~~~
HeyImAlex
I feel like this is almost a non-issue unless you're running a site that
_absolutely needs_ to run everywhere, since the last browsers that shipped
without gzip support were released in what, 1998? A significant portion of
your users (according to O'Reilly's 2009 _Even Faster Websites_, around 15%)
don't advertise gzip support in their request headers even though they can
handle it. Just ignoring the Accept-Encoding header and serving only gzipped
content is probably good enough for 99% of people, and likely gives you
slightly better overall performance on average than actually respecting
whatever the request headers say.

~~~
jdorfman
@HeyImAlex A week ago I would have totally agreed with you. I asked a friend
and he told me "put a misconfigured upstream proxy in the path and you run the
risk of serving an uncompressed version to an end user that supports gzip, and
vice versa."

------
tantalor

      On the Obama campaign we made our donate pages 60%
      faster and got a 14% increase in donation conversions.
    

Did they test this? For what purpose? We've known this is true since 2006.

<http://www.zdnet.com/blog/btl/googles-marissa-mayer-speed-wins/3925>

------
the1
or just start blogging on gist.github.com. your blog url is
[http://gist.github.com/&lt;your-github-id&gt;](http://gist.github.com/<your-github-id>)

------
molecule
Boto is awesome for interacting w/ AWS, but managing an application stack in
Ruby (Jekyll) and a deployment stack in Python
(<https://gist.github.com/4596766>) does not seem optimal.

