I'm not an expert in these sorts of things, but even that is probably asking for trouble (though considerably less so).
What's worked for me in the past is to generate a random string each time I create a session for the user, valid to create exactly one session. That string is consumed on use, and a new one is generated and saved to the cookie (which, again, is good for the NEXT login).
I'm sure it's also far from perfect, and causes potential havoc for users switching devices, and that sort of thing (though, where I've applied it, that was considered a feature, not a bug -- YMMV).
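A minimal sketch of the rotation, assuming Node with an Express-style response object and a made-up db helper:

    const crypto = require('crypto');

    // On each successful login, burn the old token and issue a fresh one.
    // The new token is only good for the NEXT login.
    function rotateLoginToken(userId, res) {
      const token = crypto.randomBytes(32).toString('hex'); // unguessable random string
      db.saveLoginToken(userId, token); // hypothetical helper; overwriting invalidates the old token
      res.cookie('login_token', token, { httpOnly: true, secure: true });
    }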
Back on subject, Pen.IO looks money, but I'd be worried about running out of page names fairly quickly. Have you thought about tying those to an account? bmelton.pen.io/test isn't quite as good as test.pen.io, but in 3 months, I don't like the odds of getting a page name less than 10 characters... and this problem only gets worse as you get more popular.
I wanted to test this and decided on the name "test", which was taken. I tried "test1", then "test12", and so on.
My point: let me choose whether I want a nice URL, and only then tell me if it's taken. In the meantime I would have been happy with something like pen.io/fjS7f.
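Generating those is trivial; a rough sketch (purely illustrative, and you'd want to check each result for collisions against existing names):

    // Generate a short base-62 slug like "fjS7f".
    function randomSlug(length) {
      const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
      let slug = '';
      for (let i = 0; i < length; i++) {
        slug += chars[Math.floor(Math.random() * chars.length)];
      }
      return slug;
    }
    // e.g. 'pen.io/' + randomSlug(5) -> something like "pen.io/fjS7f"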
Apart from that: great idea and nice design.
Wow, I didn't know there were so many of these around. I just built a simple one for myself to publish clean notes, and some stuff I wouldn't post on my blog.
I use Devanagari script extensively and have never faced this problem on other sites (FB, GMail, etc.). I changed the default encoding to UTF-8, but to no avail.
Future update idea: a way to collate the pages you've created, so a page isn't lost in the void if you forget the URL and need it months from now.
Very cool idea, and nice site. I noticed that you are serving your own jQuery. I've read that it's better to link to Google's host, as it's more likely to be cached (among other reasons). Is this a conscious decision on your part, or is it just a part of the puzzle you haven't wrestled with yet? (Honest question: I don't know the right answer because I haven't wrestled with it yet.)
I consciously prefer serving it myself rather than from Google: a first-time visitor will be requesting other static files from the site anyway, so a cached copy of jQuery makes minimal difference to load time (although it does save on bandwidth).
However, if there isn't a cached copy of Google's jQuery, there's the overhead of a DNS query and a new HTTP connection to Google.
Compared to the already-open keep-alive connection to my static server, that increases load time dramatically.
First impressions count, and you have a few vital seconds to make a good one.
You shouldn't depend on CDNs, as there's no guarantee they'll be up all of the time. I usually have a local fallback that can be triggered with something like this (the version number and local path are just placeholders):
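    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
    <script>
      // If the CDN fetch failed, window.jQuery won't exist; fall back to the local copy.
      window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>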
The second script looks for the jQuery global object that should exist after the CDN fetch. If it doesn't exist, it knows to get your own copy.
(If you're wondering "hey, where's the 'http:' part in that src attribute?", it's because a protocol-relative URL is a safer way to ask for a resource when you don't know whether you're under http or https.)
Also, you should try to place your <script> tags near the bottom of the <body> rather than in the <head>, so that they don't block the rest of the page from loading/rendering.
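Roughly like this (a skeletal illustration; the filenames are invented):

    <body>
      <!-- visible content first, so it renders without waiting on scripts -->
      ...
      <!-- scripts last: they block rendering of whatever follows them -->
      <script src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
      <script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');</script>
      <script src="/js/site.js"></script>
    </body>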
The point made (below) by @dspillet is correct, and important for one very good reason:
You will almost certainly have scripts included after these two that depend on the jQuery object existing (otherwise what's the point in having jQuery at all). So imagine this trick wasn't used, and you just served up a local (or CDN-hosted) copy of jQuery, then started using it in later scripts. It's reasonable at that point to assume that jQuery exists: each script blocks, and even when it doesn't, the browser will still make sure the scripts execute in order. So it's perfectly safe to use this snippet without worrying about the order of things.
This is exactly why I (and many others) suggest that you put all of your <script> tags at the bottom of the <body> element. They block page rendering, so if they're in the <head>, or dotted around the <body>, they're going to delay the presentation of the page to the visitor.
I believe all browsers block execution of the script (and rendering of subsequent content), so his code should work generally.
Even the latest browsers that don't block further requests during the download and execution of a script will still execute scripts sequentially, so his "is jQuery present" check won't fire until the external script has either returned and executed (the check passes, and nothing else happens) or errored (jQuery is not present, and the document.write executes, loading it from the local resource).
I can't use external CDNs in my day job, as our clients require certain audits that I doubt any CDN would agree to, though that isn't a problem for this project.
The reason I serve my own jQuery (rather than using the CDN-with-local-fallback option in collypops' reply), even for my own personal projects, is the paranoia of not wanting to trust code from an external source. Granted, Google's CDN (or any of the other big players) is much less likely to get hacked than my personal servers, but it's also a much more attractive target for a DNS-poisoning attack. If an attacker manages to convince many people's machines to send their jQuery requests to him rather than to Google, then any site using that CDN could have unwanted code injected. If I serve my own jQuery file, that risk is gone (unless a DNS-spoofing attack targets my domain names specifically, of course, but I'm not a big enough fish for anyone to bother trying).
One reason not to use CDN-hosted common scripts is to avoid sharing visitor statistics with the CDN. Believe it or not, this is a valid concern for many businesses.
When it's cached, your browser won't make an HTTP request to the CDN. That's the entire point.
They can do some VERY ROUGH back-of-the-envelope calculations, based on cache-expiry headers and request counts, to estimate how many new people you're bringing to jQuery, but not much else. Dan Kaminsky demonstrated this earlier with his DNS/TTL cache-sniffing tricks.
And by the time you're large enough to have an impact, your audience will be large enough to justify hosting jQuery at your own URL.
I'd recommend reversing the workflow, like http://min.us does.
Accept the content first, then authenticate when users try to save. It removes a barrier to entry and, for people just testing, doesn't waste subdomains.
I'd also recommend ditching the subdomain for a subdir. Regular people don't really get it. Yes, there are major services that do it, but I know from experience that social networking has trained average folks for years to use subdirs rather than subdomains (Twitter, Facebook, MySpace).
I think you should give out editable pretty URLs made from the title of the post, such as pen.io/this-is-a-test-page, instead of sub-domains. This will also help you with SEO.
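Deriving those from the title is a one-liner; a rough sketch:

    // "This Is a Test Page!" -> "this-is-a-test-page"
    function slugify(title) {
      return title
        .toLowerCase()
        .replace(/[^a-z0-9]+/g, '-') // collapse runs of non-alphanumerics into hyphens
        .replace(/^-+|-+$/g, '');    // trim leading/trailing hyphens
    }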
Assume that people will want to build a lot of pages. Don't let the lack of sub-domains hinder this. Plus, in the future you can let them build a blog or something from this set of pages. The advantage you have is that you're letting them start off with minimum resistance.
You can have this-is-a-test.pen.io, plus with the current model you can also have this-is-a-test.pen.io/page/1 and so on. So people can actually create more pages.
Would be cool to add a group-by-hashtag function.
Say I choose the page name for my first entry to be Test123, and I also add the hashtag #StartupPosts.
Later I make an entry called BlahBlah1 and also tag it #StartupPosts, and it groups the two together. Then users can search by hashtag to find posts.
Also, common hashtags would let people search content across multiple users. For example, #Religion would have a bunch of religion-based posts from different users.
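A sketch of what the grouping could look like on the backend (all names made up):

    // Hypothetical inverted index: hashtag -> page names.
    const pagesByTag = {};

    function tagPage(pageName, tags) {
      for (const tag of tags) {
        (pagesByTag[tag] = pagesByTag[tag] || []).push(pageName);
      }
    }

    tagPage('Test123', ['#StartupPosts']);
    tagPage('BlahBlah1', ['#StartupPosts']);
    // pagesByTag['#StartupPosts'] -> ['Test123', 'BlahBlah1']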
It's probable that Firefox is a little "smarter" about content encodings than to simply rely on what the document claims. The site is being served as text/html, so despite the XML declaration at the top, I believe the spec says to fall back on ISO-8859-1 in situations of uncertainty, and that's also the default charset for text/html.
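For what it's worth, the usual fix is to declare UTF-8 explicitly, since the HTTP header wins over anything in the document. In the response header:

    Content-Type: text/html; charset=utf-8

and, as a belt-and-braces measure, in the markup:

    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />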
I like this; however, I'd rather it be something I could deploy privately on my own web hosting. I've been looking for a simple CMS where I write plain text files and something like this generates all the pagination, layout, and so forth.
I could write one, obviously, but I was hoping to find something that does the layout like pen.io does. Jekyll was close when I looked at it. Maybe I should revisit it.
Would be cool to have a link to a help page. I accidentally deleted the intro text that comes as the default page content and couldn't figure out how to make a new page.
Ugh... input elements are the worst when it comes to compatibility across different browsers and operating systems. I've racked my brain for hours working out various techniques to get a simple newsletter signup form working right across the majority of my visitors' platforms. I think anyone who has dealt with that can feel for the guy. ;)
http://cookie.pen.io
I just stole your password. :)