For most people, the URLs these jobs are hitting will be calling code that they themselves have written, is that right?
If so, since I'm writing a bunch of code to do work, the marginal cost of dropping a call to that code into cron is awfully low. I'm not sure why I'd bother creating another username and password in an external system, much less pay for it, to provide such a small amount of incremental value.
I'd understand if the idea was to offer the power of cron to the non-techie laymen, but I'm not getting the sense that's the target.
The "you don't have to worry about [foo] because we manage it for you" value proposition only makes sense when foo is hard to manage. And, for techies who can edit a crontab in 3.2 seconds, this might even be a step backwards in terms of productivity.
You're assuming you have access to cron on the machine that code is running on. Lots and lots of websites are running on shared hosting, where there's no such thing as cron access.
This is the perfect example of something that could be run on an always-connected, dual-core Atom machine (~$200) sitting in one's office. I don't understand the insistence on using a paid monthly service for something so simple. My crappy consumer router at home runs cron jobs.
Well, the problem with that is: what happens when the Atom machine dies, or the ISP cuts you off? People outsource services not because they can't build a cheap version themselves but because it can be a hassle.
Ok. What happens when the 3rd party service is acquired and shuts down, or shuts down due to lack of funds? I've got several machines that have been running non-stop for 3-5 years, and I've had several services I've relied on that have shut down after being acquired, so there seems to be a need for a backup solution either way. People who derive their income from providing 3rd party services seem to have a bias in favor of 3rd party services for everything.
Edit: Why am I not surprised to learn that you work for a hosting provider? I'm not trying to pick on you, I've just noticed a trend that makes little sense to me.
I'm not the person you're replying to, but someone that disagrees with you.
I run multiple servers; at this minute there are 8 web servers online and 1 MySQL server. Each server has a different purpose; some are long term and some are short term (taken offline after their purpose is served). Having to manage the cron jobs running on each server is a pain. Having 8 servers all running different things is fine if you have the time and want to save money, but I'd rather pay $20 a month and outsource it to someone else. It's the same reason I use Postmark (http://postmarkapp.com) for my email: I could manage email myself, but I value my time. Same deal here. Same reason I use Google Apps for personal email, and same reason I use Linode for servers instead of buying hardware and colocating.
Time is my biggest constraint not money, so spending an hour to save $20 isn't worth it.
>Having to manage the cron jobs running on each server is a pain
Agreed, that's why you would use a single, cheap Atom box sitting under your desk managing all of your cron jobs.
>spending an hour to save $20 isn't worth it
This isn't saving $20; it's saving $240 per year, so having your own hardware is break-even for year one. Year two is essentially free.
>it's the same reason I use Postmark [...] for my email
Email is an entirely different beast, dealing with lots of config files, blacklists, and a kludge of MTAs/MUAs. I completely understand outsourcing that piece of operations.
>instead of buying hardware and colocating
There is no reason such operations need to be run from a datacenter/colo. Any office/business connection should be adequate to run cron jobs.
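The break-even claim above is simple arithmetic; a quick sketch, using only the figures quoted in this thread (a ~$200 Atom box vs. a $20/month hosted service):

```javascript
// Rough break-even comparison between a one-time hardware purchase and a
// recurring hosted fee. Figures are the ones quoted in the thread above,
// not real pricing for any particular service.
function breakEvenMonths(hardwareCost, monthlyFee) {
  return Math.ceil(hardwareCost / monthlyFee);
}

const atomBox = 200; // ~$200 dual-core Atom machine, one-time
const hosted = 20;   // $20/month hosted service, recurring

console.log(breakEvenMonths(atomBox, hosted)); // 10 months to break even
console.log(hosted * 12);                      // 240 dollars/year ongoing
```

Of course this ignores the electricity, bandwidth, and admin time the hardware route costs, which is the other side of the argument in this thread.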
The only reason I'm putting these thoughts out there is because I want to understand why things do not work out the way that makes sense to me. When people just down-vote, I can't understand what I am getting wrong. Please help explain what I'm missing, it would help a lot, thanks.
Nothing to do with work. I've been "in technology" for my entire life and professionally for the last 18 years. I have learned that whilst homebrew scratches an itch, sometimes there are people who do things better. I've had companies go bankrupt on me, and that's why I have contingency plans in place to remedy things, but I sometimes like the fact that I can pay someone to do a specific thing for me better than I could do it myself (or at least as well). I'm past the point in my life where it seems useful to spend my time doing something because I can; I'd much rather spend that time on something else (other work, family, friends, hobbies) if there's a viable way that makes good business sense.
The reason, at least in my own personal experience, is that you really don't know what you don't know.
If there was a place to learn everything there is to know about hosting a site yourself, then that would help. Even better if it was easily accessible, trustworthy, and simple to understand. And more so if it displayed proof that you only need minutes to diagnose and fix the most common problems.
An API is more flexible. For example, the code being called could itself adjust the job parameters to fit its needs dynamically. Or you could have a pre-packaged solution that the user can install on a shared hosting server that automatically sets up the cron job.
But from what I can tell there is only an API and no simple interface, which is anything but practical. The logical step would be to build an interface first, then develop an API.
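To make the "code adjusts its own job parameters" point concrete, here is a minimal sketch. The endpoint path, payload shape, and job ID are all assumptions for illustration, not the service's actual API:

```javascript
// Hypothetical sketch: code invoked by a cron job builds an API request to
// reschedule itself. Endpoint URL and payload fields are made up here.
function buildRescheduleRequest(jobId, cronExpression) {
  return {
    method: 'PUT',
    url: `https://cron-service.example/api/jobs/${jobId}`, // assumed endpoint
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ schedule: cronExpression }),
  };
}

// e.g. after a heavy run, the job backs itself off from every 5 minutes
// to hourly by updating its own schedule:
const req = buildRescheduleRequest('job-42', '0 * * * *');
```

The point is that with an API, this kind of self-adjustment is a one-line call from inside the job's own code; a pure web interface can't offer that.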
I'm a big believer in choosing the right vocabulary. Half of the fogginess in communicating ideas comes from bad vocabulary, and matching that vocabulary to your user base.
For example, "signal" is a great word that describes the behavior. "Delegate" is also decent. "Dictionary" is not a good word. If you don't know that a dictionary is a key value store, it's not very obvious. Same with "hash"; unless you've been educated in data structures, a hash is either a mathematical function or a cannabis product.
Reason I bring it up: We don't manipulate "crons". We manipulate "jobs" in the crontab at /etc/crontab (short for cron table, I believe).
It's an exceptionally tiny nitpick of a really good idea. So keep up the good work and feel free to ignore my pedantry.
Yes! I currently have a server run specifically for managing my cron tasks, and I would love to outsource it to a reliable company. Not sure about pricing, but I'm paying $20/m for the server I rent (Linode), so if you could guarantee reliability and it meant I didn't have to manage anything myself, I'd be happy to pay around that or more.
I think you could arguably charge based on frequency: weekly job -> free, daily job -> 10c a month, hourly -> 50c a month; more often than that, pick a price. Definitely useful, but you're also going to need a way to make it a distributed service, because people aren't going to want it to stop working because your single Amazon instance has tanked. The number of businesses who have problems because timed jobs don't kick off is insane, and this might do the trick, but you need to avoid moving that pain point from their own systems to you.
One thing I love about Mailgun is that they keep retrying POSTs to our URL if it doesn't respond with a 200 status code. It adds a nice level of reliability to our email gateways.
I think you could add that here, and it might be a good value add. Depending on pricing, I could be convinced.
Definitely a good idea! The only thing that would make me hesitant to use this service is that you are at the mercy of any kind of network outage between your servers and mine. If I have a DNS outage, or a network node on the route is down when the request is made, then my cron will not get executed. A super great feature would be the ability to specify a retry flag for individual jobs: if you attempt to call a cron endpoint on my server unsuccessfully, your service would keep trying every couple of minutes until successful (or until the next scheduled execution occurs). Otherwise, good work and good luck - this surely solves a pain point many cloud devs have!
Would creating 60 jobs polling the same URL at 1 min intervals translate to the site getting polled every second? Or is there batching/coalescing of the same-URL jobs?
This is actually why I took so long to launch the site. I've used the prototype version of this for a long time. The app is written in node so it is incredibly fast, like destroy-the-internet fast. That is why email confirmations are required and accounts are limited to 5 crons each. I will add a captcha on account confirmation shortly as well.
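One way to address the question above about 60 jobs hammering one URL is to coalesce same-URL jobs that come due in the same tick. A sketch of that idea (whether the service actually does this is not stated):

```javascript
// Sketch: given the list of jobs due in the current scheduler tick, fire
// each distinct URL only once instead of once per job. Illustrative only.
function coalesceDueJobs(jobs) {
  const seen = new Set();
  return jobs.filter((job) => {
    if (seen.has(job.url)) return false; // duplicate URL this tick: drop it
    seen.add(job.url);
    return true;
  });
}

// 60 one-minute jobs on the same URL collapse to a single request per tick:
const due = [{ url: 'http://foo.com/task' }, { url: 'http://foo.com/task' }];
coalesceDueJobs(due);
```

The trade-off is that coalescing changes semantics for users who deliberately want sub-minute polling, so it probably belongs behind a per-account setting rather than being silently on.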
Alternatively you could get the user to validate ownership of a domain, i.e. if you want to run jobs at http://foo.com/something, tell them to create http://foo.com/cronio.txt with specific contents.
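The ownership check suggested above is straightforward to sketch. `fetchBody` here is a stand-in for the real HTTP GET, and the token scheme is an assumption:

```javascript
// Sketch: verify a user controls a domain by having them publish a
// per-account token at http://<domain>/cronio.txt, then fetching it and
// comparing. `fetchBody(url)` stands in for a real HTTP GET returning the
// response body as a string.
async function verifyDomain(fetchBody, domain, expectedToken) {
  try {
    const body = await fetchBody(`http://${domain}/cronio.txt`);
    return body.trim() === expectedToken;
  } catch (err) {
    return false; // unreachable host or missing file counts as unverified
  }
}
```

This mirrors how services like Google Search Console verify site ownership, and it would let the service lift the 5-cron cap for verified domains without opening the door to abuse.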
This would be great, there are services like Heroku that provide nearly "free hosting" but have horrible cron job support unless you pay like 10 bucks a month for it. Be nice to plug this into a nice small free app.
Consider this feature which I've thought about building myself at times. With the cron request a token/guid/hash can be passed along with the job which is unique to the request. Then allowing for it to be passed back to a listener api call on your side to simply log that it was complete for user reference.
I'd pay for that feature hands down for crons which I want to ensure run consistently and see a simple chart or data of where the process possibly broke down.
With that a notification after a rule trips, similar to pingdom for downtime.
Interesting idea... however, in my case I keep most of my crons in a directory not accessible through a browser, so that someone can't accidentally come across one and run it manually. Am I the only one who does this? I also have crons that perform maintenance/clean-up and backups on files/databases (either daily or weekly), and it sounds like this service wouldn't work in that case, unless the scripts were changed to a server-side language like PHP and then put in a publicly accessible directory.
I think it also needs to store the output from the page request. If it is being used to centralise scheduled tasks, you would also want to centralise the output from those tasks.
I've been using webcron.org for a few projects; it works well enough, but my most important criterion is price for something this simple. It's really cheap, $0.00014 per request.
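At that quoted per-request rate, monthly cost is just request count times the rate; a quick sketch of what common schedules would cost:

```javascript
// Cost arithmetic for a per-request pricing model, using the $0.00014/request
// rate quoted above and a 30-day month.
function monthlyCost(requestsPerDay, perRequest = 0.00014) {
  return requestsPerDay * 30 * perRequest;
}

console.log(monthlyCost(1));       // daily job:        ~$0.004/month
console.log(monthlyCost(24));      // hourly job:       ~$0.10/month
console.log(monthlyCost(24 * 60)); // every-minute job: ~$6.05/month
```

So per-request pricing is nearly free for daily jobs but climbs quickly for minute-level schedules, which is worth keeping in mind when comparing against the flat-rate figures elsewhere in this thread.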
In practice, hopefully you would never have to use curl with the full service. I want to make an npm command-line app to keep things simple from the command line. Another use case is libraries for different languages to integrate into your application.
I'd say you accidentally clicked the link twice, or maybe your browser made two requests.
I've used web-based cron services in the past for projects on shared hosting that doesn't offer decent cron support. This would've been a godsend; most are downright terrible or ill-maintained.
I think I'd pay for the convenience and reliability, but I hope you'd have a small pricing option. I'd use <5 jobs at any one time, and probably wouldn't want to pay more than $3/month.
I particularly hate the existing web cron services, like webbasedcron.com and others. They're poorly designed, poorly run, and poorly supported. Just looking at most of their sites you can tell they suck.
I'd pay at least $30 / month for an industrial grade web cron service (with 20 or 30 cron jobs, that can be run per minute, with variable failure thresholds), with a decent interface.
I've thought about building one, but I can't find the time to throw at another project. It'd be great if someone would do it.
Since I'm doing something similar in an enterprise environment, I suggest you add, to each cron, a date blacklist: a list of dates on which you do not want the job to be executed (Christmas, Easter, etc.). Kudos for the clean documentation.