Pirate Weather: A free, open, and documented forecast API (pirateweather.net)
1149 points by calebegg on Jan 10, 2023 | hide | past | favorite | 195 comments



Can confirm pirateweather.net's forecast API works as a drop-in replacement for DarkSky's. I was able to fix a soon-to-be-broken DarkSky BitBar script just by swapping in Pirate Weather's URL and API key, in under 5 minutes.
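For anyone wondering what the swap looks like: both services use the same `forecast/[key]/[lat],[lon]` URL shape, so typically only the base URL and key change. A minimal sketch with a placeholder key:

```python
# Dark Sky and Pirate Weather share the same URL shape, so a client
# usually only needs its base URL swapped:
#   https://api.darksky.net/forecast/KEY/LAT,LON        (defunct)
#   https://api.pirateweather.net/forecast/KEY/LAT,LON  (drop-in)

PIRATE_BASE = "https://api.pirateweather.net/forecast"

def forecast_url(api_key: str, lat: float, lon: float) -> str:
    """Build a Pirate Weather request URL (key here is a placeholder)."""
    return f"{PIRATE_BASE}/{api_key}/{lat},{lon}"
```

The JSON that comes back keeps the Dark Sky field names (`currently`, `hourly`, `daily`), which is why old scripts keep working unmodified.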


Thanks to your comment, I also updated an app that had stopped working since the DarkSky shutdown, and likewise had it going in 5 minutes.


> I was able to fix a soon to be broken DarkSky bitbar script

Please fork it for us lazy ones



I assume there's no recourse for iOS users?


If you are using the current version of iOS, the new weather app rocks in terms of functionality (IMHO better than the previous version of Dark Sky).

If you are on an old version of iOS and mourn the loss of the DarkSky app, I personally like MyRadar - it has some very nice features and I used it in tandem with DarkSky in the past.

If you are writing an iOS app, you already have an Apple developer account, which allows for 500k WeatherKit calls/month.

If you are writing an iOS app and want to use the Dark Sky API rather than WeatherKit, Pirate Weather should also be a drop-in replacement.


The new iOS Weather app certainly has lots of functionality, lots of data, but I mourn the clean, streamlined Dark Sky interface.

Plus the temperature map, unlike the radar map, no longer allows you to change to a future date/time to see how the temperature changes are going to progress across a larger region, which I always found interesting.

Very unfortunate.


Also the live map just looks weird and interpolated now.


I've switched to Carrot and paid to enable "layouts"; they have a very Dark Sky-esque one.


Having just downloaded and poked at it, it looks very nice and familiar for those who want a Dark Sky type interface. The "oh, that's a feature I'd like" comes at a bit of "that's behind an IAP", but I suspect I'll be adding it to my weather section.

Thank you for introducing me to it.


I like Carrot also - it has a lot of options to play with if you subscribe. They run sales, especially year-end sales, if you can wait 11 months. You can also change the weather source/API - Foreca, DarkSky, and Apple Weather - so maybe this Pirate one will be in there as well. I did not like the free version. I LOVED Dark Sky; it was a point of contention with the wife, as AccuWeather was more accurate for planning. Peace.


> the new weather app rocks in terms of functionality (IMHO better than the previous version of Dark Sky)

While I agree that functionally it offers a lot, the interface is around 1000 times worse than the Dark Sky app. For example, to see the “feels like” temperature over the next few hours in Dark Sky, it used to show up immediately on load for the current time, and with a single tap to see it over the next 24 hours. On the new Apple Weather app, you have to first tap on the current day, then tap on the drop down on the upper right, then tap on “Feels Like”, then tap on the graph itself and drag to the desired time.

It is absolutely incredible to me that something so simple went from taking one step to taking five steps. I don’t know how any user interface designer can justify it.


UX note: 1 step = 200 units of worse


iOS Weather is pretty nice. But I can't quite love it, because it frequently tells my wife and me different values for the current temperature, all while proclaiming to give data for the same small city. I can manually program the city in and get it to do the same thing. Hard to trust it.

Also, it's almost always wrong with rain predictions. But so was Dark Sky, so that's fair.


Yup, it’s pretty bad - I kept comparing it to the NWS mobile site, and the iOS weather app was frequently wrong vs. the NWS and real outcomes.


> If you are using the current version of iOS, the new weather app rocks in terms of functionality (IMHO better than the previous version of Dark Sky).

Tip for anyone coming from Dark Sky: open the Weather app, scroll to the bottom, tap "Manage Notifications", and turn on "Next-Hour Precipitation" and "Severe Weather". For me at least, these weren't turned on by default.


I mean, it doesn’t even have precipitation notifications or a proper precipitation graph. Not exactly better.


Notifications - at the bottom of the screen "Manage Notifications". Enable "Next-Hour Precipitation". It is off by default since it leaks data back to Apple. https://imgur.com/CwlFnym

The precipitation graph you get when you expand the precipitation tile. You can then scroll through the days. Here's a screenshot from my phone for a couple days from now, when I'm due to get some snow. https://imgur.com/FnWdQ4b


I don’t have that graph.

Probably yet another feature locked to the US. At this point it is ridiculous that iPhones outside the US don’t come with a 5% ‘reduced service’ discount.


Any particular reason you couldn’t do the same? Just replace URLs.


I think they're asking about continuing to use the DarkSky app. In which case, no. The new Weather app in iOS 16 does add some DarkSky features, but I don't think it's as good.


Totally read “users” as “developers.” Thanks for straightening out my brain :)


Do you run BitBar on iOS?


> Why "PirateWeather"? I've always thought that the HRRR model was pronounced the same way as the classic pirate "ARRR". Also, there is one company out there that thinks APIs can be copyrighted, which might apply here.

This answers the first question I had upon seeing the name, which was: is it free, open, documented, and legal? Based on the above, the answer seems to be "probably".

Very commendable effort, and I hope the project can last. It seems to be very difficult to maintain a free and reliable weather API so hopefully the dev is not biting off more than he can chew.


> is it free, open, documented and legal?

I think it says something that we live in a world where NOAA data could be seen as an underground or less than legal means of getting the weather.

Perhaps AccuWeather has been successful in their campaign to keep free weather data as obscure as possible.


I think it's more about it having "pirate" in the name that makes one question its legality.


Perhaps the author has a peg leg, a parrot, or a penchant for API plunder.


or they charge $3.14k/hr


great, now that song is stuck in my head all day long. well played.


I wish the NOAA API was reliable. I'd pay for it if they could make it reliable (and they could probably make it reliable if they charged for it).


I'm the dev behind this, so it's always great to hear things like this! I really struggled to come up with a name. My first thought was "Bright Ground" (opposite of Dark Sky), but that seemed a little too on the nose. Luckily, the legal aspect of this (I'm in the clear!) was pretty well settled after the final Oracle v. Google case, but at the time it was a big enough deal that it seemed relevant, and the HRRR model was another plus. I should update that README though, since it's now very definitely legal!


Just my 2 cents: it's on my todo list to look for a weather provider, but the "pirate" in the name at first made me discard this one. I guess The Pirate Bay made the word "pirate" sound less than legal - not something I'd want for a business. Happy I did open it, and I have no problem working with a pirate brand, but I wanted to note that it might be negative branding. Personally I quite like the Bright Ground name joke (and I don't think it would cause legal trouble, but IANAL).


Maybe use a synonym: 'Corsair Weather'? 'Privateer Weather'?


Lol I named my version "Bright Earth" as an opposition to dark sky (brightearth.app)


Error after receiving permission to get location:

Firefox:

  brightearth.app:6:4322  
    XML Parsing Error: mismatched tag. Expected: </input>.
    Location: https://brightearth.app/
    Line Number 6, Column 4322: 2

  fetchWeather.ts:31:31
    Uncaught (in promise) TypeError: l.querySelector(...) is null
      g fetchWeather.ts:31
Safari:

  App.tsx:46
    [Error] Unhandled Promise Rejection: TypeError: null is not an object
    (evaluating 'l.querySelector("creation-date").textContent')
      (anonymous function) (index.2994085b.js:139:2367)
      asyncFunctionResume
      (anonymous function)
      promiseReactionJobWithoutPromise
      promiseReactionJob


Sometimes the NOAA APIs don't return data according to the usual structure. I can't consistently reproduce it. If you can please let me know!


I’m out of the loop. Why would this not be legal? Are you using Oracle technology?


> built as a compatible alternative to the Dark Sky API

OP is talking about the copyrightability of APIs (which they are clearly exposed to, as they built a drop-in replacement). The relevant case: https://en.wikipedia.org/wiki/Google_LLC_v._Oracle_America,_....


Oracle argued that interface calls were copyrightable. They lost anyway.


I agree, PirateWeather seems like a misbrand here. When I read it, I thought it was stealing weather data or something. On the web, the term "pirate" generally doesn't mean good things. The name almost implies that it is illegal.

I'm imagining designing a software product around this and presenting it to a C-Level, explaining that we use "PirateWeather" and I think I'm going to get grilled with lots of questions and concerns based on the name alone.

This is a good service and should be "branded" with a better name. Maybe a play on the whole DarkSky name like LightSky or "Sunset" which works exceptionally well since DarkSky was sunset by Apple. Maybe StarrySky, LateSky, NewSky.

I am usually someone who says that names don't matter as much as people think they do, but PirateWeather just seems like a huge hit in the wrong direction. But the product is solid so maybe it can survive despite the name.


I'm the dev for this, so I can shed some light on this! Weather Underground was a pretty out-there choice - I guess the kind of people who like putting together weather APIs aren't great at naming things. I considered "Bright Ground", and still have the domain name for it, so maybe it'd be worth spinning up another API endpoint/branding that uses that and has a more commercial focus, keeping Pirate Weather as the free and open-source branding.


FWIW I love the name Pirate Weather. Thanks for the work!


Keep the name - this is not vc bait


I like the name pirate weather a lot.


wunderground.com has existed for a long time. Its defining feature is that it can crowdsource weather data.


That used to be the defining feature. IBM has killed the community and put a lot of restrictions on the data.


I wonder if they put the codebase into ClearCase too.


Rrr-mate, Rmate, Are-matey,

And variants thereof perhaps


Weather Underground would be hilarious.


Not only is it already a thing, IBM of all people own them now.


We took the real Weather Underground so unseriously that half of them became college professors and one of them was on Jeopardy, seems consistent.


That would actually be a great idea.


>>I'm imagining designing a software product around this and presenting it to a C-Level

More likely:

1. They will not ask.

2. If they ask, they will not care.

3. If they care, they will stop caring after a short explanation.

This seems like a non-issue, and if the biggest thing you have to worry about is some C-level with a stick up their ..... well, I think you have nothing to worry about then.

the name is fine...


Maybe most of the good products come out of places without too many C level types :)


> On the web, the term "pirate" generally doesn't mean good things.

Maybe we're on different webs, but in my mind "pirate" has good connotations. As in, a rebel, a free spirit, a fighter of oppression.


>On the web, the term "pirate" generally doesn't mean good things.

That depends on who you ask... People who are fans of, e.g., ThePirateBay probably would disagree with you.

Also, there's a shipping service called PirateShip that's totally legal (it's basically a frontend to USPS shipping labels). The website is pretty amusing with the "Arr, matey!" stuff.


I guess at least it’s a step up from the time someone was trying to brand a grassroots weather data collection effort and decided to name it after a terrorist organization.

I’m a massive fan of - and indeed contribute data to - the weather underground project, but the naming has always made me a little uncomfortable.


> I'm imagining designing a software product around this and presenting it to a C-Level

Alternatively, imagine what the world would be like if we spent less time thinking about what C-suite executives think.


Weatharrr!


> Based on the above the answer seems to be "probably".

Why do you draw that conclusion? It seemed to be making a joke about the fact that APIs are NOT copyrightable and considered fair use. The site is pretty clearly fully legal.


As an Australian, my main issue with almost all weather APIs and services is that the data is almost always inferior to the weather data provided by the Bureau of Meteorology (BOM). It would be great if there were a backend-provider concept, so that I could tell my devices (such as an iPhone) to use only BOM data for their inbuilt widgets.


It’s truly maddening as an Aussie. How many apps just ignore the BoM? I expected the Apple one not to, but it does. It’s frustrating when friends visit me and check the weather on their phones and the data in the Google and Apple apps is so bad.

I use Willy Weather, which uses BoM data. I don’t love it though. Any recommendations? I personally can’t stand the actual BoM app as an iPhone user, but I appreciate that they tried…

R.I.P Pocket Weather :(


I'm more of a desktop user, but I wasn't happy with what was out there so I made my own (but only for Melbourne): https://davidjohnstone.net/weather/melbourne . I'm tempted to turn it into a proper Australia-wide product. The BoM's data licensing fees (in the order of $5k/year) are a bit of a barrier.


That's really nice! Another commenter mentioned the Willy Weather API, which uses BOM data and has a generous free tier. Not sure if that would enable you to make it Australia-wide at a reasonable cost?

https://www.willyweather.com.au/info/api.html


BoM actually makes it quite difficult to obtain data. I looked into it once and thought it wasn't worth the trouble. Some agencies make it easier than others to obtain the raw data as it's sort of part of their mandates.


I wish they used better open licensing. The public FTP has great free products, including fire danger ratings and observations, but the licensing says "Users intending to publish Bureau data should do so as Registered Users." Registration costs a couple of grand, which is a barrier to a lot of potentially great community apps being built.


And they've made it even more difficult over the past few years. I had written a script that would scrape BoM's page and send the up-to-date conditions via MQTT, but it stopped working a few years ago. IIRC, even changing the HTTP headers/user-agent didn't get it working again. That said, I understand it's absolutely their prerogative and they have a right to make money from the data.


> they have a right to make money from the data.

I thought it was a govt service. BOM is private?


You'll be surprised that the free-for-all-no-questions-asked US system is not the one followed around the world. Australia, like other countries with significant British history, has the concept of Crown Copyrights (https://en.wikipedia.org/wiki/Crown_copyright#Australia). Whether it's the (UK) Met Office (https://www.metoffice.gov.uk/services/data/met-office-data-f...), (Australian) BoM (https://reg.bom.gov.au/business-solutions/) or (NZ) MetService (https://metraweather.com/), commercial use is paid-for. Personal use is implied to be free, but usually they interpret using their API to be commercial use.


That being said, and I agree with all points you raised, the Australian Gov has been fantastic over the last few years of focusing on open data, to give credit where it's due.

https://data.gov.au/search

Their datasets are great!

G-NAF for example: https://data.gov.au/dataset/ds-dga-19432f89-dc3a-4ef3-b943-5...


To be fair, from looking at the available datasets of different countries, it seems that weather data tends to be the one requiring business-level licensing. There are exceptions like Norway's Met (https://api.met.no/) and US' NOAA (too many to list), but unfortunately the usual answer for "weather data?" is "pay first".


Totally, there are full 3d scans of Launceston available. It's amazing what data you can access!


Quite correct- though the UK Met Office has a specific non-commercial API in DataPoint which is pretty good.


Oh, I didn't know about that despite checking out their website, but reviewing the licence you're correct in saying that commercial use requires payment.


I wonder if this isn't the same all over? In Sweden we also have a local metrology bureu which produces much better forecasts than most others. As I understand it they basically tweak the global models with .. sort of local "knowledge". And there are many such models, a few EU-specific and, I assume, one local for most countries.


The chapter on weather forecasting in Nate Silver's The Signal and the Noise describes forecasters doing exactly that during a visit: a manual tweak at the end, essentially based on experienced intuition, that the models couldn't yet replicate.


This makes total sense for where I live too. A crowdsourced api of these adjustments would be neat.


About 10 years ago, I dropped into our local meteorology office on my lunch break to ask about the weather for a week-long bush walk I had planned. It was amazing to see how much local knowledge the guy there could apply to the official BOM forecasts to make them more meaningful! Models are great, and have improved significantly since then, but human experience is also amazing!


Is metrology a typo, or are there countries that call it metrology instead of meteorology?

In the US metrology is a specific field dealing with measurement, often the department dealing with high precision calibrating/measuring tools in the metal machining industry.


> In Sweden we also have a local metrology bureu which produces much better forecasts than most others

Ironically Sweden’s best weather forecast comes from Norway [0]

[0] https://news.ycombinator.com/item?id=31965812


> Any recommendations? I personally can’t stand the actual BoM app as an iPhone user but I appreciate that they tried…

Have you tried the official iOS BOM app in the past few weeks? I recently switched to it from Willy Weather as it has improved a lot. Its display of the next 90 minutes of modelled rain radar is the official app's killer feature for me.

Last week it enabled me to drive 40mins to some mountain bike tracks, knowing rain models showed that the massive storm would have just passed by the time I'd get there.


I just bookmark the BOM forecast web page for my location. Does an app give anything better?


The latest version of the official app has 90 minutes of future "modelled" rain radar. I couldn't find that quickly on the website.


For my own projects I end up using the WillyWeather APIs, as they have a generous free tier and use BOM data - but yeah, I absolutely echo your complaints; I've seen Google up to 4 degrees off from BOM/my own weather station.


I can’t remember the name, but there was a weather feed that aggregated many other feeds, allowing for higher accuracy - you could take individual values from different sources or blend them as needed.

This is a timely post and comment as I’ve been thinking of solving a weather problem I have with that feed. Hoping the hn hive brain might have seen the same.


I was really hoping https://merrysky.net/ had a graphical forecast similar to weather underground's 10 day. I've yet to find anything else that quickly shows everything you'd want to know in a single image. Even the mouseover timeline is perfection. It is so good and seemingly exclusive to WU that I sometimes wonder if they hold a patent for it.


The _presentation_ is impressive, but the forecast is often the raw output from various models, and thus not entirely reliable. For example, I live in a valley that most forecast models smooth over, giving more snow/cold than we ever get.


I have a similar issue living on a hill with the windy.com forecasts. Have you had success finding any forecast source/app that’s accurate for your microclimate? Everyone always points to forecastadvisor.com, but that’s always pulling from some airport 20 miles away, which is totally irrelevant for me.


Related tangent: Windy (iOS app) does a remarkably good job providing multiple data sources and terrific dataviz.


Is it the app named "Windy.com", "Windy.app", or something else?

Edit: I suspect it's the Windy.app one. My initial impression is it looks quite nice. Available on both iOS and Android.


yes, windy.app


ubiquitous among boaters (incl myself)


also on Android. Fantastic for tracking Hurricanes


I agree with you, but I've also found the https://windy.com timeline along the bottom to be similarly informative, with ECMWF data to boot.


WeatherStrip does this beautifully. https://www.weatherstrip.app/ Weathergraph is also nice, and has a new optional Dark Sky skin. https://weathergraph.app/


"... similar to weather underground's 10 day ..."

Came here, like I come to every weather API/tool discussion, to ask the same thing ... I would really like a less spammy, less bloated 10-day forecast a la WU.

This comes close, however - there's a nice 7-day lookahead ... is it possible that WU is just fudging days 8-9-10 and no real data is available beyond 7 days?


Persistence and linear regression are very common methods used to extend forecasts out in computationally cheap ways. Most forecast models have really awful validation statistics after about 48-60 hours out—depending on initial conditions, location, and a few other factors—so in some sense the forecast after about 3 days out isn't ever going to be very good so it's perfectly valid to use those methods. I would not be at all surprised if that's what Weather Underground does.
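The persistence and regression tricks described above are simple enough to sketch in a few lines (illustrative only - no claim that this is what Weather Underground actually runs):

```python
def extend_forecast(values, horizon, method="persistence"):
    """Extend a forecast series by `horizon` steps using a cheap method.

    persistence: repeat the last known value.
    linear: extrapolate an ordinary-least-squares trend line.
    """
    n = len(values)
    if method == "persistence":
        return [values[-1]] * horizon
    # OLS fit of value against time index, then extrapolate.
    xbar = (n - 1) / 2
    ybar = sum(values) / n
    cov = sum((i - xbar) * (v - ybar) for i, v in enumerate(values))
    var = sum((i - xbar) ** 2 for i in range(n))
    slope = cov / var
    return [ybar + slope * (n + k - xbar) for k in range(horizon)]
```

Given `[10, 12, 14]`, persistence extends with `[14, 14]` while the linear method continues the +2/step trend - which is exactly why both look fine for a day or two and drift badly after that.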

Another method that's occasionally used is to just fill in with TMY (Typical Meteorological Year) data. Lots of those data sets are freely available, or if not, are very inexpensive to calculate if station data is available.

If you're looking for a minimally spammy, information dense forecast and you're in the US, it's pretty hard to beat weather.gov. (And make sure to occasionally read the zone and regional forecast discussion texts, too. They're really interesting and often educational!)


Ten years ago, beyond-a-week forecasts underperformed long-term climatic averages; not sure if that has changed yet.



On Android, there is an awesome app called Flowx. Probably the best weather app I have used.


Thanks for the recommendation, I'll check out Flowx.

A while back I installed quite a few weather apps on Android, at the time, I felt wX and QuickWeather (from F-Droid) were the best. Maybe Flowx will be even better :-)


Hi, I am making an iOS app that could make you happy:

Link: https://weathergraph.app

Screenshots: https://impresskit.net/6430c7f0-b34b-418f-9824-f386f939be9a/...

What do you think?


You can also try the Glance Weather widget on Android.


I found this old blog article describing at a high level how Dark Sky works... about 12 years old now. I always loved this app and was kind of sad when Apple acquired them... but to have Apple come in and just buy your app out is probably a great feeling for the founders :)

Anyhow, here's the url:

https://jackadam.github.io/2011/how-dark-sky-works


This is fascinating, thank you for sharing! What is even more amazing about Dark Sky is that they did this in 2011, when a lot of the tools I have access to now didn't exist. For Pirate Weather, I avoid working with radar data at all, instead using the 15-minute forecast that the HRRR model provides.


Why avoid radar data?


I really like this free forecast API too: https://developer.yr.no/


One of the things that I've found on yr.no that I haven't seen on any other weather app is the classification of cloud level.

For example: https://www.yr.no/en/details/table/2-4407066/United%20States...

You will see not only the cloud cover as an overall percentage, but also the different levels. A 50 at the middle level is very different from a 50 at the low level in terms of what you can expect that day (for photographs).


How does this differ from NOAA weather.gov api?

https://www.weather.gov/documentation/services-web-api

Or is this a friendlier overlay of their interface?


Good question! It's the same underlying models, but there are three key differences:

1. Pirate Weather returns data following the Dark Sky syntax, as opposed to the custom NWS format.

2. Pirate Weather has global coverage; the NWS API is only for the US.

3. The NWS uses human forecasters to come up with its forecasts, compared to Pirate Weather's use of raw model results.
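To make the syntax difference concrete, here's a rough illustration of the kind of mapping involved - translating one period from the weather.gov forecast response into Dark Sky-style keys. The NWS field names (`startTime`, `shortForecast`, `temperature`) are from the public weather.gov schema; this is a sketch, not Pirate Weather's actual code:

```python
from datetime import datetime

def nws_period_to_darksky(period: dict) -> dict:
    # NWS /gridpoints/.../forecast returns ISO-8601 start times and a
    # temperature per "period"; Dark Sky used Unix timestamps and flat
    # `time`/`summary`/`temperature` keys.
    return {
        "time": int(datetime.fromisoformat(period["startTime"]).timestamp()),
        "summary": period["shortForecast"],
        "temperature": period["temperature"],
    }
```

A full compatibility layer also has to synthesize fields the NWS doesn't provide at all (e.g. Dark Sky's minute-by-minute precipitation), which is where the model data comes in.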


Only on HN would the NWS format be described as 'custom'.


Only in the US would you assume your govt's format is 'default'.


I think the implication was more that any government meteorological office's format is the default over a small private company.

Especially since this API is repackaging output from the NWS's GFS model output.


Yeah. And the follow-up snark comment about it being an American thing. It is literally the file format specified for this type of data, globally, as outlined by the World Meteorological Organization (WMO) decades ago. It's widely supported and parseable. Making a web API that parses GRIB data is trivial; it's not some esoteric thing. You can open the files in Python on any modern computer+OS.
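For what it's worth, the container itself is simple to inspect even without the usual libraries: GRIB's Indicator Section starts with the ASCII magic "GRIB" and carries the edition number in octet 8. A toy sketch (real parsing is normally done with ecCodes wrappers such as pygrib or cfgrib):

```python
def grib_edition(header: bytes) -> int:
    """Peek at the start of a GRIB file's Indicator Section.

    Octets 1-4 are the magic "GRIB"; octet 8 is the edition number
    (1 or 2) in both GRIB1 and GRIB2.
    """
    if header[:4] != b"GRIB":
        raise ValueError("not a GRIB file")
    return header[7]
```

Everything after that - the packed data sections - is where the real complexity lives, but none of it is machine-specific or esoteric.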


So what is different or better in the Dark Sky syntax?


I know literally nothing about weather data standardization, but if the format was — for the sake of argument — unique to the US for example and another more popular format was used either internationally or de-facto by third parties, I’d say it qualifies as a “custom” format.


Quite the opposite. It's literally the global format that nearly everyone in the world provides data in.


Under “Weather Sources”

> All weather data comes from the AWS open data program https://registry.opendata.aws/collab/noaa/. This is a fantastic program, since it is more reliable than the NOAA distribution, and means there are no data transfer charges!


A friendly overlay is definitely valuable. The NOAA APIs are super confusing and badly documented.


It's going to be a tangential comment but I work in science research that's adjacent to weather forecasting and I find the political/technical jockeying that is happening with forecasting to be fascinating. It's a nexus of capitalism, federal government spending, politics, and technology that has very real implications for individual Americans.

In summary: horrible oversight by the federal govt (read, Congress) of our technical/scientific forecasting resources means that our forecasting ability is extremely fragmented and poorly organized. This has led to a lot of companies being essentially resellers of public data. These companies claim to create a lot of value-added products ('cleaner APIs', 'minutecasts', etc.) that are either scientifically dubious or technically simple, and then walk away with huge profits based on being a portal to government data.

It's so American it is almost laughable, all while the European ECMWF eats our lunch in terms of accuracy even for the CONUS. I've discussed this on technical internet forums often enough that I can practically already write the replies to my own comment. "What's the problem with that?" etc et al. But the reality of it is that it's emblematic of how politically broken the US is, in particular with regards to the agencies in charge of scientific products and funding. Not to mention the concrete problems with the forecast products themselves.

Anyway. Good luck pirate weather and godspeed. Information was meant to be free and open, especially the forecast. It's such a laughably simple problem that could/should be so easily solved but, alas, there is money to be made!


100%.

There is enormous data available at https://www.weather.gov/ at no charge, including hourly and weekly forecasts, spot forecasts, radar (multiple layers, with/without animation) and satellite (multiple layers, with/without animation), plus storm watches, hurricane info, historical data, climate data…

I guess it's nice that apps can do things like advise me that it might start raining in a few minutes, but often by the time I see those alerts, the water on my head has alerted me anyway.

All other weather apps, it seems to me, are for little more than tracking my location and serving me relevant ads.


The most accurate and concise weather forecasts come from weather.gov, blows my mind that people don't know this.


As one who wrote (over 10 years ago, and still improving it) and sells a weather app targeted at storm chasers, weather enthusiasts, and spotters... I find the NWS data to be fragmented, often based on ancient computing standards (look at how NEXRAD data is formatted - it's binary for a 16-bit computer), and very hard to find documentation for.

Companies can add value by providing documented, consistent APIs for data that, yes, is free from the government. NWS does not face the market incentives of a company, and it shows.

And yes, a bunch of companies take that data and hype it, but that's not particularly new - it precedes apps. I've long seen claims of forecast accuracy from private companies that are, well, absurd, given the limits chaos (and other issues) place on forecasting very far into the future.

The Big Data Project is a substantial improvement in terms of access, but the data itself is still in the legacy formats. Also, some data is not well suited for mobile access - it's in giant binary blobs (NEXRAD Level II) or requires multiple HTTP operations to acquire.

But... at least it's available and free (except for lightning).

I'm not talking about model data - I let others worry about that, and for personal use, I use sites like the excellent one from College of DuPage ( https://weather.cod.edu/forecast/ ). I've watched friends in the research and operational community describe their frustration at the decision process that went into GFS modernization, and how it was frustrating to see ECMWF beat it out in forecast skill (I'm not up to date on where that stands now).


As the dev who set up the Pirate Weather source, I couldn't agree more. 90% of the challenge here was getting the GRIB files into a format I could quickly and easily access, since the provided format is about as difficult as it gets. I don't pretend to know more about the actual art of forecasting than NOAA, but I wanted a better way to get the data they produce out there.


Have any nerdy Congresspeople wanted to solve this ridiculous inefficiency in recent years? Seems like a natural "the Europeans kick our ass at forecasting" message that would go over well and potentially negate the lobbying from The Weather Channel and others with a vested interest in perpetuating this stupid situation.


Yes, Congress passed the Weather Research and Forecasting Innovation Act back in 2017, which not only gave a massive funding injection to model development activities, data procurement, and more, but also formalized the new next-generation community modeling initiatives, which are focused on tackling the dominance of the EC models in the 2-5 day forecast space.

The diagnosis of the problems of the American forecast modeling community here is based on flawed premises. There are three major factors which led to the ECMWF leap-frogging the US in day-ahead forecasting capability. The first is the consolidated investment in supercomputing resources; the WRFIA tackles this by earmarking a much larger allocation of funding for NOAA's next gen supercomputer, but this still pales in comparison to ECMWF investments.

The second factor is the fragmentation of the research and operational weather modeling communities due to the divergent investment from NOAA and USAF in the 90's and 2000's; USAF in conjunction with NCAR sponsored the development of the WRF model, which was widely adopted by the research community. NOAA continued investing in the GFS lineage of models. The bifurcation of these communities slowed down the ability to matriculate advances in model development to operations, and this was exacerbated by an old, closed-off approach by NOAA which made it extraordinarily difficult to run and develop the GFS on anything other than NOAA hardware.

Finally, the ECMWF went all-in on 4DVAR data assimilation in the late 90's, whereas the American community pursued a diversity of other approaches ranging from 4DVAR to ensemble Kalman filters. 4DVAR necessitates advances to core weather model software (e.g. you need to write a model's adjoint or its tangent linear in order to actually use 4DVAR), and the US's failure to adopt it led, imo, to a "double-edged sword" effect of (a) failing to provide impetus to greatly improve the US modeling software suite and supporting tools, and (b) being a worse assimilation technique unless advanced EnKF techniques are employed using very large ensembles of models (expensive).

The other problem, as others have pointed out, is that there is no accountability in the US private sector weather market. Virtually every player is re-transmitting raw model guidance or statistically post-processed forecasts using DiCast, _maybe_ with some manual tweaking of forecast fields by humans. But this is not transparent, and many companies - if we're being charitable here - are not honest about what they're actually doing to produce their forecasts. Put another way - there's a lot of BS claims out there, and it seems that investors have been more than happy to fund it over the past few years.


I disagree that my (extremely broad) diagnosis of the problems with forecasting in America is based on flawed premises. You've provided thorough, correct, and important details here but my comment was aimed at the broader HN audience, not on writing the central argument for a discussion on the historical timeline. I think what I said, "Horrible oversight by the federal govt (read, congress) of our technical/scientific forecasting resources means that our forecasting ability is extremely fragmented and poorly organized." is a very concise summary of the things you've laid out here.

As for the details in your comments, the only thing I disagree with strongly is the comment about investment in computing. While sufficient computing resources are central to good forecasting, the lack of investment by NOAA in computing (I sit at NOAA) is a red herring. ECMWF is significantly better than either of the two available American forecasts because they are just better at what they do, all around. In particular with respect to data assimilation. I've sat at meetings with ECMWF forecasters who have asked for access to my in-situ data products and their pipeline is as simple as "point us to the data please". Their data assimilation pipeline is so much more sophisticated and thorough that catching up on that alone would be a huge huge leap. Mind you, not just '4DVAR' the methodology, but literally the way that the community finds and integrates observational data.

ECMWF, the organization, is quite literally structured to strictly accomplish the goal of 'improve the forecast'. Whereas, again broadly, the American institutions are much more a conglomerate of associated researchers doing individual science projects while small teams work on specific segments of the forecast. Yes, we are attempting to fix this. No, we haven't fixed it yet.

This is not to say I don't think we should fund computing or that computing won't help. But we are quite literally 5-10 years of research behind on multiple fronts.


The thing about the computing is that it has impacted the culture surrounding NWP model development within the American modeling community. At ECMWF, there is capacity in the primary systems to support R&D, so the total cost to the community to maintain this capability is much lower than in the US where everything is fragmented. If there was greater capacity for researchers to run GFS on the limited number of systems with first-class (or any) support, it may have helped consolidate the community.

Totally acknowledge that there are other takes here. And I have a bit of skepticism about how much EPIC will really achieve and what it can do to resolve these issues. But I don't necessarily agree that the science at EC is 5-10 years ahead of the American community's. What's matriculated R2O is definitely a few years ahead of us, especially for medium-range forecasting. But the US likely still maintains a competitive edge in mesoscale/rapid refresh forecasting, and even though we've lost key developers to the private sector recently, the NBM seems (in my admittedly limited experience) to perform favorably to similar products out of ECMWF or other European national weather services.

Your point about ECMWF being fundamentally structured with the singular goal of improving the forecast is super important - I 100% agree with that, and the US has yet to do much of anything to address this.


Extremely valid points. Thanks for sharing your perspective, counters. It's much appreciated. One thing I think is fascinating, every time this comes up on a place like HN, is how detached the conversation often is from the meat-and-potatoes of forecasting. Which is to say I've seen many a googler think that weather forecasting is a simple problem and handwave the discussion away with "throw compute at the problem". It's always great when there are people around to ground the discussion.

Last point to the specifics. You're very right that the American teams have nailed mesoscale/rr forecasting. Which is, if we wanted to really divert the discussion, interesting and arguably more advanced because of its industrial applications (wind farms, etc et al).

I think your most salient point is about how the extra compute resources foster a culture of improvement on the models themselves. I am on the compute task team and it's something I argue for every day. Are you a researcher? If so, send me a message. I used to have my NOAA email in my profile but it would probably be neat to connect professionally.


As someone living in continental Europe, there's one thing I haven't found: anything remotely close to NHC-quality forecasts. Curious what you think of that institution?


A thousand times this!


If you asked me 25 years ago, "Wanna bet hackers will be excited about a weather API in the future?" ... Sink me ... I would have lost that bet.


Same here. What are the use cases that fueled this demand?


The tech diagram on AWS is insane. This is what I call vendor lock-in porn

I'm wondering how much your monthly bill is?


This is like the 5th iteration of that darn diagram, and I still can't quite get it right. The other comments are correct: the underlying setup isn't that complicated, and it wouldn't be too tricky to build it for another cloud, since all that's really happening is a Python script reading a NetCDF file. The real perk to building everything using a serverless approach was that everything scaled down to 1 user (me), and then has scaled up beautifully as more people have signed up.

The monthly bill isn't great, but it has been manageable through donations so far. I'm handling about 10 million API calls/month, so lots of Lambda invocations, but they're all thankfully very short with tiny outbound data transfer totals, and luckily inbound is free. If the costs ever get out of hand, the fallback is to throttle the request rate down, but there are still a few optimizations I can do on the AWS side that should help (ARM, here I come!).
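For readers unfamiliar with the pattern: the request-serving half of a setup like this is just a small handler function behind an API gateway. The sketch below is hypothetical (handler shape follows AWS's Python Lambda convention; the path format and response fields imitate the Dark Sky style, and are not Pirate Weather's actual code):

```python
import json

# Hypothetical sketch of a Dark Sky-style serverless handler:
# parse "lat,lon" from the request path, look up the forecast,
# and return JSON. Field names are illustrative.

def lambda_handler(event, context):
    lat_str, lon_str = event["pathParameters"]["location"].split(",")
    body = {
        "latitude": float(lat_str),
        "longitude": float(lon_str),
        "currently": {},  # would be filled from the preprocessed grid data
    }
    return {"statusCode": 200, "body": json.dumps(body)}

resp = lambda_handler({"pathParameters": {"location": "45.5,-73.6"}}, None)
```

Because each invocation is short and the response is small, this is the "tiny outbound data transfer" cost profile described above.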


Want me to take a gander at it? This is a project I’d love to see thrive.


It's rather simple, just messy to look at. It's a timer that triggers a cluster that processes raw data files and uploads the JSON to a server with a reliable file system. Then a Lambda function processes API requests to read that JSON and serve a response.

"Lock-in" is just a trade-off like any other engineering decision, same as what programming language or API schema you choose. AWS is massive and reliable, and these services are cheap and widely available so I don't see much of an issue here.


Which part is insane?

Put another way, how would you implement this without self-managing pieces like nginx/apache, rabbitmq/kafka, or mongodb on a compute instance?

The pieces of managed infra they're depending on are all pretty interchangeable from one cloud provider to another. It's not the cheapest way to solve the problem if you discount the cost of management, upgrades, etc. But if you factor those costs in, it's quite competitive.


I would go to cloud providers, which can offer me managed rabbitmq, kafka or mongodb

Certainly, I wouldn't get any vendor-specific services like Lambda or SNS (in this example)


The Lambda functions will just be running your business logic code. The only Lambda-specific part is the glue, which is easy to replace with any other cloud or self-hosted alternative.


At the end of the day, the real reason I went with Lambda to build this is that when I started, I didn't know anything about spinning up a server, building a storage array, or serving a web request. But what I did know was how to write a Python script to extract weather data from a file. The really cool thing about some of these cloud tools is that they let me ignore all of those (very difficult) problems and just focus on the data.


Exactly what the cloud is best used for imho. You can focus on your core problems and still shift to a different solution later on when that becomes more effective cost wise.

But still, when I first started in AWS we had lift-and-shifted most of our platform onto AWS, and when it came to adding new features I first looked at Lambda instead of building out on the existing platform. The speed at which I could set up the feature and start measuring and iterating on it was so much better compared to doing it on our existing platform, even though that was pretty ironed out and fully automated.


Where is your lambda_handler defined? I searched your repos but could not find it.


Easy enough to move to other clouds if it was defined in something like Terraform or even CloudFormation. You could even move out of the cloud to your own bare metal servers pretty easily, since the various components are so well defined. AWS makes the most sense, though, as stated in the docs, because a lot of the data being processed is in S3, and reading data from S3 inside of most AWS services is free.


This is a pretty standard looking diagram. It probably isn't very expensive to run.

AWS diagrams are pretty intimidating until you've built a few things with several AWS services.

The benefit is that a lot of maintenance work is taken care of for you, and your costs can be low if you don't need a lot of compute.


I am an Infrastructure Architect (aka "Cloud Architect"), so I design cloud systems like this on the daily. The "vendor lock-in" argument always makes me laugh. It's the #1 thing I hear all day long.

This diagram is actually pretty simple. It looks worse than it is. All it uses are Lambdas (serverless functions), S3 buckets (object storage), and SNS (broadcast/push queues). There appears to be one traditional server in there with EFS, which is just an elastic file system.

All of these systems have equivalents in all the major cloud providers. So if the builder of this wanted to move to GCP or Azure, they are not really locked to AWS. This can all be built in another cloud.

Now, could you do it in a day? No. Assuming they are building it with Infrastructure as Code (such as Terraform), they would need to convert the provider and change resource blocks. But this is akin to refactoring in a codebase. It's work, but it's not terribly difficult. Then they point it at their new cloud and run `terraform apply`.

There is almost no way to entirely remove vendor lock-in. The closest you could come is by designing everything yourself on bare metal servers and renting those from a cloud provider. So instead of using a managed queue system, you run some sort of messaging queue on the server. Then you host files on the server's filesystem, and you run the "lambdas" as applications on the server. But that almost causes more headaches than you save or solve for.

I look at Cloud Providers as similar to cell phone providers. I know people who live in fear of being locked into a contract with Verizon or something. But really, what are you going to do? You will always need a cell phone. The only other real choice is AT&T or maybe Sprint/TMobile. How often are you really going to switch and what are you really gaining by doing so? Energy spent worrying about being "locked in" to a cloud vendor is energy wasted. Yeah you can move from AWS to Azure or GCP. But that's about it. What do you gain by switching? Probably almost nothing. They are all pretty comparable at this point in reliability, features, and price (GCP is the slight laggard here, but not by much). If Google calls your company and offers you a huge discount to switch, you could still do it. Aside from that, there's minimal incentive to do so.

There are a few weird services that AWS has for example that might be considered "lock-in" services. This would be things like AWS Snowball or AWS Groundstation. These don't have comparable systems on other platforms. In the case of Snowball you probably have so much data on AWS that just transferring data would take months (or even years) which could be considered a form of lock-in.

tl;dr - This is a very tame arch diagram. A few lambdas, s3 buckets, and messaging queues, all of which have comparable services on all major clouds. There isn't significant vendor lock-in, this could be rebuilt fairly easily (assuming they used IaC) on any major cloud provider.


Hello

> This diagram is actually pretty simple

The diagram looks like an ad

> All it uses are Lambdas (serverless functions), S3 buckets (object storage), and SNS (broadcast/push queues)

Do you actually need all of this or do you use it because Amazon tells you to? I know for instance you cannot use Amazon SES without also using S3 and Lambda

> So if the builder of this wanted to move to GCP or Azure, they are not really locked to AWS. This can all be built in another cloud

You're saying that I cannot move to another cloud provider without my existing code becoming useless?

> Assuming they are building it with Infrastructure as Code (such as Terraform) then they would need to convert the provider and change resource blocks

What about the data pipelines and business logic?

> There is almost no way to entirely remove vendor lock-in

There is: avoiding vendor-specific APIs altogether

> Closest you could come is by designing everything yourself on bare metal servers and renting those from a cloud provider

I don't have to. There are things like Railway, Fly.io, PlanetScale, Supabase, Upstash, Minio, which can work without locking me in

> What do you gain by switching?

Freedom

> There isn't significant vendor lock-in, this could be rebuilt fairly easily (assuming they used IaC) on any major cloud provider

You are contradicting yourself


Also a cloud engineer. I use a similar setup professionally and for various personal projects.

For someone who isn't familiar with standing up cloud resources the diagram can look overwhelming but once you play around with AWS for a bit, most of the resources you see are fairly boilerplate.

VPC is essentially just setting up a virtual LAN for your resources. S3 is being used as an API mediator to NOAA. CloudFront is a CDN for your API Gateway. Lambdas run your logic. API Gateway triages api requests to lambdas, and a couple other services act as mediators.

There is some vendor lock-in in that everything here is built on AWS, but all the major cloud providers have similar/equivalent services. If you decided to move to GCP or Azure, you could probably move the entire stack in a few days (maybe more depending on how much your lambdas use AWS-specific internal APIs).

If vendor lock-in is a really big concern for you, you can run everything on an EC2 instance running Ubuntu instead. That way you can just spin up your service on another Ubuntu instance in another datacenter, or locally, or whatever.

Soooo, yes. There is some vendor lock-in here, but not much.

To answer your cost question. I run a very similar setup for personal projects and I rarely exceed AWS's free tier for most services. On a high-usage month it's around $85. It isn't super optimized. I could make it cheaper and nearly free if I put in the work.

That said, cost for a service like this scales very proportionally to usage. For example, AWS API Gateway can process 1 million requests for free before they start asking you for money. If the service becomes super popular we'd likely see the "Buy me a coffee" button promoted a little more and eventually you may see a paid tier as an option, but as it is, it's probably pretty affordable to run.


I'm the dev behind this, and really appreciate all the insight from actual cloud professionals! Your guess here is spot on: I designed it so that I could more or less fit in the free tier with exactly one user, with costs scaling pretty linearly afterwards. There are a few more optimizations I could do, but it's honestly pretty impressive how much traffic (I'm at about 10 million calls/month) can be (somewhat) cheaply handled using AWS.


> "I know for instance you cannot use Amazon SES without also using S3 and Lambda"

You can absolutely use SES without S3 and Lambda. I've used it many times in various projects.



As that page makes clear, SES can hand off incoming mail to a Lambda, or to S3 – or to SNS which can deliver it to any HTTP endpoint, or e-mail address, or text it to your phone for that matter.


What do you expect SES to do with your mail after receiving it? S3 and Lambda are optional delivery locations, amongst other choices.


Exactly this. It receives the email, now what? You need to run some code on it and so the way to do that is one of the compute services. AWS isn't forcing you to do anything here.

99.9% of SES users I promise are only sending mail anyway. You aren't forced to have Lambdas or anything else to send mail.


Wow, I struck a nerve. I'm happy to address these points however.

> The diagram looks like an ad

Lol, it's an architecture diagram. You could swap the AWS-specific icons for generic ones, I suppose, and it wouldn't change anything. It is fulfilling its purpose of explaining how all the services connect together to deliver the product. Just because it is an AWS Lambda icon doesn't mean you couldn't make it an Azure Function icon instead and accomplish the same goal. You're just too focused on hating AWS here to see the forest for the trees.

> Do you actually need all of this or do you use it because Amazon tells you to? I know for instance you cannot use Amazon SES without also using S3 and Lambda

This is just a standard event-driven architecture. There's really nothing exciting to see here. Data comes in, it gets stored somewhere (S3), that triggers an event (SNS), a compute service (Lambda in this case, but it could be anything, even a standard VM, bare metal, or anything else) picks up the task and processes it or performs a job on the data and stores it, that triggers another event, something else picks it up, and so forth. This isn't an AWS design, it's just an event-driven architecture design, and this is how they work.

SES can be used standalone. It doesn't require Lambda or S3 like you postulate. There are only a few times AWS requires something else and its usually Cloudwatch or S3 and these will sometimes be the destinations required for specific types of logging or auditing and so forth.

AWS is forcing you to do nothing here. The creator chose this stuff. But I assume they chose it to keep it free. Most of this will survive under the free tier until the project becomes massive. If it inches over the free tier, it will still be cheap. That's probably the incentive for a lot of this.

> You're saying that I cannot move to other cloud provider without my existing code becoming useless?

Correct, your code is your code. Think about a lambda. In this scenario, data comes from NOAA and is put in a bucket. You write a serverless function that takes that data out of the bucket, reformats the NOAA data into your proprietary format, and puts it in another bucket. The code that does that is written in Go, Python, C++, Java, or whatever you want. If written correctly, it accepts data, processes it, and outputs it. So if you move to another cloud provider, your code would still work. It might run in an Azure Function instead of an AWS Lambda, but that doesn't matter. Your code does the same thing; you don't need to throw it out.

> What about the data pipelines and business logic?

Hard to say without more information. But it's possible BI dashboards would need to be changed to point to the new service, and stuff like that. Sure. Again, you're not switching clouds in an afternoon. But the point is that the next cloud's system is eerily similar. It's not like you have to rebuild; it would be more of a refactor.

> There is: avoiding vendor-specific APIs altogether

Possibly. But if built correctly your code doesn't need to be aware of the environment it is in. But there might be cases where you interact with the services directly, like downloading from S3. This would change with the next provider possibly (although most actually have S3-compatible APIs). But most of your application will not directly interact with the cloud, it will interact with the services. So for example you use RDS to host a managed Postgres db, but your application is just interacting with postgres, not AWS here. But you're right there might be some scenarios that use vendor-specific APIs.

> I don't have to. There are things like Railway, Fly.io, PlanetScale, Supabase, Upstash, Minio, which can work without locking me in

I fail to see how these are any different than tying yourself to a product like S3 or Lambda. In many ways, these solutions are TRUE vendor lock-in, with all the vendor-specific APIs that you live in fear of. Fly.io is a PaaS, which is going to be way harder to move away from than switching from AWS to Azure. PlanetScale, Minio, and Upstash are literally no different than equivalent products in AWS/GCP/Azure. I guess you could host the instance that runs these products on different clouds and it would be the same, but you're still tying yourself to something. The risk of tying yourself to a startup is higher than tying yourself to Amazon/Microsoft/Google. You're trading one evil for another; in most ways you are actually losing freedom with these, not gaining it.

> You are contradicting yourself

My point is that there isn't as much vendor lock-in that people fear. Yes it exists, but don't live your life in fear of it. Yes you would need to refactor stuff here and there. But the same architecture diagram we saw for AWS is basically the same one that would exist in Azure or GCP. The underlying tools don't change. The marketing names and logos change, which clearly bothers you, but the underlying system doesn't change.


You clearly know way more about cloud implementations than I do, so I really appreciate the time you took to explain that! Since I am the dev here, the one thing I can confirm is that you're 100% correct about the setup methodology: almost every decision was based on "how can I do this in a way that's cost effective?". In particular, the underlying data was already on AWS, so it just made sense to build it there.

I think one thing that gets lost in discussion is the advantage of serverless approaches for people without a ton of technical background. I built 90% of this without knowing anything about servers or APIs, but the cloud tools (from whoever) let me ignore all of that and just write a bunch of Python scripts that do cool things. I know it ends up sounding a bit like an AWS ad (I wish I was sponsored by them, but am not), but there really are perks to the approach.


I’ve used https://open-meteo.com/ before and I think it’s the same type of open data being exposed.

These types of projects are great for stuff like home automation. I'm using it to improve my predictions for power generation (PV) and consumption (heat pump). Planning to use it to optimize home battery charging in the future.

(Disclaimer; open sourced a small go library for open meteo, but otherwise not affiliated)


Hi, creator of open-meteo.com here! I am using a wider range of weather models to better cover Europe, Northern Africa, and Asia. North America is covered as well with GFS+HRRR, and even weather models from the Canadian weather service.

In contrast to pirate weather, I am using compressed local files to more easily run API nodes, without getting a huge AWS bill. Compression is especially important for large historical weather datasets like ERA5 or the 10 km version ERA5-Land.

Let me know if you have any questions!


open-meteo.com looks awesome. I've been messing around writing a snow forecast app for skiing/snowboarding for a while now and the main thing I'm missing is historical snowfall data. Do these data sources exist in a machine readable format and I've just not been able to find them? If so, would you ever consider adding precip + kind of precip to your historical API?


Snowfall is already available in the historical weather API. Because the resolution is fairly limited for long-term weather reanalysis data, snow analysis for single mountain slopes/peaks may not be that accurate.

If you only want to analyse the last weeks to get the date of the last snowfall and how much powder might be there, use the forecast API and the "past_days" parameter to get a continuous time-series of past high-resolution weather forecasts.

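A quick sketch of what that request looks like. `past_days`, `latitude`, `longitude`, and `hourly` are documented Open-Meteo query parameters; the choice of the `snowfall` variable and the coordinates here are just for illustration:

```python
from urllib.parse import urlencode

# Sketch: an Open-Meteo forecast request that also returns recent
# past days via the "past_days" parameter mentioned above, giving a
# continuous time series across past and future.

def forecast_url(lat, lon, past_days=7):
    params = urlencode({
        "latitude": lat,
        "longitude": lon,
        "hourly": "snowfall",   # illustrative variable choice
        "past_days": past_days,
    })
    return f"https://api.open-meteo.com/v1/forecast?{params}"

url = forecast_url(46.9, 9.5)  # somewhere in the Alps
```

Fetching that URL returns JSON with an hourly time axis spanning the past week through the forecast horizon.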

I've done exactly that! https://www.youtube.com/watch?v=oFSQYK20YUU&list=PLT7ckgz8vc...

I've got a bunch of moving parts in my system, including realtime (5-minutely) energy pricing. If it looks like it's going to be cloudy tomorrow I put my thumb on the scales to make it more likely that my system will buy power from the grid to top off the battery so I can ride through any price spikes.

I don't have the stats chops to determine whether I'm actually saving any money with this approach, but it sure is a lot of fun.
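The "thumb on the scales" heuristic described above can be boiled down to a tiny decision rule. This is a toy sketch with invented thresholds, not the commenter's actual system:

```python
# Toy sketch of the charging heuristic: if tomorrow looks cloudy
# (little PV generation expected) and grid power is currently cheap,
# top off the battery now to ride through later price spikes.
# The 70% cloud-cover threshold is an invented example value.

def should_buy_from_grid(cloud_cover_pct, price_now, spike_threshold):
    cloudy_tomorrow = cloud_cover_pct > 70
    price_is_cheap = price_now < spike_threshold
    return cloudy_tomorrow and price_is_cheap

decision = should_buy_from_grid(cloud_cover_pct=90,
                                price_now=0.10,
                                spike_threshold=0.30)
```

A real controller would weigh expected generation against forecast prices rather than using fixed cutoffs, but this captures the shape of the logic.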


I usually use forecast.weather.gov. They provide a nice textual weather page, in English, for the next 7 or 8 days [1]. But it is tiring to open 3 or 4 bookmarks, so I thought to consume that as JSON and make a single-page HTML app with AJAX requests. It's still in progress; I spent a few minutes on it yesterday. The surprising thing was (I was ready to fetch the HTML text and parse it, which is difficult): I simply changed one parameter to JSON and I got JSON [2]. I didn't find any easy-to-find documentation about the JSON endpoint. My plan is to have city abbreviations on top in a horizontal menu; then each click fetches JSON and puts it into a div below.

1. https://forecast.weather.gov/MapClick.php?lat=37.78&lon=-122...

2. https://forecast.weather.gov/MapClick.php?lat=37.98&lon=-120...
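Building that URL programmatically is a one-liner: the same MapClick endpoint, with the FcstType parameter switched to json (a sketch; the lat/lon values below are just examples):

```python
from urllib.parse import urlencode

# Sketch: the MapClick page returns JSON instead of HTML when
# FcstType=json is passed, so no HTML parsing is needed. Fetch the
# URL and json.loads() the response body.

def mapclick_json_url(lat, lon):
    params = urlencode({"lat": lat, "lon": lon, "FcstType": "json"})
    return f"https://forecast.weather.gov/MapClick.php?{params}"

url = mapclick_json_url(37.78, -122.42)  # San Francisco
```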


That is cool that the web front end exposes itself as JSON as well - good tip.

FWIW, they also have a pretty decent API. It's based around zones, though, which you'd need to look up. So from that lat and lon, you'd get the zone from:

https://api.weather.gov/points/37.7827,-120.38

Using the zone information, you can get to a forecast:

https://api.weather.gov/gridpoints/STO/72,24/forecast

If you're wanting the current observations, you'd pick a station for that grid such as MOUC1 and go to its observations/latest endpoint:

https://api.weather.gov/stations/MOUC1/observations/latest

The API is in a JSON-LD format, so it's got a lot of links to related topics in the actual JSON payload. Looking at the JSON can make it somewhat easy to feel out what you need. The documentation is here:

https://www.weather.gov/documentation/services-web-api
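The lookup chain above can be followed mechanically, since each JSON-LD response links to the next resource. A sketch (the sample payload is abridged to the one field used; a real /points response carries many more properties):

```python
import json

# Sketch of the chain described above: /points/{lat},{lon} returns a
# JSON-LD document whose "properties.forecast" field links to the
# gridpoint forecast, so no manual zone lookup is needed in code.

def points_url(lat, lon):
    return f"https://api.weather.gov/points/{lat:.4f},{lon:.4f}"

# Abridged example of what the /points endpoint returns:
sample_points_response = json.loads("""{
  "properties": {
    "forecast": "https://api.weather.gov/gridpoints/STO/72,24/forecast"
  }
}""")

forecast_url = sample_points_response["properties"]["forecast"]
```

Fetching `forecast_url` then yields the period-by-period forecast; the stations list for current observations is linked the same way.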


Thanks - on that day I was already playing with the API.

Initially I had a few bookmarks saved on my Android home screen, one for each city whose weather forecast I was interested in. But soon there were 4 cities in total, and their text size was too small, needing zoom. I slurped their JSON API and wrote this:

https://spa.bydav.in/weather/

You need to add a URL for each city you want to see weather for, in Settings. All that data is saved in your device's local storage. No funny trackers or phoning home.

You can also paste the following json in Settings > Text Area > Click Import JSON:

  [
    {
      "abb": "STK",
      "full": "Stockton",
      "url": "https://forecast.weather.gov/MapClick.php?lat=37.95&lon=-121.29&unit=0&lg=english&FcstType=json&TextType=1"
    },
    {
      "abb": "SFO",
      "full": "San Francisco",
      "url": "https://forecast.weather.gov/MapClick.php?lat=37.7771&lon=-122.4197&unit=0&lg=english&FcstType=json&TextType=1"
    }
  ]


Related discussion about Merry Sky, a Dark Sky replacement built on this

https://news.ycombinator.com/item?id=34155191


Is there a linguistic shift happening here where this is no longer "a service implementing the same API as Dark Sky", but rather "an API"?


https://pirateweather.net/apis isn't loading for me


Same here. Tried a few hours apart. If you're looking for the API docs, as I was, they are here: https://docs.pirateweather.net/en/latest/


This is a limitation of how the Developer Portal is set up: you have to be registered to view the APIs. Like the other poster said, if you're looking for general info, go to the docs section, since all the information is there.


This is great. Can endorse LuckGrib for iOS folks as well. Great UI, and it can pull NOAA and EU models. https://luckgrib.com/. Not FOSS, but worth the $20 I paid once several years ago. Popular among wind sports enthusiasts, many of whom are hobbyist meteorologists.


I cannot seem to sign up, it says "Exceeded daily email limit."


Confirmed. Me, too.


Awesome, but it doesn't yet have the text summary feature, which curiously is also missing from Apple's official Dark Sky replacement, WeatherKit:

https://developer.apple.com/documentation/weatherkitrestapi

I created the start of a Python wrapper for WeatherKit, if anyone is interested in helping with that effort:

https://github.com/davecom/PyWeatherKit


One of Dark Sky's features was integrating hyperlocal weather stations.

Right now there are thousands of weather stations posting regular reports via APRS.

Any chance to see those integrated?


They (sort of) are! It gets into the nuance of the weather model side of things, which I don't know as much about, but the HRRR/GFS models ingest data from APRS and incorporate it into their forecasts. This means that these observations are included (especially in the US), but at about a 2-hour lag compared to direct readings.


This is really cool! Can't wait to try. But I'm getting: "Exceeded daily email limit." during sign-up.


I cannot load the apis page right now. Is this just current weather data or do you offer historical weather data too?


Tried it indirectly via MerrySky after it was mentioned here the other day.

I am in Europe, and it was completely off in both its 24-hour forecast and the actual real-time weather. It indicated continuous heavy snowfall, whereas in reality the sky was just lightly clouded with no precipitation. Just my 2c.


I applaud the effort. I just checked the forecast data for Berlin, Germany and it is quite inaccurate compared to the quality models offered for my region.

It seems that they only use NOAA data even though there are vastly superior models for the EU, e.g. ICON-EU and ECMWF.


This is amazing. You're amazing.

Just bought you a coffee, everyone who upvoted this should chip in!


Thank you so much! I really want there to be a free level available, and it's donations like this that make it possible


Is there a decent free weather app for Android or iOS which doesn't have ads and / or unnecessary tracking?

Would be cool if Pirate Weather could serve as the foundation for such a thing.


There is also yr.no; it has many details and looks great.


WeaWow is good. Donation-supported.


Wow, thanks for the quick reply and info, Tempest1981! Cheers.


I like having an open API. Every weather site eventually becomes trash as it is loaded with obnoxious ads. Maybe the answer is just to have numerous weather front ends.


It's a real issue; that's why I made my own! The beauty of an open source one is that if things start to go off the rails, someone else could pick it up and recreate the data source.


Why are pirate systems the best? They just AARRRR!


This is awesome! If you want to use ReadMe for free for your docs, just email me (my email's in my profile) and I can hook ya up!


Is it a common startup idea to open source a successful product and change the payment to be donation based?


> An error was encountered with the requested page.

> Exceeded daily email limit.

Happens when I try to sign up.


What I've always wanted is a competent API for weather.gov, but there's just no impetus to get NOAA to drop their convoluted pseudo-XML gobbledygook.


If that makes you feel bad, take a look at Australia's Bureau of Meteorology: a disorganised mix of access methods, and they went around randomly breaking people's requests with 403s, accusing them of screen-scraping even when they were accessing JSON. (Some got hit at just 1 request/day.)

Emails to them resulted in replies asking to use their FTP instead.

Additionally, they're one of the few Australian government agencies that still don't have a proper HTTPS deployment.


Appears to be pretty much US-only, unfortunately. It's -5 where I am right now, but this claims that it is actually -17 (a very big difference).


@OP you've exceeded your daily Cognito limit



