Hacker News new | past | comments | ask | show | jobs | submit | mikeortman's comments login

I'm glad we are starting to lean into cloud-agnostic or building back the on-prem/dedicated systems again.

I think the very concept of this is to open source a common stack, instead of relying on a middleman like Porter, which also costs a TON of money at the business tier.

It baffles me why people would be so unnecessarily critical of someone's project they are showing off in a show-and-tell fashion. It's a neat project with value.


Where's the criticism?


I'm fairly sure weather.gov's API doesn't have an SLA of any kind. I highly recommend a push approach over polling if possible, storing the data in your own DB as it comes in live. AWS Open Data + MADIS are good sources that are... for reasons... far more stable

EDIT: sorry, I misread your comment. If it's only once every 30 secs, that may be a bit overkill for your needs. You may be able to get the appropriate text product directly and parse it out. Unfortunately that's probably the most stable, yet clunkiest, way


Hi, hobbyist here! This is a huge area where government meteorologists and "Big Weather" differ and you can help close that gap!

For context:

The governments of the world provide these big weather companies (weather.com (cough IBM), Accuweather (cough IBM cough), etc) a metric shit ton of their data completely for free (by law), including data transfer. These are things like radar, satellite, ground station data, forecasts, composite models, etc. These companies profit substantially on it, as in billions of dollars. You as citizens can also get this data completely for free! MADIS is a system the government is working on to make that data access easier by bringing many of these systems together, removing the bureaucratic redundancy, and abstracting away the aging infrastructure. This is literally terabytes of data per day you can grab with almost no questions asked. That data is then processed privately, repackaged, and resold to the end user, and you probably interact with this privatized data the most.

The frustration I have is that much of the additional "value" these weather data brokers provide comes from linking up with each other through data contracts. These private companies have a much much higher detail on the ground than the government by being able to partner with companies that make common internet-connected personal radar stations and reselling that data to each other. The government doesn't have the privilege of buying limitless data. NOAA/NWS, for example, is extremely underfunded, so even if they had the privilege to buy it they probably couldn't come to an agreement to buy it. As a result, they can't use that data to improve the accuracy of alerts/warnings/forecasts, the exact same tools that the big weather companies make all their money from. It's a shit cycle and totally unfair IMO.

So please contribute if you can!!

Sidebar: I'm a founder of a self-bootstrapped startup to build a better weather data broker that doesn't cost an arm and a leg. If that's something you are passionate about solving, feel free to reach out :)


Hi, I am building an Open-Source Weather API aggregating open-data weather models from NOAA, ECMWF, DWD, MeteoFrance, JMA, CMA, CMCC and others. I agree that many weather companies basically redistribute NWS data at a premium. There is a free API service available on https://open-meteo.com and all databases are redistributed via an AWS Open-Data Sponsorship. Feel free to reach out if you need help building your weather data broker startup


A sincere personal "thank you" from me! I use Breezy Weather[1] on my smartphone, which gets most of its data from your aggregating service, Open-Meteo. It's the perfect combination of accurate, international and open.

[1]: https://f-droid.org/en/packages/org.breezyweather/


First of all, thank you! What a treasure trove of data.

If I may ask a question, do you have historical air quality data?


You can access historical air quality data from August 2022 onwards: https://open-meteo.com/en/docs/air-quality-api#start_date=20.... Data is based on the Copernicus air quality forecast. All references are listed in the documentation.
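For scripted access, a historical query is just parameters on the documented endpoint. A minimal sketch (the endpoint and parameter names below are my reading of the linked docs, so double-check them against the current API reference):

```python
from urllib.parse import urlencode

# Build a historical air-quality request URL for the Open-Meteo
# Air Quality API. Endpoint and parameter names follow the linked
# documentation; verify them before relying on this.
def air_quality_url(lat, lon, start_date, end_date, fields=("pm10", "pm2_5")):
    base = "https://air-quality-api.open-meteo.com/v1/air-quality"
    params = {
        "latitude": lat,
        "longitude": lon,
        "hourly": ",".join(fields),
        "start_date": start_date,  # historical data begins August 2022
        "end_date": end_date,
    }
    return f"{base}?{urlencode(params)}"

url = air_quality_url(52.52, 13.41, "2022-08-01", "2022-08-07")
```

Fetching that URL with any HTTP client returns hourly JSON you can feed straight into a dataframe.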


Thank you!


I am working on this kind of solution - please email - see about


Thank you for doing this!

As my own sidebar, I spent many years at a national lab working with distributed sensor networks (primarily ATC and other radar for detecting non-weather stuff :) ). I thought about using ADS-B as input for weather state and forecasts, but never got around to trying it. Now that I am working on my own startup (self-funded and without revenue so far), this will again likely languish in my todo list. If someone wants to try it, great, and feel free to reach out as I can probably save you some time selecting and interpreting the right ADS-B fields:

We have a lot of aircraft blasting ADS-B reports whenever they fly. Most reports contain (1) accurate 3D position, including altitude, and (2) barometric altitude measurements, which give you (after some minimal work) air pressure. So you have millions of freely available pressure reports, not just on the ground, but throughout the 0-40,000 ft altitude band.
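The "minimal work" here is mostly inverting the standard-atmosphere relationship: ADS-B altitude is pressure altitude, so static pressure falls out directly. A sketch of that conversion (valid below the tropopause, roughly 36,000 ft; constants from the ICAO Standard Atmosphere):

```python
# Recover static pressure (hPa) from ADS-B pressure altitude (feet)
# using the ICAO Standard Atmosphere below the tropopause:
#   p = p0 * (1 - L*h)^n, with p0 = 1013.25 hPa at sea level.
def pressure_from_altitude_ft(alt_ft: float) -> float:
    return 1013.25 * (1.0 - 6.8756e-6 * alt_ft) ** 5.2559

sea_level = pressure_from_altitude_ft(0)   # 1013.25 hPa
fl180 = pressure_from_altitude_ft(18000)   # roughly half an atmosphere
```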

You also get measured airspeed and groundspeed, so in addition to pressure you get wind vectors at thousands of points in the air, updating in real time. I suspect this can provide some non-trivial information and I am not aware of anyone actually using it for this purpose.
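Deriving the wind is plain vector arithmetic: the wind vector is ground velocity minus air velocity. A sketch, assuming you've already decoded groundspeed/track and true airspeed/heading from the reports (the function and field names are illustrative, not a real decoder API):

```python
import math

# Estimate the wind at an aircraft's position from ADS-B-derived
# velocities: wind = ground velocity - air velocity.
# Speeds in knots; angles in degrees (direction of motion).
def wind_from_adsb(gs_kt, track_deg, tas_kt, heading_deg):
    gx = gs_kt * math.sin(math.radians(track_deg))    # east component
    gy = gs_kt * math.cos(math.radians(track_deg))    # north component
    ax = tas_kt * math.sin(math.radians(heading_deg))
    ay = tas_kt * math.cos(math.radians(heading_deg))
    wx, wy = gx - ax, gy - ay
    speed_kt = math.hypot(wx, wy)
    # Meteorological convention: direction the wind blows FROM.
    from_deg = math.degrees(math.atan2(-wx, -wy)) % 360.0
    return speed_kt, from_deg

# Eastbound at 110 kt over the ground with 100 kt TAS:
# a 10 kt westerly tailwind.
speed, direction = wind_from_adsb(110, 90, 100, 90)
```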


These data are already consumed operationally by the major global weather modeling centers - e.g., check out the AMDAR [1] or ACARS programs. There are commercial agreements which restrict third-party access to these data in real time, but they are widely disseminated with a 24-48 hour lag in the observational archives that NOAA curates for weather modeling.

These data are a very important input for operational weather forecasting. On a global basis, we've seen how losing these data during the pandemic due to reductions in air travel decreased forecast model skill [2]. Furthermore, ACARS profiles derived from aircraft landing at airports where severe weather is expected are regularly used to complement SPECI weather balloon launches.

[1]: https://community.wmo.int/en/activity-areas/aircraft-based-o... [2]: https://journals.ametsoc.org/view/journals/apme/59/11/JAMC-D...


Thank you for the info, I am glad to hear that the data is being collected and used. Getting it from raw ADS-B reports makes all fields available without the lag (and thus more useful for weather reporting and forecasting), but the recorded data is definitely good to have!


you referenced "(1)" and "(2)" but didn't provide the actual links


Those numbers were a form of "verbal highlight" of items in a long sentence, not actual links.

For specific ADS-B fields, ADS-B exchange used to have a great data fields overview. ICAO should have a spec, too.


It was an inline numbered list of two items. [2]

[2] https://www.youtube.com/watch?v=UTby_e4-Rhg


Don't forget, AccuWeather wanted to make it illegal for you to have access to that data for free from the government.


Same thing in Germany, with WetterOnline repackaging data from DWD (the German meteorological service) and then suing DWD when it offered its WarnWetter app for free. Unfortunately, they were successful: https://www.heise.de/news/BGH-Urteil-Staatlicher-Wetterdiens... (Couldn't find an English source, sorry.)


>Unfortunately, they were successful

Unbelievable.


Alas, there are still ongoing attempts to break up and/or privatize the NWS. Project 2025 specifically calls for the breakup of NOAA.


Oh for crying out loud! WTF???

I think someone spilled idiot juice in the water.


Swamp Water Kool-Aid for the cultists.


You can always use yr.no

This is the Norwegian government weather service. It's global and free for everyone. It also has fully open APIs.


I second this.

It is incredibly fast, no bloat, no ads.

Just not as accurate as a local service, since their main focus is Norway.


Idk about that. It's used by farmers all over the world. They do purchase local data.


> weather.com (cough IBM)

Didn't IBM sell weather.com (and all of the assets) to a private equity firm? The deal was announced last year and closed in February 2024.

https://finance.yahoo.com/news/francisco-partners-completes-...


Yes, they did. AccuWeather is also privately owned and a major competitor of TWCo. It has never had anything to do with IBM.


Yeah, they did... but being blunt, IBM had 8 years to do a lot of damage. I'm not speaking of the engineers or the tech behind it; I'm talking about escalating the ad revenue and general junk on their website. MOST of the page load time on weather.com and AccuWeather is ads and tracking. It's an ungodly number of requests and will drain a data plan's usage limits surprisingly quickly.

Just opening weather.com will send almost 1,000 requests and transfer 10.3 MB. Every 30 seconds or so it makes about 300 more requests and 2 MB of transfer for new ads. It's... insane


Ads and cookies are hardly the worst thing that AccuWeather has done. They have been lobbying the federal government for 30 years to forbid NOAA from issuing forecasts at all. They want the government to pay for all the satellites and supercomputers, and they want all that data for free, but they want you to have to pay AccuWeather for forecasts.


In situations like this I feel that it is important to put a face to a name.

AccuWeather, a corporation like any other, is made of people. These shortsighted decisions and this shitty behavior are directly attributable to the people who carry them out for selfish reasons.

So let's start talking about these people in the first person, not in the abstract.

Who is directly responsible for these atrocious actions, and how do they stand to benefit from them?


> So let's start talking about these people in the first person, not in the abstract.

Here are two: Rick Santorum and Jim Bridenstine.

https://www.washingtonpost.com/news/capital-weather-gang/wp/...


This same junk happens in Canada. Attempts to disrupt pages like this:

https://weather.gc.ca/city/pages/on-118_metric_e.html

and any info from Environment Canada being packaged in a way that might compete, by the very same people using Environment Canada's free data for profit.

Cries of "How can the Government compete with the private sector" are thrown around, always glossing over how that very data is often sourced from the Government. Pathetic. Leeches of the worst kind.


But who? Who specifically in Canada is making these cries?


I was at a wedding where the accu-weather guy was complaining the NFL was a monopoly because they wouldn't let Rush Limbaugh buy a team...


John Oliver had an episode[1] of Last Week Tonight which discussed some of this I believe.

[1] https://youtube.com/watch?v=qMGn9T37eR8


At the time they bought it, I joked that IBM bought the Weather Channel as part of its Cloud strategy. I stand by that claim.


> These private companies have a much much higher detail on the ground than the government by being able to partner with companies that make common internet-connected personal radar stations and reselling that data to each other.

I haven't heard of personal radar stations, and wasn't hitting anything in a quick web search. Are you able to provide an example of these systems?


I'm pretty sure this was a typo, so s/radar/weather/.

For reference, a weather radar operates in Doppler mode, with the return signal coming from Rayleigh scattering off raindrops, so it operates at wavelengths of 3-10 cm. You are talking about something like a 5-meter-diameter antenna dish that weighs half a tonne, on a motorized elevation-azimuth mount, inside a 7-meter-diameter radome, with a peak transmit power of 250,000 W.

Of course you can buy one yourself, if you have the space, electrical power, and money for it - ballpark $1.5 million USD.


I've worked on a weather radar system with specs suspiciously similar to those you are describing. 250 kW is the pulse power for a C-band stationary radar system; with a typical pulse length around 1 us repeated 500-1000 times a second, that amounts to a 1/1000 duty cycle and 250 W of average radiated power. These pulsing parameters give about 150-300 km of usable range; the return signal becomes too noisy at longer ranges anyway, and the geometry of beam propagation means that you're shooting into outer space above the meteorologically interesting part of the atmosphere. It doesn't use that much power from the grid either - the datasheet specs around 5 kW total for all the equipment (transmitter, motorized antenna pedestal, and an equipment rack with a pretty beefy server to chew through all the data coming from the receiver in real time). The cost aspect is pretty much in the ballpark - I once visited our test site and the engineer pointed at the antenna horn (a paint-can-sized chunk of metal hanging in front of the dish, full of microwave RF magic) and said that just that piece costs about the same as a new high-segment car.

C-band weather radars have existed since forever, first using magnetron transmitters and now solid-state amplifiers, though a lot of new magnetron-based systems are still being installed, with the downside that magnetron pulses are practically impossible to modulate to perform the advanced radar techniques that improve various aspects of performance. There are also X-band weather radars, which operate on higher frequencies and use more modestly sized antennas (but still larger than you'd like to have on your house roof). They are more limited in range (100 km-ish max) due to high attenuation and are mostly used at airports, offshore oil rigs, wind farms, and similar installations that are mostly interested in precise local weather. They still cost several hundred thousand bucks.
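The duty-cycle arithmetic in the comment above is worth spelling out, since a 250 kW peak sounds enormous until you average it over the pulse train:

```python
# Average radiated power of a pulsed radar = peak power x duty cycle,
# using the figures quoted above (250 kW peak, ~1 us pulses at 1 kHz).
peak_w = 250e3      # peak pulse power, watts
pulse_s = 1e-6      # pulse length, seconds
prf_hz = 1000       # pulse repetition frequency, pulses per second

duty_cycle = pulse_s * prf_hz        # fraction of time transmitting
avg_power_w = peak_w * duty_cycle    # 250 W average, as stated above
```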


I don't know how much this costs, but it's only a meter across, 65 kg, and runs on a 110-volt outlet. I'm thinking high single-digit thousands of dollars, maybe low tens of thousands.

https://www.furuno.com/files/Brochure/448/upload/WR110_EN.pd...


They only list agents, the agents don't list prices, and few feature the Doppler weather radar on their websites.

That said, I'd guess $10,000 US < price < $50,000 US for that product, given the pricing of $8K US for smaller commercial fishing radars from the same company.

More on the two compact weather radar systems: https://www.furuno.com/en/systems/meteorological-monitoring/


Probably another order of magnitude more expensive.


Maybe, maybe not.

There's another recent peer comment that confirms $1.5 million US for a large C-band stationary radar system with 250 kW pulse power and 300 km of range.

This is a small doppler with modest power and 70 km range max.

If cost is proportional to the 3D volume of space scanned (as the power requirements likely are), then this small radar might well be less than $100K.

TBH I have no specific knowledge here although I have worked in other sensing domains and seen a wide spread on cost of equipment related to volume and quality of data.

I'd be interested in the flat cost of the mini system; I suspect that's not going to appear without some inside knowledge or working a dealer. It appears to be a rare, bespoke kind of thing, not like the commercial fishing radars.


Ballpark, 80k to 150k for those Furuno systems in the USA depending on single or dual polarity, etc. I have taken the sales pitch.


Thanks for that; interesting and unsurprising after factoring in commissions + profit.


Even if you buy one you will almost certainly need to get spectrum approval to transmit, which is non-trivial and comes with many strings.


https://www.meteopress.cz/meteopress-com/ These guys have started producing beefy radars an order of magnitude cheaper.


I plan on installing one of these at my apartment building so that my tenants can have their own local weather channel.


My friend built one. Details here: https://www.youtube.com/watch?v=fvxZvObr2hw


Since the Weather Research and Forecasting Innovation Act of 2017, Congress has required NOAA to start acquiring data via commercial partnerships.

NOAA has already made some contracts with Spire [1] and Saildrone [2]. I am sure there are more but these are the ones most familiar to me.

Your weather data broker startup sounds very interesting!

[1]: https://spire.com/press-release/spire-global-awarded-nationa... [2]: https://research.noaa.gov/2022/08/03/noaa-and-saildrone-team...


The entire slate of commercial acquisitions planned or in progress can be found at [1]. It's pretty anemic; NOAA has spent far less than what folks were hoping they would. I think a major part of this is that the private sector really didn't have very many high-TRL observation systems that could readily be integrated into NOAA's assimilation and forecasting systems. Lots of planned constellations and ideas about things to do in the future, but just not that much stuff that was ready to package-up and deliver to NOAA. The most successful acquisitions for GPS/GNSS-RO and buoy/drone data seems strongly bolstered by the fact that these data were already readily assimilated by existing infrastructure.

The private sector has really embellished its capabilities to the detriment of the CDP and other programs. I think too many industry players saw NOAA's expansion here as a potential slush fund to fully subsidize their R&D, but again the TRL of planned observation systems was too low and so the system didn't really work efficiently. Classic policy failure - would make a fantastic case study or Master's thesis for someone studying weather in an STS program!

[1]: https://www.space.commerce.gov/business-with-noaa/commercial...


> Sidebar: I'm a founder of a self-bootstrapped startup to build a better weather data broker that doesn't cost an arm and a leg. If that's something you are passionate about solving, feel free to reach out :)

Would love to hear more about this. I’m a researcher, and a lot of my work revolves around machine learning applications to building energy modeling; one of my projects actually focuses on the difference between using TMY vs AMY EPW files in automatic calibration of models. Would be great to chat more. What’s the name of your startup? I can shoot you an email at the official address.


I have always wanted to build a cheap ESP-based weather station I could give to a friend: they set up their WiFi, and it transmits data to a central server. I do not want to submit data to AccuWeather or Wunderground or anyone else.

Is there a FOSS server around that I can set up on my own?



My weewx station has been running since 2013 on a RPi.


FWIW, PurpleAir is basically this, minus wind information.


This sounds really cool. Have you looked into whether you can get permission from the guys over at aprs.fi to scrape their data from ham radio weather stations running over APRS? [0]

[0]: https://aprs.fi/page/api


My rough understanding is that CWOP was started by ham nerds and forwards all data to MADIS via FindU [0]

[0]: <http://www.findu.com/>
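For the curious, those CWOP/APRS "positionless" weather reports pack readings into single-letter field codes. A parsing sketch covering only the common fields (the field meanings are my reading of the APRS spec; verify against it before relying on this):

```python
import re

# Parse common fields of an APRS positionless weather report, e.g.
# "_10090556c220s004g005t077h50b10132". Field codes (per the APRS spec):
# c = wind direction (deg), s = sustained wind (mph), g = gust (mph),
# t = temperature (F), h = humidity (%), b = barometer (tenths of hPa).
FIELD_RE = re.compile(r"[csgt]\d{3}|h\d{2}|b\d{5}")
NAMES = {"c": "wind_dir_deg", "s": "wind_mph", "g": "gust_mph",
         "t": "temp_f", "h": "humidity_pct", "b": "pressure_hpa"}

def parse_wx(report: str) -> dict:
    out = {}
    for field in FIELD_RE.findall(report):
        code, value = field[0], int(field[1:])
        if code == "b":
            value = value / 10.0  # tenths of hPa -> hPa
        out[NAMES[code]] = value
    return out

wx = parse_wx("_10090556c220s004g005t077h50b10132")
```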


Thanks for mentioning. When I looked up MADIS, Wiki pointed me here: https://www.weather.gov/ncep/

There, selecting 'Weather Prediction Center' goes to https://www.wpc.ncep.noaa.gov/#page=ovw

There I searched 'Local forecast' for cities and saw most of what weather.com delivers, with 3 domains in NoScript instead of the couple dozen from weather.com


I became so annoyed at the tracking, advertising and poor notifications of many apps, I wrote my own for severe weather alerts.

My hobby will include a C-band antenna this fall, and I’m on the hunt for radar data sources from which to create my own mosaics.


What is the best option for someone who wants to contribute but doesn’t have a weather station currently or a lot of free time to take on another project?


For the data, do you know meteostat.net?

They get input from e.g. NOAA (US), DWD (DE), YR (NO) and so on.

I'm not affiliated, just needed some historical data and (finally!) discovered them.


If I may use your expertise a bit: I need air quality data for two cities, one of which doesn't have any air quality monitoring stations. Is there anything I can use as a proxy for air pollution, like satellite data? Also, do you know any sources for that data?

Thanks!

P.S.: I'm passionate about air quality, given that I have allergies and was an active member of the bicycle activism movement for about 10 years.



Thank you!!


I don't understand why the quality of the specific weather station does not matter. Doesn't bad data impact the overall model quality?


> NOAA/NWS, for example, is extremely underfunded, so even if they had the privilege to buy it they probably couldn't come to an agreement to buy it. As a result, they can't use that data to improve the accuracy of alerts/warnings/forecasts, the exact same tools that the big weather companies make all their money from. It's a shit cycle and totally unfair IMO.

Huh? This is kind of an odd take for a few reasons. For starters, NOAA isn't "extremely underfunded"; with the possible exception of the current budgeting cycle, NOAA generally does pretty well and has strong bipartisan support. It could always use more money, but I wouldn't call it "underfunded."

The reason NOAA doesn't buy more data is because most of the available data has limited value. Personal weather stations have substantial quality issues and add almost no value in areas where we already have high-quality surface observations. We thin out and throw away a ton of surface observations already during the data assimilation process to initialize our forecast models anyways - data from aloft is far more valuable and impactful from a forecast impact perspective.

For what it's worth, few if any companies use proprietary observations to improve their forecasts. It's an open secret that the vast majority of companies out there are just applying proprietary statistical modeling / bias correction on top of publicly available data. Only a handful of companies actually have novel observations, and there's limited evidence it makes a significant difference in the forecast. At best, it can improve the way that those statistical corrections are applied to existing forecasts and ensembles - you can count on one hand the number of companies that actually run a vertically-integrated stack, including data assimilation of proprietary observations and end-to-end numerical modeling.

That isn't to say there isn't unique value in the observations. It's just that the industry flagrantly misleads about how it uses them.


> We thin out and throw away a ton of surface observations already during the data assimilation process to initialize our forecast models anyways - data from aloft is far more valuable and impactful from a forecast impact perspective.

I regularly notice that the NWS forecasts, even in the very short term, get the surface conditions rather wrong. (This is by comparison to an inadvertent but, I think, quite accurate surface temperature and humidity measurement that I have.)

I fully believe that the measurements aloft do a great job of predicting the conditions aloft, but I wonder whether the results would be further improved by even a fairly simple model to map the forecast results back to detailed surface conditions. After all, many consumers of weather forecasts, e.g. people caring about personal comfort, climate-control energy predictions, and pre-heating/pre-cooling of buildings, care about surface conditions more than they care about conditions aloft.


The measurements aloft constrain the entire system - things like vertical profiles of moisture and temperature, as well as the kinematic structure of the atmosphere (e.g. wind profiles) grossly constrain the evolution of the system. Put another way - the _information density_ of these observations is very high and they tend to constrain non-local features of the flow / structure of the atmosphere.

Observations for the surface don't have this effect for two reasons: (1) they can be dominated by local influences (like local topography) that poorly constrain the background atmospheric state, and (2) the majority of numerical weather models do not directly model the planetary boundary layer (the layer of the atmosphere closest to the ground), and instead parameterize processes that occur here. What this means, practically, is that the information content of surface observations is low (1), and even when it isn't, there isn't a mechanism to effectively propagate this information outside of a single grid cell or even column in the actual forecast model (2).

That's why observations are typically used to bias-correct forecast models - it's a form of localization or downscaling.
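As a toy illustration of the simplest possible version of that bias correction (a constant additive correction learned from past forecast/observation pairs; operational post-processing such as MOS uses much richer regressions):

```python
# Learn a station's mean forecast error from past (forecast, observation)
# pairs, then subtract it from new forecasts. This is the crudest form of
# the localization/downscaling step described above.
def mean_bias(forecasts, observations):
    errors = [f - o for f, o in zip(forecasts, observations)]
    return sum(errors) / len(errors)

past_fc  = [21.0, 19.5, 23.0, 18.0]   # model 2 m temperature, deg C
past_obs = [19.0, 18.0, 21.5, 16.0]   # co-located station observations
bias = mean_bias(past_fc, past_obs)   # model runs warm by 1.75 deg C
corrected = 22.0 - bias               # bias-corrected new forecast
```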


Very few companies run the vertically-integrated stack because it is prohibitively expensive to do so with current NWP versus what you can sell it for, with only marginal forecast improvements. I know several companies have tried integrating their own observation sources and ended up with worse-performing forecasts. Oops.

I'm very interested to see how the ML modeling revolution changes this. The ability to perform global forecasts on a single GPU should make it cost-competitive for more companies. I know several companies are already deriving their own weights for the forecasting component so that they can sell them. Google appears to be working on the next piece of the puzzle too, using ML for the data assimilation step, or skipping it altogether and going directly from observations to forecasts.


There are a few groups working on leveraging observations more directly in the ML forecast models and skipping over the assimilation/analysis step. However, unlike the original ML forecasting problem (which, let's be honest, was grossly over-simplified by the existence of ERA-5, which has been treated as "ground truth" for the atmosphere and used to teach models how to simply go from the state at time t to the state at t+Δt), there's reason to believe that incorporating the observations will be substantially more difficult, given the complexity and bounty of the observations themselves and the challenge of framing a tractable, useful ML problem on top of them.


Referring to the parent comment, the data which NOAA isn’t able to buy is the government’s data, which is freely provided to non-government organizations. The parent comment doesn’t discuss personal data very much. I think this misunderstanding might be the root cause of your disagreement.


Parent comment is misinformed. NOAA doesn't have to buy any data from other government agencies or organizations. It's all open and publicly available. There are challenges around the reliability and quality of data that NOAA doesn't itself curate and maintain, which limit its utility, but that's a separate issue.

More importantly, NOAA explicitly funds the National Mesonet Program [1] to actively identify, acquire, consume, and ingest data from a wide variety of state and federal agencies across the country. The NMP itself partners with a major private sector company, Synoptic Data PBC [2] to perform the engineering necessary to acquire all this data. Synoptic actively maintains the infrastructure which consumes and publishes this data to MADIS for use by NOAA and any other stakeholder.

[1]: https://nationalmesonet.us/ [2]: https://synopticdata.com/


If you see this post, you should go share your story, and probably this effort, with the home automation and Home Assistant crowd.

As consumers and creators of plenty of weather data you might see a fair bit of traction there!


what’s good weather station hw to get? thinking ultrasonic wind instrument and some type of radar


It doesn't encompass radar (and no home weather station will), but Netatmo sells one with an ultrasonic wind gauge and a smart rain gauge, and it has a full RESTful API available: https://www.netatmo.com/smart-weather-station

I use this and love it.



Hi Mike- I am very interested in this and doing some work - please email


"The governments of the world provide these big weather companies (weather.com (cough IBM), Accuweather (cough IBM cough), etc) a metric shit ton of their data completely for free (by law), including data transfer. These are things like radar, satellite, ground station data, forecasts, composite models, etc. These companies profit substantially on it, as in billions of dollars"

Michael Lewis' book The Fifth Risk goes IN DEPTH into how Accuweather/Weather.com and the government were interacting (particularly during the Trump administration).

I highly recommend the book in general, and for this story in particular.


Do you have a source for sites like weather.com and AccuWeather being part of a billions of dollars industry? That is surprising to me.


It's hard to get accurate numbers as many weather sites are privately held. But to give some info:

The Weather Company was sold to IBM for $2B in 2015 (and recently sold to private equity for an undisclosed amount). Tomorrow.io reached a $1B+ valuation in its pre-IPO SPAC deal.

https://www.forbes.com/sites/johnkoetsier/2021/04/08/ibms-we...


That’s not really indicative of a large industry. If the Weather Company was bought for $2 billion, at standard valuations they were probably pulling in at most $100 million per year in net income.


User name checks out.



> AccuWeather's estimated annual revenue is currently $156.6M per year.


You can't safely fly or sail unless you know and can predict the weather at your destination.


That's not a full explanation for these companies being valuable though. Only flight planning makes use of commercial weather forecasts - the actual decisions (take off or delay, land or redirect to alternate?) are made by the pilots based on reports made by the airports themselves, usually based on their respective national forecaster's reports. Commercial pilots don't just bring up weather.com on their iPad on final approach, although their dispatcher might well have used a more sophisticated version of the same thing hours before the flight set off.

P.S. It's easy to find online versions of the reports that pilots get over the radio. Here's an example with explanation for Boston Logan International, USA: https://aviationweather.gov/data/metar/?id=KBOS&hours=0&deco...
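Those METARs are terse but highly regular, so simple regexes recover a lot. A minimal sketch for just the wind and temperature/dew point groups (the full format has many more groups and variants; check the ICAO/FAA METAR specification before trusting this in anger):

```python
import re

# Decode the wind group (dddff(f)KT, with optional gust Gff) and the
# temperature/dew point group (TT/DD, "M" prefix meaning minus).
WIND_RE = re.compile(r"\b(\d{3}|VRB)(\d{2,3})(?:G(\d{2,3}))?KT\b")
TEMP_RE = re.compile(r"\b(M?\d{2})/(M?\d{2})\b")

def parse_metar(text: str) -> dict:
    out = {}
    if w := WIND_RE.search(text):
        out["wind_dir"] = w.group(1)  # degrees true, or "VRB"
        out["wind_kt"] = int(w.group(2))
        out["gust_kt"] = int(w.group(3)) if w.group(3) else None
    if t := TEMP_RE.search(text):
        sign = lambda s: -int(s[1:]) if s.startswith("M") else int(s)
        out["temp_c"] = sign(t.group(1))
        out["dewpoint_c"] = sign(t.group(2))
    return out

obs = parse_metar("KBOS 121854Z 24012G20KT 10SM FEW250 27/14 A2992")
```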


Big Weather does provide the data for "free" to the weather widgets on millions of phones and computers.

So they do provide a public service, even if maybe they get too much money from it.


Why is this unfair? With these private companies I get access to all this data for free. If the government did it I would have to pay for it out of my taxes. Why do we need the government to provide more accurate forecasts when this is already taken care of by the private sector?


You get that data for "free" by handing over your browser data - and likely location data - to unknown third-party brokers that know more about you than you know about yourself. Weather apps in particular are notorious for selling your location data, since everyone does "show weather for my current location".

It's "free" in the worst sense of the word.


Do you know much about Windy, and whether they have any ties to selling user location data?

All the plots and features seemed nice so I paid for a year of subscription (after Dark Sky was discontinued ugh), but if they do that I’ll just stick to the default apple Weather app.


> With these private companies I get access to all this data for free.

These companies are already receiving government data for free. And then enriching it with private weather station data (and ads). So your taxes are already funding some part of it.

As the computing power used for forecasting increases (14.5 petaflops each for the Dogwood and Cactus NOAA supercomputers), having more granular data is going to be useful in improving their models.

https://www.noaa.gov/news-release/noaa-completes-upgrade-to-...


It is in the public interest to have both weather data and forecasts. Maybe you are like me and look out the window to see the weather for that day, but millions need accurate weather and forecasts. They include farmers, those that run power grids, air traffic controllers, and more.


This speaks to long-term outlooks at synoptic scale; we really should put some energy into researching mesoscale long-term outlooks, or even 4-hour short-term ones. It's a difficult problem to solve, as the variables are quite complex, but the reward could be substantial -- on-land severe weather affects fewer people but is often deadlier and can cause huge financial loss in areas that may not expect it.


Apple's response, in a nutshell:

- A weird callout to the nationality of the company right out of the gate

- Spotify will be nothing without us (insert crazy ex memes)

- "Our engineering helps ensure that Spotify’s apps can work seamlessly with Siri, CarPlay, Apple Watch, AirPlay, Widgets, and more." -- yeah, because those are your products, Apple. You make your money on those products being tightly integrated

- A bizarre quantification of the Apple ecosystem by the total number of "APIs" Apple gives them access to (250k)

- A claim of insider coercion between Spotify and the EU Commission that made it difficult for Apple to win

- They are going to appeal


My favorite bit is their key highlighted argument that their anti-competitive behavior has been ineffective, therefore it's not anti-competitive.


It's not even that they say it's been ineffective; they're saying that the EC can't prove that it has been effective.

And AFAIK there's also no reasoning given why it's fine to prevent apps from mentioning other payment options, even assuming it's not anti-competitive. (But to be fair, I haven't actually gone to the source to read their statement.)


Surely effectiveness is a meaningful indicator when you are in a dispute over whether something is anti-competitive.


No. Even if Apple gives its own music app large unfair advantages, it is still possible for that app to fail if it is bad enough compared to the competition.

In cases where Apple succeeded in killing the competition you wouldn't see a lawsuit like this since the competition wouldn't have the money to sue properly.


This isn’t a lawsuit and it’s not being brought on by competition so I’m not sure why any of that is useful to say.

I did not say it’s impossible to be anti-competitive without being effective. I just said it’s a factor when judging whether something is. So your first point is irrelevant as well.


You should get your nuts checked, that nutshell is a bit rotten.

My nutshell contains something else:

- Apple lamp-shading that EU companies seem to get a pass

- Apple refuting Spotify’s claims that Apple’s acts somehow have harmed and stifled Spotify, by highlighting how successful Spotify is

- Highlighting how Spotify has benefited from Apple’s work without paying a dime, refuting the implied notion that Spotify is on the hook for 30%, and refuting the notion, implied by Spotify consistently calling the commission a “tax”, that no benefit is provided in exchange for it

- Lamp-shading the fact that the scope of the EC’s ten-year investigation has changed more often than your average brothel’s bed occupant, because the EC couldn’t make the case on all the other stuff it tried, from consumer harms to harm to competitors. It eventually landed on an anti-steering provision, but only in the music streaming service market, and only after that anti-steering rule had already been abolished, hinting that Spotify has been the driving force behind this indefatigable mindset to find something that sticks

- They think it’s BS and are going to appeal: according to them because no harm was proven, but I think in part because the actual fine is only $4m; that’s how shitty the EC’s case was even after shifting the goalposts so often, which is why the EC added a bogus “deterrence” fine on top. That, and of course because this is merely a decision by an executive branch; in that regard it’s not unlike, say, Trump’s executive orders, in that it’s not worth much more than the paper it’s written on until a judge has adjudicated it.


The outrage out of Cupertino will be even funnier when they get bonked for their BS compliance with the DMA


I mean, the tone of their initial press release announcing "compliance" was borderline unprofessional (in global-company PR-speak terms) and at times sounded like a petulant teenager fuming about an "unfair" teacher.

I'm sure it was all calculated and discussed a million times, but if their goal was to appeal to emotions, I think it mostly failed (outside a very small niche of Apple groupies). Their leverage in the EU is weak, thanks to few jobs and tax avoidance (even if that's legal).

Once they notice, they will revert to lawyers and silent compliance.


>A weird callout to the nationality of the company right out of the gate

Not particularly weird; it's a rhetorical implication that the EU unfairly favors a European company.


It's verryyyy clear that Google rushed these models. They have a fiscal obligation to try to stay ahead of the curve and make as much money as possible on these investments. The CEO isn't honestly mad that the model is racist; he's mad that the model might lose the company contracts because it's racist. Why would another company risk its image, or lawsuits, by shipping a chatbot that says something illegal or denies rights to protected classes?

Racism is a bias (to say the least), a bias is a pattern, and ML models are trained to find patterns. The model itself is doing a really good job at that. The hard part, and the part that takes a long time and money, is training a model not just to ignore certain patterns, but to actively identify the ones it shouldn't learn from. Google's greed chose not to invest enough time and money in that, and it's biting them.

Bless the engineers/ICs and low-level PMs/managers doing their best, this isn't their fault in the slightest.


The dataset IS free to download, but running a query against it on Google Cloud is what costs $$$. BigQuery is basically renting servers to scan through the data, and that scanning is what you're billed for.


The complaint says there should be a warning that processing fees can be high. Go to the front page and check out the links: nothing really about cost. Someone follows that path and $14k is gone without a word about it. And that's the path people are sent down from the website, which explicitly talks about using BQ for analysis.

A simple "running queries over the whole dataset can incur significant costs due to its size" should be enough. I think that's a valid and fair point.

The whole part of accusing Google should just be ignored.


The setup instructions mention what you’re asking.

https://github.com/HTTPArchive/httparchive.org/blob/main/doc...


I can't even find "cost" on that page; only a rather tiny side note that you could exceed the free-tier quota.

I don't think that's a proper warning on costs.


> The whole part of accusing Google should just be ignored.

I don't know. Google could trivially solve this problem by imposing an opt-out warning on potentially expensive queries.

"It looks like your query might cost $14k. Are you sure?"

But money.


It probably wasn't a single query costing $14k, but more like 1k queries costing $14 each.


Given how small the dataset is, there is no query that justifies a $14k charge.

AWS charges $27/hour for a server with 3TB of memory. Enough to run the queries in memory.


BQ charges you based on the volume of data being scanned. I think this is a situation which involves scanning the whole dataset again and again without fully understanding how it works. I’ve worked with much larger datasets on BQ (petabyte scale) and managed to not spend more than $1000 in an hour. Also, BQ tells you how much data will be processed BEFORE you run the query, which makes it easier to understand the cost implications.

Again, you could fit the whole dataset in memory in an EC2 instance and do your thing.


It's easy to make an enormous query by joining to other data (or to the same data), or reading a lot of data.

A regex query on response_bodies would churn through 2.5TB of data every time it's run.
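
To put rough numbers on that: BigQuery on-demand pricing bills per byte scanned, so re-running a full scan adds up fast. A back-of-envelope sketch, assuming the historical $5/TB on-demand rate (Google's current rate is higher, so treat this as illustrative only):

```python
# Rough BigQuery on-demand cost model: you pay per byte scanned,
# regardless of how many rows the query actually returns.
# Assumed rate: the old $5/TB on-demand price; check current pricing.
TB = 1024 ** 4

def query_cost_usd(bytes_scanned: float, price_per_tb: float = 5.0) -> float:
    return bytes_scanned / TB * price_per_tb

per_run = query_cost_usd(2.5 * TB)  # one full scan of a 2.5 TB column
print(per_run)          # 12.5 dollars per run
print(per_run * 1120)   # 14000.0 -- about 1,100 repeated runs reaches $14k
```

At that rate you don't need a single monster query to hit $14k; a script or dashboard re-running the 2.5 TB scan on the order of a thousand times gets there quietly.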


That... is not how lenses work, right? They're showing a totally flat glass surface that close to your eyes? I can't imagine your eyes being able to focus on the entire width of your FoV of text and images at that size... without it at least giving you a headache.

