SpaceX is imminently switching to methane fuel and to an engine design that should offer much more complete combustion of the fuel. I was hoping to see at least a comment about this in the paper.
I don't have a reference handy, but one can ballpark this by looking at (1) the concentration of water in the stratosphere, (2) the lifetime of a gas parcel there, and (3) the mass of the atmosphere there. This gives the natural rate of injection of water, and I think it would be a good idea to keep manmade injections below that level.
Some numbers I've pulled from various places (not guaranteeing these are accurate): Earth's atmosphere is 5e18 kg; 99% of the atmosphere's mass is below 30 km (so the mass above that is 5e16 kg); the lifetime of gas in the upper stratosphere is ~8 years; and the concentration of water vapor there is about 7 ppmv. Given this, the natural injection of water into the upper part of the stratosphere is maybe 2e10 kg/year. A Falcon 9 produces about 2e5 kg of water vapor per launch (much of it at lower altitude, granted). So it looks like at most 1e6 F9 launches/year, and perhaps considerably fewer, would exceed the limit I set.
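If anyone wants to poke at the arithmetic, here it is as a few lines of Python. The inputs are the rough figures above, plus one step I've added (converting ppmv to a mass fraction via the molecular weights), so treat it as a sketch rather than a vetted calculation:

    # Back-of-envelope: natural water-vapour flux into the upper stratosphere
    # vs. Falcon 9 exhaust.  Inputs are the rough figures quoted above.
    M_ATMOSPHERE = 5e18          # kg, total mass of Earth's atmosphere
    FRACTION_ABOVE_30KM = 0.01   # ~99% of the mass sits below 30 km
    H2O_PPMV = 7e-6              # water vapour mixing ratio up there, by volume
    MW_H2O, MW_AIR = 18.0, 29.0  # g/mol, to convert ppmv into a mass fraction
    RESIDENCE_YEARS = 8          # rough lifetime of a gas parcel up there
    F9_H2O_PER_LAUNCH = 2e5      # kg of water vapour per Falcon 9 launch (all altitudes)

    air_above = M_ATMOSPHERE * FRACTION_ABOVE_30KM       # ~5e16 kg
    h2o_above = air_above * H2O_PPMV * MW_H2O / MW_AIR   # ~2e11 kg standing stock
    natural_flux = h2o_above / RESIDENCE_YEARS           # ~2.7e10 kg/year

    # If every kilogram of F9 exhaust water reached that altitude (it doesn't),
    # this is the launch rate that would match the natural flux:
    launches_to_match = natural_flux / F9_H2O_PER_LAUNCH  # ~1e5 launches/year

    print(f"natural flux       ~ {natural_flux:.1e} kg/yr")
    print(f"launches to match  ~ {launches_to_match:.1e} per year")

Since most of a Falcon 9's water is released well below the upper stratosphere, the real ceiling is higher than that ~1e5 figure, which is roughly how you get to "at most 1e6 launches/year".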
The concerns with water vapor in the stratosphere would be ozone destruction and a contribution to global warming (I understand 15% of the GWP of methane comes from the water vapor it produces when oxidized in the stratosphere).
Thanks, that makes sense! Not as limiting as I'd feared.
Starship uses 1,200 tonnes of propellant and SuperHeavy about 3,300 tonnes. 45% of that will become water vapor, the rest CO2. I couldn't find what altitude SuperHeavy reaches, so let's say 1e6 kg of water vapor reaches the upper atmosphere per launch; that gives 20,000 Starship launches per year, lifting between two and three million tonnes. Probably a good bit less of the water vapor actually gets that high, but on the other hand it's fairly generous to allow a doubling of the natural water vapor input.
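Same sketch in Python for Starship, against the ~2e10 kg/yr natural flux from the earlier ballpark. The 45% follows from burning methalox near stoichiometric (CH4 + 2 O2 -> CO2 + 2 H2O, so water is 36/80 of the product mass); the 1e6 kg per launch reaching altitude and the 100-150 tonne payload range are assumptions on my part:

    # Rough Starship/SuperHeavy water-vapour budget, using the figures above.
    PROPELLANT_TONNES = 1200 + 3300   # Starship + SuperHeavy propellant load
    H2O_MASS_FRACTION = 0.45          # CH4 + 2 O2 -> CO2 + 2 H2O: water is 36/80 by mass
    H2O_TO_UPPER_ATM = 1e6            # kg per launch assumed to reach the upper stratosphere
    NATURAL_FLUX = 2e10               # kg/yr, from the earlier ballpark
    PAYLOAD_TONNES = (100, 150)       # assumed payload range to LEO

    total_h2o = PROPELLANT_TONNES * 1000 * H2O_MASS_FRACTION  # ~2e6 kg per launch, all altitudes
    launches_per_year = NATURAL_FLUX / H2O_TO_UPPER_ATM       # ~20,000

    print(f"water per launch (all altitudes): {total_h2o:.1e} kg")
    print(f"launches/yr matching the natural flux: {launches_per_year:.0f}")
    print(f"mass to orbit at that rate: "
          f"{launches_per_year * PAYLOAD_TONNES[0] / 1e6:.1f} to "
          f"{launches_per_year * PAYLOAD_TONNES[1] / 1e6:.1f} million tonnes")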
More complete combustion of the fuel they intend to burn. But since their mixture is very fuel rich, they emit a lot of methane. No amount of engine design can eliminate that methane from their exhaust.
I don't think there's much methane in the exhaust at all. Carbon monoxide, yes; hydrogen, yes; but the methane should disappear at the temperature of the gas in the thrust chamber, even if it's not fully oxidized.
LOX/hydrocarbon engines typically operate a bit fuel-rich. This is because at the stoichiometric ratio the flame is so hot that you lose significant energy to dissociation. Diluting the gas slightly with extra fuel adds hydrogen and reduces the average molecular weight, which can win back enough of that energy that you come out ahead (higher Isp).
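A toy way to see why this wins: for a fixed nozzle, ideal exhaust velocity scales roughly as sqrt(T_chamber / M_average), so giving up some flame temperature to cut the average molecular weight can still come out ahead. The (T, M) pairs below are made-up placeholders to show the direction of the effect, not output from a real combustion code:

    from math import sqrt

    # Exhaust velocity scales roughly as sqrt(T_chamber / M_bar).
    # Placeholder (temperature K, mean molecular weight g/mol) pairs:
    cases = {
        "stoichiometric":     (3800.0, 24.0),  # hotter, but heavier average molecule
        "slightly fuel-rich": (3600.0, 21.0),  # cooler, but extra H2/CO lowers M_bar
    }

    base = sqrt(cases["stoichiometric"][0] / cases["stoichiometric"][1])
    for name, (T, M) in cases.items():
        merit = sqrt(T / M)                    # proportional to ideal exhaust velocity
        print(f"{name:>20}: sqrt(T/M) = {merit:5.2f} "
              f"({100 * merit / base - 100:+.1f}% vs stoichiometric)")

With these placeholder numbers the fuel-rich case comes out a few percent ahead despite the cooler flame, which is the point of running rich.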
Interestingly, hydrogen peroxide/hydrocarbon engines optimize at stoichiometric, since the extra water from the peroxide dilutes and cools the gas enough that this effect is no longer a win. For this reason, the jets from such engines have little unburned fuel and are almost invisible.
Yes. It's because of all the dilution the peroxide already causes. The effect is illustrated by that weird British launcher, the Black Arrow, which used peroxide as the oxidizer. The picture of it taking off looks like something Photoshopped, with the vehicle just sort of levitating above the launch pad.
You are looking for maximum impulse, not maximum energy. If you run fuel-rich, the extra fuel converts heat into impulse; it acts as reaction mass, so to speak.
Water injection in jet engines works on the same principle: it converts heat into a volume of vapour.
OK, but couldn't you do the exact same thing by running lean? Then the extra oxygen would convert heat into impulse. Or, as you said, by steam injection? (But then you'd need an extra tank for the steam, so that's probably less than optimal...)
(I'm not saying they're wrong. I'm asking for an explanation.)
Yes, burning away the interior surface of your combustion chamber runs into a resource limit pretty quickly, shortly followed by rapid unscheduled disassembly. The exhaust colors leading up to that are often pretty, and finally dramatic.
There have been rockets that ran lean during the first few seconds, deliberately eroding away a layer of the combustion chamber to get higher-mass particles into the exhaust while trying to clear the launch tower. The higher-mass exhaust particles give you less efficiency but more thrust, which is what matters most just then.
You ask a good question! When you dig into it, the goal of almost all modern rocket engine optimization is to create a mixture rich in hot hydrogen. For thermodynamic reasons, lighter molecules move faster at a given temperature; hydrogen is spectacularly lighter than any other molecule; and the most important quality of a rocket engine is its exhaust velocity.
Among the most important qualities of a rocket engine is its exhaust velocity, which determines its efficiency: the impulse you get per unit mass of fuel/oxidizer burned, and hence the delta-V for a given mass ratio.
Not exploding is probably the actual most important quality. You don't really start tuning for efficiency until you have got that one nailed.
Sometimes absolute thrust is more important than efficiency. Perhaps surprisingly, efficiency can be traded off for thrust by getting heavier molecules into the exhaust. They come out slower than the naked protons, but not as many times slower as the ratio of their masses; the speed only drops with the square root of the mass.
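To put a number on "not as many times slower": at a given temperature the mean thermal speed goes as 1/sqrt(molecular mass), so a molecule 18 times heavier than a proton is only about 4 times slower. A quick illustration (relative speeds only, species chosen arbitrarily):

    from math import sqrt

    # At a fixed temperature, mean thermal speed scales as 1/sqrt(molecular mass).
    # Relative to a bare proton (mass 1 u):
    species = {"H (proton)": 1, "H2": 2, "H2O": 18, "CO2": 44}

    for name, mass in species.items():
        mass_ratio = mass / species["H (proton)"]
        slowdown = sqrt(mass_ratio)   # slower only by the square root of the mass ratio
        print(f"{name:>10}: {mass_ratio:4.0f}x the mass, only {slowdown:.1f}x slower")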
From what I understand, a major concern is thermal generation of nitrogen oxides (NOx), because the heat of the exhaust gases can split atmospheric nitrogen.
What narrative? That rockets can never be bad? I think rockets' contribution to greenhouse gases is insignificant, as a ratio of emissions to value to society, compared to airplane flights and commercial travel.
That is fine, as far as it goes. But it doesn't go very far.
Injecting huge amounts of H2, CO, and H2O into the upper stratosphere and ozone layer will have important consequences, even if SpaceX never reaches the launch cadence it wants. (Which it won't.)