
A Raptor engine has about 1/3 to 1/4 of the thrust of an F-1 engine. However, the Raptor is far more efficient and much more impressive technically. It is really a marvel of modern engineering and science.



Raptor is also a lot smaller. F-1s were intentionally designed to be large and high thrust because it'd have been simpler to build, test and control a handful of F-1s at the time, compared to, say, 30 smaller but more efficient engines.

The useful metric on this front would be thrust density, where Raptor 1 is a bit under twice that of the F-1A, Raptor 2 is a little over twice that of the F-1A, and Raptor 3 should be ~2.5x that of the F-1A.
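Those ratios can be roughly sanity-checked with thrust divided by engine footprint. The thrust and diameter figures below are assumptions pulled from commonly quoted numbers, which vary by source:

```python
import math

def thrust_density(thrust_mn, diameter_m):
    """Thrust per unit of engine footprint area, in MN/m^2."""
    area = math.pi * (diameter_m / 2) ** 2
    return thrust_mn / area

# Rough, commonly quoted figures (assumptions; sources differ):
f1a = thrust_density(8.0, 3.7)        # F-1A: ~8.0 MN, ~3.7 m nozzle
raptor1 = thrust_density(1.85, 1.3)   # Raptor 1: ~1.85 MN, ~1.3 m
raptor2 = thrust_density(2.3, 1.3)    # Raptor 2: ~2.3 MN
raptor3 = thrust_density(2.75, 1.3)   # Raptor 3: ~2.75 MN (spec target)

print(raptor1 / f1a, raptor2 / f1a, raptor3 / f1a)
```

With these inputs the ratios come out around 1.9x, 2.3x, and 2.7x, in the same ballpark as the figures above; the exact values depend on which thrust numbers you trust.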


But with more engines they can tolerate a failure or two (or even more) per launch. If a larger engine fails and you only have 3 of them, you're going to have a bad day.

And, they do seem to test the heck out of their engines, even with 30 of them on a ship.


Yes, those benefits can be realized now with modern controls. Back when the Saturn V was designed, the control systems necessary to manage 30 engines didn't really exist. Digital control was in its infancy and was only really implemented, with a backup, on the whole Apollo stack.

Trying to manage that many engines, while technically possible with controls of the era (check out the N1), means your control system would be introducing reliability issues instead of adding fault tolerance through redundancy.


Didn't the Soviets give it a try? I'd swear they had a many-engine design way back when, also for fault tolerance. OK, they didn't get it working, but I'm pretty sure it wasn't due to lack of digital control... Surely they wouldn't have even attempted it if it was impossible :)

[edit] Ah, that was the N1 you referred to. OK. So you're saying it was possible, but it introduced more failure points... So is that why it failed?


The N1 had a bunch of problems. The engines could not be fired more than once, so they were produced in batches, and one engine from each batch was test-fired on the assumption that all engines in the batch were the same as that one. That obviously isn't how things work, so engines could just be defective from the start.

The second problem was, as mentioned, that the control systems of the time were not that great, so they had trouble properly compensating for engine failures, letting failures cascade until too many engines were lost to reach orbit.


> then test firing one from the batch and assuming all engines in the batch were the same as that one. This obviously isn't how things work

Curious what approach you'd propose in their place?

> The second was, as mentioned, that the control systems of the time were not that great

True. The control system was also cutting edge and evolved together with the engines; it was much better by the 5th flight - which was scrapped - than it was at the 1st one.


>Curious what approach you'd propose in their place?

The approach used nowadays: make engines that can be fired (at least on the ground) multiple times. As far as I'm aware, all current-generation American rockets can be static fired on the ground to verify that they work.

Edit: Although, come to think of it, that's not necessarily true of vacuum engines. But even then, they can test the turbopumps and have enough sensors to find potential issues before launching (at least once enough experience has been built up with the engine).


Right, but at the time, to save on mass, they used tricks like working with a negative safety margin: engines were rated to last a particular number of seconds and operated slightly beyond elastic deformation... They did move toward multi-start engines for first stages eventually, but not during the 1960s. The original use of rockets was military, and those customers had a hard time understanding why such a thing should work multiple times, I guess.

Vacuum engines can actually be tested on Earth, using special facilities that lower the external pressure while the engine is running (for example, if you fire the engine into a tube, the hot gases push all the air out of the tube, creating a pressure drop).


You also need strict QA and minimal deviation both from specs and between engines.

That was another issue the Soviets had a hard time dealing with.


I have what is probably a dumb question: How can a Raptor turbopump need almost double the HP of an F-1's while putting out 1/3 to 1/4 of the thrust? (Assuming Elon's 100k HP number was correct, and/or hasn't changed.) That just doesn't settle out for me. If it's got double the power, it should be moving double the fuel, so double the thrust, no?


The formula for Isp - the important measure of a rocket engine's efficiency - says that the speed at which the engine expels hot gases grows with the pressure ratio between the chamber (before the nozzle) and the nozzle exit.
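That dependence can be sketched with the ideal-nozzle exhaust velocity equation. The combustion temperature, gas properties, and exit pressure below are illustrative assumptions, not actual Raptor or F-1 data:

```python
import math

def exhaust_velocity(p_chamber, p_exit, t_chamber=3500.0, gamma=1.2, molar_mass=0.022):
    """Ideal exhaust velocity [m/s] for a perfectly expanded nozzle.
    t_chamber in K, molar_mass = mean molar mass of exhaust in kg/mol."""
    R = 8.314  # universal gas constant, J/(mol*K)
    term = 1 - (p_exit / p_chamber) ** ((gamma - 1) / gamma)
    return math.sqrt(2 * gamma / (gamma - 1) * R * t_chamber / molar_mass * term)

# Same exit pressure, higher chamber pressure -> faster exhaust -> higher Isp
low = exhaust_velocity(p_chamber=7e6, p_exit=0.1e6)    # F-1-like pressure ratio
high = exhaust_velocity(p_chamber=35e6, p_exit=0.1e6)  # Raptor-like pressure ratio
print(low, high)
```

Everything else held equal, raising the chamber pressure by 5x here buys roughly a 10% faster exhaust; real engines also change fuel, temperature, and nozzle expansion, so the actual Isp gap is larger.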

The whole idea of full-flow combustion, by the way, is to extract some more energy from the propellants - before sending them to the chamber, and at a temperature the turbopump's turbine can tolerate - so that energy can drive the pumps and more pressure can be created in the chamber. More energy than "more conservative" closed-cycle engines extract.

Pump power equals the volume flow (how many cubic meters, or, say, liters of propellant the pump transfers per second) multiplied by the pressure rise (the pressure at the exit of the pump). So it's not the flow - it's the pressure where Raptor has a big advantage over the F-1, and that pressure allows a better Isp.
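Plugging rough numbers into that formula shows how a smaller engine can need more pump power. All flows, densities, and pressure rises below are back-of-the-envelope assumptions, not official figures:

```python
# Ideal pump power = volumetric flow * pressure rise. Real pumps need more,
# since efficiency is below 100%; all figures here are rough assumptions.

def pump_power_mw(mass_flow_kg_s, density_kg_m3, delta_p_pa):
    volume_flow = mass_flow_kg_s / density_kg_m3  # m^3/s
    return volume_flow * delta_p_pa / 1e6         # MW

# F-1: huge flow, modest pressure (chamber ~7 MPa, pump exit ~11 MPa assumed)
f1 = (pump_power_mw(1790, 1140, 11e6)   # LOX side
      + pump_power_mw(790, 810, 11e6))  # RP-1 side

# Raptor: smaller flow, very high pressure (chamber ~35 MPa, pump exit far above)
raptor = (pump_power_mw(510, 1140, 60e6)    # LOX side
          + pump_power_mw(140, 420, 80e6))  # CH4 side (cooling circuit adds head)

print(f1, raptor)  # Raptor's pumps need more power despite the smaller flow
```

With these guesses the F-1 lands near 30 MW and Raptor well above 50 MW, i.e. the 5-8x higher pressure rise more than makes up for the roughly 3x smaller flow.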

And of course a better Isp allows you to reach a bigger characteristic velocity (or just a higher velocity in free space, where gravity and atmosphere don't get in the way) using the same amount of fuel.
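The Tsiolkovsky rocket equation makes that concrete. The mass ratio below is an arbitrary example; the Isp values are the vacuum figures quoted elsewhere in this thread:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s, mass_ratio):
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / mf)."""
    return isp_s * G0 * math.log(mass_ratio)

mass_ratio = 10  # arbitrary example: wet mass is 10x dry mass
dv_f1 = delta_v(304, mass_ratio)      # F-1 vacuum Isp
dv_raptor = delta_v(380, mass_ratio)  # Raptor vacuum Isp
print(dv_raptor / dv_f1)  # ~1.25: same propellant fraction, ~25% more delta-v
```

Since delta-v scales linearly with Isp, the 380/304 ratio carries over directly regardless of the mass ratio chosen.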

The logic goes roughly like this. Every rocket engine designer wants to reach a bigger Isp. For a given fuel, that means reaching a bigger pressure in the chamber, so we move from pressure-fed engines (like the first French orbital rocket, Diamant, which had a pressure-fed first stage) to pumps, because high-pressure tanks are too heavy. Pumps start out as open cycle, or gas generator cycle, but that throws away a lot of gas after the pumps' turbine, so the next improvement is a closed cycle. With a closed cycle we can choose to run all the fuel or all the oxidizer through the turbine, but as soon as one component is used fully, we can't get more energy for the pumps. Eventually we arrive at the more complex full-flow cycle, which uses both components and reaches the highest chamber pressure.

The next step would probably be a detonation engine :) which uses a somewhat more efficient process to convert chemical energy into speed, but it's not yet developed enough. We could also talk about more heat-resistant turbines, which would allow extracting more energy from the fuel and increasing the pressure some more... but there, too, we have a lot of R&D ahead of us.


Raptor is much smaller: the F-1 has a diameter of ~3.7 m, Raptor ~1.3 m, so Raptor is putting out 1/4 to 1/3 of the thrust from about 1/8 of the area.
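A quick check of that area claim, using the diameters quoted above:

```python
import math

f1_area = math.pi * (3.7 / 2) ** 2      # ~10.8 m^2
raptor_area = math.pi * (1.3 / 2) ** 2  # ~1.33 m^2
ratio = f1_area / raptor_area
print(ratio)  # ~8.1, so Raptor's footprint is roughly 1/8 of the F-1's
```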


Raptor has a much higher chamber pressure (35 MPa vs the F-1's 7 MPa) and hence a higher Isp (380 s vs 304 s in vacuum).


35 MPa is ~5000 psi for anyone wondering lol.


And I went to the effort of converting it to metric… Seriously, NASA uses the metric system and so should you.


Fuel might play a role: the Raptor burns methane, the F-1 burned refined petroleum. Another possible reason is that the designs make different efficiency-vs-power trade-offs.


Maybe it's something to do with pressure: a higher chamber pressure, even with a lower flow rate, could require more pump power.


Perhaps it has to do with the relative densities of RP-1 versus methane?


> Raptor engine has about 1/3 to 1/4 of the thrust of an F-1 engine

Correct for Raptor 2. Raptor 3 is closer to half [1][2].

[1] https://en.m.wikipedia.org/wiki/SpaceX_Raptor

[2] https://everydayastronaut.com/raptor-engine/



