“True” Damascus steel is not a “lost art” (reddit.com)
196 points by memorable 2 days ago | 99 comments

This article doesn't address the 1998 discovery that the "lost" art was the presence of vanadium in the ores that real Damascus blades were made from. The vanadium gave the steel its strength from roughly the 13th to 17th centuries. It was present in iron ore mined in a particular region, and those mines eventually ran out, after which the "recipes" for Damascus steel weren't lost; they simply stopped working and no one knew why. They had no way to tell the difference between iron ore from one place and ore from another.

This is explained in the 1998 Verhoeven, Pendray, and Dauksch (VPD) article [1] that the reddit post cites without touching on this crucial aspect. Or anyway, I thought that VPD's explanation was now generally accepted. This isn't anywhere near my field though.

[1] Verhoeven, J. D., Pendray, A. H., and Dauksch, W. E. "The Key Role of Impurities in Ancient Damascus Steel Blades." JOM 50 (1998): 58-64.

I think one has to separate the steel alloy, which you covered, and the forging techniques used. The former ran out, at which point steel quality decreased. Let's call that steel from Damascus.

And there is what is now commonly called Damast steel, basically folded and twisted steel of different properties. That technique is as old as blacksmithing because it was, more or less, the only way to get rid of impurities. The Japanese developed this technique to the extreme by not just getting rid of impurities but by giving specific steel properties to defined parts of a blade.

European sword smiths stopped doing that. First, European steel became better and more homogeneous and didn't require anywhere near as much impurity removal. Secondly, spring steel is pretty sturdy monosteel, so there's no need to get specific steel in specific parts of a blade. And lastly, European bladed weapons from the Napoleonic era onwards were industrially mass produced (as compared to the manufacture-based mass production of earlier periods). And mass production means cost reduction; that didn't start with MBAs. Since those weapons were good enough to kill people by stabbing, cutting and slashing them, Damast steel wasn't needed. The obvious exceptions are expensive masterpieces forged individually, and the better quality "industrial" ones.

Damast steel is having a renaissance at the moment, mainly because it looks great. And it is a way for knife makers to upsell; otherwise not a single one of them could compete on price.

The Japanese way of blade making has the benefit of putting the right alloy properties at, e.g., the edge (hard and sharp) and the core and back of a blade (softer and tougher). Damast steel's properties are all over the place, meaning it has no real value, IMHO, regarding a blade's capabilities when it comes to cutting. Those blades, though, like the Japanese ones, fall into the realm of art, and there the optical properties matter a lot. And there Damast is just beautiful.

From experience, though, Damast steel bars are hard to get right. I'll go out on a limb and say Japanese bar steel is easier (if you forgo the traditional Tamahagane way, that is). Finishing the blade is easier with Damast, though; a proper Japanese blade requires so much delicate heat treatment and polishing to get right, it is borderline ridiculous.

> so there's no need to get specific steel in specific parts of a blade.

This is not quite true. Even today, there are many secrets surrounding metalworking. The highest quality "blades" (such as plow blades/moldboards) may have steel with quite different qualities in different areas of the blade.

The blade may start out as monosteel, but differential exposure to oxygen, carbon or different temperatures in different parts of the blade is a common way to create blades that have different microstructures (crystal structures, chemical bonds, microshapes) as well as a carbon content that varies within the blade. This helps them retain an edge better and resist scratching, while maximizing durability and minimizing weight and thickness.

True, but those blades start life as monosteel or, as highly expensive, specialized and top-notch tech, sintered semi-finished pieces. The latter is much closer to being a modern version of what the Japanese did than the former. The former can also be done by induction hardening. I researched that a bit a while ago, and it is basically highly controllable hardening, e.g. for gears and surface hardening.

Metallurgy is quite fascinating.

> Metallurgy is quite fascinating.

It is, but with all of the above said upthread, I feel I'd probably get a high quality blade if I just took my mower's 2-foot mulching blade and used it as a sword.

I wonder how many current masterpieces from antiquity were created by having 20 apprentices pump out blades, and just picking the best out of 1000 blades for a respected client.

> I wonder how many current masterpieces from antiquity were created by having 20 apprentices pump out blades, and just picking the best out of 1000 blades for a respected client.

Probably 0. Swords were not that random. High quality swords would either be made from wootz/crucible steel, which was an expensive resource (but relatively easy to forge into a good blade), or by using various folding methods, which were immensely labour intensive.

A smith would not have apprentices produce 1000s of such blades. That would be too expensive. Low quality swords would be produced using less labour intensive methods, and sold in larger numbers. These production methods would ensure that they would not reach the highest level of quality.

Top-quality swords would be made very carefully by or under supervision from a master, in much smaller numbers. Some of these may not have had obvious flaws, but not 99.9%.

Steel, regardless of quality, was an expensive commodity up until the early 20th century. Up to then, iron to steel ratios were roughly around 9:1, maybe with an even higher iron share in pre-modern periods. Then it flipped, and today you have a hard time getting simple iron at all.

That's one of the reasons steel weapons were reforged and faulty blades reworked. And handles, as well as the inner parts of the blades, used to be iron, sometimes also in Japan. Simply because it is cheaper, and the sharp transitions between handle and blade are less brittle and thus less critical during heat treatment. And heat treatment of non-standardized steel is a black art indeed.

That being said, European blade making was a shared enterprise since the earliest days. One general contractor signed with a customer, then he sub-contracted a smith for the blade, one for the cross guard and handle, one for polishing, one for scabbards and, if it was not the same guy, one for assembly. That's where the English word cutlery comes from. Otherwise a single smith would produce maybe a dozen blades per year, maybe even less. Not even close to what was needed.

This list of medieval prices lists two swords: https://regia.org/research/misc/costs.htm

One costs 1625 shillings, or less than the cost of a female slave.

The other costs 24000, more than the price of 9 female slaves.

Clearly, for the second sword, the steel was not the main part of the cost.
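Taken together, the two quoted figures actually pin down the implied slave price to a fairly narrow band. A quick sanity check, using only the shilling numbers from the list above (and assuming both comparisons refer to the same slave price):

```python
# Prices in shillings, as quoted from the regia.org list above.
cheap_sword = 1625       # "less than the cost of a female slave"
expensive_sword = 24000  # "more than the price of 9 female slaves"

# The two comparisons bound the implied slave price:
#   cheap_sword < slave_price  and  9 * slave_price < expensive_sword
lower_bound = cheap_sword             # slave costs more than this
upper_bound = expensive_sword / 9     # ...and less than this (~2667)

print(f"implied slave price: {lower_bound}-{upper_bound:.0f} shillings")
print(f"sword price ratio: {expensive_sword / cheap_sword:.1f}x")
```

So the expensive sword cost roughly 15 cheap swords, which supports the point: at that price, the raw steel can't have been the main cost driver.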

> https://regia.org/research/misc/costs.htm

As an aside, that website is a great example of what the web used to be. Absurdly informative, fun to browse, and obviously a labor of love.

Same as today. I once investigated how to buy a new, but traditionally forged, Katana. Turned out that a) antiques are cheaper and b) the mounting (simple wood vs. tsuba and all) can double the price, easily. Like modern day cars, the options drive the price even if the engine / blade is the same...

Compared to the price of female slaves, blades have become cheaper, though. I could only afford one female, and as it turns out, she expects me to be HER slave :)

> Then it flipped, and today you have a hard time to get simple iron at all.

Very true. Simple elemental iron is extremely hard to find; possibly because nobody wants it. (Whenever you see phrases like "wrought iron" or "cast iron" or "ornamental iron" today that's all steel, not true iron.)

The best source I've found for true iron is wagon tires. These are the metal bands that were wrapped around wooden wagon wheels in the 1800s, and many of them are pure iron. Sometimes you can find them at antique stores or in a farmer's field. They survive long after the original wagon has rotted. They're always very rusty but that's no big deal.

Why would you want simple iron? Because it's very soft and much easier to work than steel. And if you cut into it slightly and bend it at the cut, you'll see the marvelously beautiful huge crystal structure that only true iron possesses. It's wonderful stuff; just not very durable compared to steel.

You can buy pure electrolytic iron powder from powder-metallurgy vendors. It's of course not 100% pure but it's a lot purer than 19th-century iron tires.

Iron is hard to purify. This is partly because it's so enthusiastic about reacting with other elements. It's like the Oprah Winfrey of elements, or the gender-swapped version of oxygen: "You get an electron, and you get an electron, and you get an electron! Everybody! gets! an! electron!" Worse, in nature it's always hanging out with its buddies manganese and nickel, and they're hard to separate. It's not as bad as zirconium and hafnium, but almost.

It wasn't until the 01930s that reasonably pure iron was produced, and it was found that it was much softer than anyone had previously suspected.

I should clarify that my desire for iron is in the form of bars that I can work in my blacksmith shop, and iron in that form is hard to find in my experience.

I'm sure chemical-grade iron is available but melting iron powder into ingots in an oxygen-free environment is beyond my capabilities ATM.

I wonder if you could decarburize molten mild steel into wagon-wheel-tire-purity iron by blasting air or oxygen through it for a longer period of time than a steel mill would find profitable. This involves temperatures well above normal blacksmithing temperatures but it doesn't require a dangerous oxygen-free environment.

Interesting idea. So you're saying that the oxygen will bind more readily to the carbon than to the iron itself, and carry it off as COx? I'm not a chemist but I know iron loves to bind with oxygen (i.e. burn) at high temperatures; that's why oxyacetylene cutting torches work so well.

Yes, as I understand it, that's how steel mills make steel from pig iron: by blowing air through it to remove carbon (as well as things like sulfur and even phosphorus). It heats up the molten steel, too, so once you get it going you don't need to keep applying heat from the outside, so your furnace fire doesn't need to be able to reach the melting temperature of the final pure iron (1538 °C) but only the melting point of what you start with, say, cast iron (as low as 1147 °C but usually closer to 1400 °C). Have you done cast-iron casting?
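To make the temperature argument concrete, here is a rough sketch of how carbon content lowers the melting point, anchored only on the two values above (pure iron at 1538 °C, and the cast-iron eutectic at 1147 °C at about 4.3% carbon). The linear interpolation is my own simplification; the real Fe-C liquidus is curved, so treat intermediate values as ballpark only.

```python
def liquidus_estimate(carbon_pct: float) -> float:
    """Crude linear estimate of the Fe-C liquidus temperature in deg C.

    Anchored on two points: pure iron melts at 1538 C (0% carbon) and the
    eutectic cast-iron composition (~4.3% C) melts at 1147 C. The real
    phase diagram is curved, so this is only a rough sketch.
    """
    FE_MELT = 1538.0     # pure iron, 0% carbon
    EUTECTIC_T = 1147.0  # eutectic temperature
    EUTECTIC_C = 4.3     # eutectic carbon content, percent

    c = min(max(carbon_pct, 0.0), EUTECTIC_C)  # clamp to the modeled range
    return FE_MELT + (EUTECTIC_T - FE_MELT) * c / EUTECTIC_C

print(liquidus_estimate(0.0))  # pure iron: 1538.0
print(liquidus_estimate(4.3))  # eutectic cast iron: 1147.0
print(liquidus_estimate(3.0))  # typical cast iron: somewhere in between
```

The point of the sketch is just the direction of the effect: the more carbon in the melt, the lower the temperature your furnace needs to reach, which is why starting from cast iron is so much easier than melting pure iron.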

I understand that when you're blacksmithing, normally having your iron get hot enough to become a self-sustaining fire is a risk to be avoided rather than a goal, but I feel like this might be less alarming if it's in a ladle covered up with a layer of flux and slag, so that it stops burning whenever you stop blowing bubbles into it through a straw.

There's obviously the difficulty of what to make your straw out of. Apparently the standard answer in modern basic oxygen steelmaking is, "a water-cooled, copper tipped lance with 3–7 nozzles is lowered into it to within a few feet of the surface of the bath and high-purity oxygen at a pressure of 700–1,000 kilopascals (100–150 psi) is introduced at supersonic speed," but I imagine graphite or aluminum phosphate would work too and might be a bit less exciting.

Apparently the standard way of doing this from Han China until the invention of the Bessemer converter is a thing called "puddling" which sounds a bit less violent but also slower. Wikipedia explains, "Working as a two-man crew, a puddler and helper could produce about 1500 kg of iron in a 12-hour shift. The strenuous labour, heat and fumes caused puddlers to have a very short life expectancy, with most dying in their 30s."

So, maybe not something you want to do every day, but it seems like something you could probably do a few times in your lifetime—I'm guessing 1500 kg of iron would last you a good long time of hobby blacksmithing, and we have a lot more respiratory PPE now than we did in the 01800s.

I've always been curious about casting but never tried it. One of these days.

In some countries there are foundries and schools that do weekend classes where they walk you through the steps and already have all the necessary equipment and safety knowledge. You might see if you can sign up for one. Anything involving molten iron has a tendency to be, on the human scale, incredibly violent; as a consequence, it requires great delicacy and care.

Or you could start with pot metal or aluminum.

I'd pick an old leaf spring over the mower blade, but you are right. Doing blades myself right now, I never got why people use old car springs. Sure, they are cheap, and yet there I am, having paid 300 bucks for 6 m of 24x8 mm 5071 spring steel straight from a supplier. That's a lot of blades. And medieval blacksmiths would have killed for that steel.

I use car springs to make tools for my shop but yeah, if I were making blades for sale I'd probably start with the good stuff.

Not that used car springs are bad material-wise, though. I'm just way too lazy to spend hours hammering them into a bar-like shape before I start hammering them into the shape I want. Much simpler to start with a proper bar from the beginning.

I know a guy who starts usually with a spiral screw for his blades, and those blades are beautiful.

Not that I'm representative: I made a couple of blades out of Japanese multi-layer steel. And I do have some rather expensive Japanese multi-layer steel bars lying around for later, one in a proper Katana-like structure with an iron back and one with 22 layers of chrome stainless steel around a middle layer of paper steel, both around 8 mm thick. They cost a factor of 4 more than the ordinary spring steel I mentioned earlier. But since I have neither the time nor the skill to make those raw multi-layer bars myself, I bought them for my attempts at Katana-like blades. One day, that is; I planned my first sword to be done in early 2020 and still didn't get started...

The Japanese blades consist of four different kinds of steel: a soft core, the sides, the back and the cutting edge. The characteristic curve of the blade only comes into being in the hardening process, because the cutting edge expands more than the rest.

Source: Watched when Japanese smiths, who have the status of priests in Japan, showed their craft for the first time outside of Japan.

Yeah, there are all kinds of cross sections on high quality Japanese sword blades. Or rather simple ones; no way every foot soldier had a 20 times folded, 4 section (or whatever that is actually called), perfectly polished and heat treated blade.

Actually, come to think of it, there are contemporary sources describing how to straighten bent blades on the battlefield over some rock to continue fighting.

If you are interested, there is a book on that topic (https://www.amazon.com/Art-Japanese-Sword-Swordmaking-Apprec..., Amazon was the first result that came up). It is really comprehensive, covering everything from raw Tamahagane to the scabbard, including guidance on polishing and heat treatment. Can only recommend it! Buy the hard copy, though; the eBook is unreadable.

Generally speaking, foot soldiers didn't have swords at all. Most of them were given a much cheaper to make spear.

Throughout the ages every soldier carried a dagger, a sword and a main weapon, all made cheaply of course. The sword is a side arm, basically serving the same purpose pistols do in the military today. A spear is also so much better on a battlefield: it can be used in formation, has superior reach and is cheaper to make.

> Let's call that steel from Damascus.

That's confusing to me; what exactly was from Damascus, the steel or the forging technique? The article OP cited says: "This suggests the possibility that the low levels of vanadium found in the genuine wootz blades of Table III may have resulted from ore deposits in India where the wootz steels were produced."

It’s called Damascus because that was the entrepôt from which it entered Europe. It’s actually from South Asia. A similar thing happened with fabric... muslin and damask are from India, not Mosul and Damascus.

Wikipedia says both muslin and damask originated in or around Mosul and Damascus, though. Their production moved to India later on, due to comparative advantage, I imagine.

The citation there is just from the 1911 Britannica, which is not a terribly reliable source! For what it's worth, transparent muslin is depicted on the walls of Ajanta (in the Vessantara Jataka painting in cave 17) from the sixth century, and is mentioned in the Harsacarita, from the seventh, both from before Mosul was a major center.

He’s partially right. Damascus steel was made from wootz steel, which was imported from India. The technique of adding carbon to the steel to harden it, causing the characteristic bands, was discovered in India.

The Damascus steel I've heard about over the years was supposedly named during one of the crusades (Damascus being in the Middle East, and the crusades being in the Middle East), when European knights encountered swords of vastly superior sharpness, strength, durability, and flexibility. This was noteworthy because ideal swords are strong and sharp, but flex so they don't shatter or become brittle (which is bad in combat...), with strength and flexibility being opposing properties when forging swords.

The Damascus blades somehow exceeded European crusade-era steels in all three areas, which was thought impossible. And, superficially at a minimum, these swords had a pattern.

This simple etymology isn't mentioned, nor are the crusades or the sharpness/flexibility/strength superiority, which seems strange.

I suppose you'd need a sampling of crusades-era weaponry and its properties, and to try to find a real Damascus blade with similar superior properties to verify the military advantage, and then figure out if it is just better. I've seen a lot of reddit/HN postings and I don't know if the superior swords were actually demonstrated.

I recall one post speculating carbon nanotubes forming back when "carbon nanotubes" was a very clickworthy headline term.

Can you take relics and heirloom swords and destroy them for testing? Probably not. Thus, the mystery. Are the "Damascus swords" we do have actually the militarily superior ones? Were they just slightly better but more uniformly performant, while knight swords were a bit more variable?

Who knows. I don't think this article really cleared up much nor lived up to the central claim "we can do that, it's not a lost art".

It didn’t come from specific mines. Damascus was made from wootz steel from India. The Indians discovered the technique of adding carbon during the smelting process to harden the steel and give it its characteristic bands. It was a well known technique and they knew exactly what they were doing.

Seems surprising at first sight that they still bothered to do the fancy process thing, if a regular blade made from that ore would have been about just as good. But we who live in the 21st century should not have any difficulty at all understanding the power of brand.

I imagine it going like this: first, blades from a certain region start making a name for themselves for superior quality (thanks to the ore). Then a few things happen side by side, augmenting each other:

- higher prices allow smiths to spend more time per blade

- blades from that region become a status product, giving them almost exclusive access to the market segment that is willing to spend extra on fancy craftsmanship, resulting in even more time per blade

- export markets get flooded with fakes that only claim to be from the high reputation region, some of the fancy craftsmanship ends up serving as a sign of authenticity

> Seems surprising at first sight that they still bothered to do the fancy process thing, if a regular blade made from that ore would have been about just as good.

That's because it's not that simple. Ore plays a part, but the main "secret" to "True Damascus" steel is the crucible steel process that was almost unique to South/Central Asia. Most steel producers in the medieval era were not able to properly melt steel, since they could not create a sufficiently high temperature. But a few, primarily in India and Central Asia (and probably at least one place in Europe, associated with the Ulfberht brand), knew how to melt pig iron (which has a lower melting point) together with iron or steel to create crucible/Damascus/wootz steel.

Most Europeans and East Asians did not have access to such steel, meaning that European (except a few 'True Ulfberht'), Chinese, Korean and Japanese swords were made from inferior steel compared to swords made from wootz/Damascus steel.

That doesn't mean that all such blades were bad. But that is where the intricate process came in, and without access to Wootz steel, creating high quality blades required a lot more labour and skill.

Thanks for the corrections, I guess I might have accidentally explained, in the fake camp, that "other kind" the reddit post was referring to. I wonder if there are examples of both variants combined, perhaps to get an even stronger visual effect?

During the Viking era people forged Ulfberht swords; branding is as old as people being able to draw stuff, it seems.

This sort of crucible steel was one of the only known ways to make steel before the modern processes were invented. It wasn't a "fancy process", it was the only process.

This isn't true. First, the terminology is a bit weird. Steel is an alloy of iron and carbon. This fact was not known until the 19th century! Classical "wrought iron" (which has a very low carbon content, typically well under 0.1%) has a similar composition to what is now called mild steel (it has somewhat different physical properties because wrought iron has inclusions of slag, due to how it's made). Classical steel has a somewhat higher carbon content. Pig iron (or cast iron) has an even higher carbon content, above roughly 2%, and is the product of a blast furnace.

Steel could be made in a number of ways. The process of creating wrought iron naturally produces some higher carbon steel, which could be removed. This is how Japanese swordmaking works (they take higher carbon steel and low carbon steel and pound them together). The finery process, invented in China but heavily used in Europe, involved taking pig iron and slowly oxidizing it to remove carbon. For armor, case hardening involves baking wrought iron in carbon to cause the surface to become steel. The cementation process from early modern Europe involved doing this, and then folding the steel to produce uniform metal. Indian steelmaking didn't involve pig iron, since India didn't use blast furnaces.
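That taxonomy can be summed up as a spectrum of carbon content. Here it is as a tiny classifier; the thresholds are commonly cited modern conventions (my choice, not from the comment above), and real classification also depends on slag inclusions and other alloying elements:

```python
def classify_iron_alloy(carbon_pct: float) -> str:
    """Rough classification of an iron-carbon alloy by carbon content alone.

    Thresholds are illustrative modern conventions. In practice, classical
    wrought iron is distinguished from mild steel mainly by its slag
    stringers, not by carbon content alone.
    """
    if carbon_pct < 0.1:
        return "wrought iron / nearly pure iron"
    elif carbon_pct <= 2.1:
        return "steel"
    else:
        return "pig/cast iron"

# A few sample points along the spectrum:
for c in (0.02, 0.8, 3.5):
    print(f"{c}% C -> {classify_iron_alloy(c)}")
```

The historical confusion described above makes sense in this light: without knowing that carbon was the variable, the three materials looked like entirely different substances rather than points on one axis.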

R.I.P. Alfred Pendray

Here is a film of them making a wootz ingot and discussing vanadium.


Thanks for that link. The vid is a 50 minute documentary about damascus steel, featuring Al Pendray and others. I've only watched a small part of it but it looks really good. I will try to take in the rest soon.

I didn't know this, that's really interesting. I, like I think many others, was under the impression that Damascus was a lost technique. I'm under the further impression though that our modern alloys have long since passed the standard of these ancient Damascus steels. Is that true?

No historical steel beats modern day steel alloys, not when it comes to quality nor to predictability. You heat treat a certain SAE spec steel in a certain way and you get the same result, within accepted and specified tolerances, regardless of batch. No such thing in the past. Also, modern day steel is just dirt cheap, even the higher spec monosteels.

Some modern steel alloys from Sweden with extreme strength cost more than silver.

It depends on production quantities, for sure. And don't get me started on aerospace grade heat treated high tensile stuff. The high spec mass produced stuff is still extremely cheap, if you can get it in small volumes. And whether or not those more esoteric alloys actually improve blade quality, well, I'm not sure. But then again, at the higher end we talk more art work than tool, and art follows its own rules.

It is a similar myth with Roman cement and, to some extent, Greek fire. We know how it was made (with some assumptions when it comes to Greek fire), what the main ingredients and composition were, and we know how to make it much, much better than they did. But the legend continues even after many debunkings, because we love a good story of an ancient arcane art that was lost in time.

About Roman cement: maybe we know how it was made, BUT for a number of reasons we decided to use other types of cement that are not necessarily "better" than the Roman one, particularly when it comes to durability.

The ubiquitous modern Portland cement and concrete made with it hardens faster and has initially a higher resistance.

Modern Pozzolanic cements, more similar to the Roman ones, are in practice very rarely used, even if they have most of the properties of the old Roman ones (longer time to harden, hardness/resistance increasing over the years, generally much better resistance to water).

Concrete made from Portland cement lasts at least 100 years and generally we don't need more. If anything, we want the flexibility of stuff that "only" lasts around 100 years, because, you know, due to new tech and changing economies, we want to be able to remodel our cities periodically.

And Portland cement is cheaper and can be made in much larger quantities.

This is kind of a recurring story with these topics: we know how to make it "better", but we don't really need that "better", we do need "cheaper" and "higher availability" instead.

> Concrete made from Portland cement lasts at least 100 years and generally we don't need more.

"We" may not need more (as in "we" that are alive today). But future generations would thank us if we would make structures that would last for 500+ years instead of 100 years. Much better for the very long term economy AND the climate.

Would they really?

Sure, having an old Colosseum or Aqueduct to look at is nice, but do you really want to live in a place where every single square inch of useful ground in the whole country is occupied by an ultra-resilient building that somebody thought would be useful 2000 years ago?

We have all of these examples of nice long-lasting roman architecture because those are the examples that survived. Romans didn't live in the Colosseum and the aqueducts. The Romans built hundreds of thousands of other buildings that fell down on their own or were demolished for any number of reasons over the millennia.

In support of this, we can even see the rapid changes in structural demands over half a century in a culture like the US. Suburban sprawl eats up spaces with any viable flexibility in a matter of years, and things like shopping malls fade out of fashion just as quickly as they entered it. In a relatively short time, we are left with brick & concrete hellscapes that are largely being abandoned because developers find it easier (and probably cheaper) to work with fresh land. This Disposable Mentality gets us into obviously environmental and economic messes so often that I am still flabbergasted that the US at large has not adopted a more Reusable Mentality, and incorporated some long-term thinking into their suburban planning.

You can always demolish unneeded structures.

The Golden Gate bridge was opened 85 years ago. Is there any sane reason to demolish it in 15 years?

I said 500+, not 2000. If, 500 years from now, the best 25% of structures built by us remain standing, I think that's a good thing.

> But future generations would thank us if we would make structures that would last for 500+ years instead of 100 years.

Would they? 100 year old houses are already dinosaurs in terms of energy use, and often it's a lot easier and cheaper to demolish and rebuild instead of insulating and installing modern heating. I can't imagine how outdated a 500 year old house would be.

It depends on what elements you take into account.

Demolishing a building and re-building it in terms of total energy (and/or emissions) is not free of costs, maybe you don't pay them directly, unlike heating or air conditioning, but they do exist.

Right now 100 years old houses (the brick or stone ones, not the wooden ones, nor the reinforced concrete ones that are usually more recent) already exist and can be restored/upgraded (though of course with some limitations) with a minimal amount of work (in terms of energy and emissions).

More or less the "if it ain't broken, don't fix it" approach.

Now, if we had some building material lasting only 100 years that could be manufactured with little expense of energy and low or no emissions, that would be another thing.

If we imagine (hypothetically) that there is a form of (say) square-section bamboo that we can grow at little or no cost and that we can assemble with (again, say) some vegetal cement or similar, so that the results of the periodic demolitions can be reused or recycled, or that it is anyhow biodegradable, then a short-lived building would make much more sense.

> Right now 100 years old houses already exist and can be restored/upgraded with a minimal amount of work (in terms of energy and emissions).

Having lived in one, I can assure you that upgrading such a house to modern energy standards isn't a minimal amount of work. Lots of these houses just have a single brick wall directly facing the outdoors. Insulating that means you have to basically completely strip the house down to the brick, and even then it won't be as good as new construction.

Of course demolishing and rebuilding also takes energy; but if you amortize it over 50 years or so, I bet in the end you come out ahead. Building technology has improved a lot in energy efficiency over the last century.
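The amortization claim can be turned into a back-of-envelope energy break-even. Every number below is a made-up placeholder chosen only to show the shape of the calculation, not measured embodied-energy or heating data:

```python
# All figures are hypothetical assumptions for illustration only.
embodied_energy_rebuild_kwh = 150_000  # assumed: demolish + rebuild a small house
old_heating_kwh_per_year = 25_000      # assumed: poorly insulated 1920s house
new_heating_kwh_per_year = 8_000       # assumed: modern construction

# Years until the heating savings have "paid back" the rebuild energy.
annual_savings = old_heating_kwh_per_year - new_heating_kwh_per_year
break_even_years = embodied_energy_rebuild_kwh / annual_savings
print(f"energy break-even after ~{break_even_years:.0f} years")
```

With these (invented) inputs the rebuild pays back its embodied energy well within a 50-year amortization window, which is the shape of the argument above; with different inputs (a smaller heating gap, or a more energy-intensive rebuild) the conclusion can easily flip.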

> Insulating that means you have to basically completely strip the house down to the brick

No, insulating that means that you add an extra set of layers to the outside of that brick wall, which can be done without impacting the interior, while people are living in that building during the renovation.

> while the people are living in that building during the renovation

You’ve never actually done this I assume? I have lived in a 1920s house during renovation. I wouldn’t wish it on my worst enemy. Should have torn it down; would have been better for the environment and my pocketbook.

> You’ve never actually done this I assume?

I have, and there was a lot of drilling into concrete filled with tiny stones, that caused enough noise to cause pain in the ears. Only during working hours, though.

> would have been better for the environment and my pocketbook.

If you don't want to live in a building as it is being renovated, you can always rent something else for the time it takes. If the renovation takes less time than rebuilding, you also pay less rent.

My point was that the old house is still very energy inefficient compared with new construction. The rehab process was also very energy intensive and created nearly as much waste as a tear-down.

The beautiful hand-carved crown moulding and stairs are nice, but not worth outrageous heating bills.

We don't have a very good idea what sort of structures people will want in 500+ years, and there are lots of better ways we can turn money into improving life for people centuries in the future (primarily in trying to reduce the risk humanity goes extinct before then).

Just for the record, the 100-year (hopefully more) lifespan is expected for houses/buildings; it's more like 50-60 years for reinforced concrete structures exposed to the weather (like bridges and viaducts), after which some maintenance/partial rebuilding/consolidation is usually needed.

This is not only due to the kind of cement, there are a number of other factors that have an influence on the (scarce) durability of modern reinforced concrete structures.

Portland cement may be cheaper than pozzolanic in some markets/countries and the contrary may be true in some other ones, but you cannot anyway make a direct comparison, as they are normally used in different kinds of structures.

The "let's build things that have a set expiry date" approach could be a very good one IF we actually knew how long a construction lasts (or should last), and if matters were organized so that this continuous demolishing/rebuilding could be planned and carried out. But AFAIK this is not what happens anywhere, except maybe for a few (wooden, BTW) temples in Japan.

>50-60 years if used in reinforced concrete

Structures which need a longer design life than that have switched to stainless steel rebar, which currently has an unknown life span. Check back in a century or two.

My grandma’s mountain village is located just below some big hydro dam (the second biggest in my country, Romania), built in the early 1960s. So that gives the authorities (or whoever will actually own that dam at that point) another 40 years to either tear it down completely for safety reasons (I don’t see that happening, the incoming money is too good) or to tear it down and build a similar one in its place (I also don’t see that happening, to be honest).

And this is, comparatively speaking, a happy case; I'm sure there are lots of big dams built back in the 1930s in places like the US or the former USSR that are only 10 or so years from that "this concrete structure is only guaranteed to last 100 years" time-point.

You say, "Modern Pozzolanic cements, more similar to the Roman ones, are in practice very rarely used".

Wikipedia says, "Over the course of the 20th century the use of pozzolans as additions (the technical term is "supplementary cementitious material", usually abbreviated "SCM") to Portland cement concrete mixtures has become common practice. Combinations of economic and technical aspects and, increasingly, environmental concerns have made so-called blended cements, i.e., cements that contain considerable amounts of supplementary cementitious materials (mostly around 20 wt.%,[clarification needed] but over 80 wt.% in Portland blast-furnace slag cement), the most widely produced and used cement type by the beginning of the 21st century.[5]". https://en.wikipedia.org/wiki/Pozzolan

Who's right? Are pozzolanic cements "the most widely produced and used cement type" or "in practice very rarely used"? I'm guessing it's the writer who doesn't think "pozzolan" is a proper noun in English.

Portland is the most used.

However, nowadays it is rare that a Portland-based cement is "pure": any Portland-based cement may have pozzolanic components, i.e. be a "blended" cement. But the use of pozzolanic components, even in relatively large percentages, does not make them "real pozzolanic cements" (though when the pozzolanic components reach large amounts they are what I called "modern pozzolanic cements"; the "base" is usually Portland anyway).

"Real pozzolanic cements" are fabricated with a different process and from different base materials; they are not really used anymore, except for particular restoration works.

I think it stems simply from a (possibly willful) misinterpretation in pop culture. Instead of reporting that "we don't know how they made this amazing (for its time) material" they reported that "we don't know how to make it" and it gives rise to further fantastical hypotheses, culminating with ancient aliens or something along those lines.

Agreed on Roman cement, but AFAIK there is no definitive explanation of the ingredients for Greek fire, e.g. to explain how contact with water actually fed the flames. The Wikipedia article has an extensive section on its manufacture, but it's all very speculative.

I was under the impression with Roman concrete that we modern folks can reverse engineer it and understand it just fine, but that the art was lost for 1k+ years.

Is this a myth I have fallen for?

The myth that persists to this day is "the Romans made better concrete that not even we, with all our science, can match." It's perfectly true that medieval Europeans had no idea how to make roman concrete.

Ah right, thanks for the clarification there :)

Sounds about right. “Damascus steel” is very highly romanticized in the forging/sword making communities online.

The history of lost techniques is fascinating. One example is Coade Stone, an artificial stone of which the South Bank Lion near London's Westminster Bridge is made:


My favourite ancient example is the vitrified forts found in Scotland and a few other places: large stretches of stone wall partially melted. Nobody knows how this was done, or indeed whether it was intentional:


Clearly the vitrification of these forts was the result of wizard battles.

Duh, stone was melted when the Targaryen family used their dragons to conquer Scotland and make it one of the Seven Kingdoms.

One of the larger vitrified forts, Tap O' Noth, provided me with imagery that I used to visualise Weathertop from LotR - this was well before the movies.

I did even check to see if Tolkien had ever visited the area to see if there was any connection but no luck - unlike, for example, Lauterbrunnen in Switzerland being the real world Rivendell.

Hm, I thought that Scotland pledged fealty without a fight, the last king of Scotland just bent the knee.

You've got it the wrong way round: the last King of Scotland actually also became the King of England. The countries were united by a political process some years later, largely to pay off debts following one of the most ill-conceived colonisation attempts in history:


No. No. No. King Torrhen Stewart, "the king who knelt", swore an oath of fealty to the dragon king Edward Targaryen, first of his name. It is known.

TIL artificial stone start-ups were big in the 18th century.

I spent a week in a workshop a few years ago helping make damascus style steel and titanium. Hot and hard work and had a brilliant time. One of my more favourite memories from vacations.

Sounds interesting, where was the workshop? :)

Rural Mississippi.

Actually, if it's not made in the steel making regions of Syria, then it's only Sparkling steel.

underrated comment

Also - an actual metallurgist talking about knife myths


Gotta love a several-hundred word essay about swords posted on reddit by user "IPostSwords".

It's a bit odd that the article defines the "lost" art as the crucible steel, and not folded steel, and then goes on to prove that it hasn't been lost. But the claim I've always heard is that the exact recipe for the folded steel has been lost. You can make identical-looking folded steel, but it is apparently difficult to make folded steel that has the exact composition of the antiques.

Folding steel never went away and was independently invented in several different cultures around the world.

It depends on what you want: we don't know exactly what they did. However, we can produce better steel no matter what criterion you have for stiffness, strength, corrosion or oxidation resistance (sorry, I'm just guessing the English terminology here, I'm not a native speaker).

Right, it's like how we 'lost' the exact recipe for certain paints. We can create much more vibrant, durable, repeatable paints with modern chemistry, but we don't know exactly how they did it. Of course neither ancient painters nor blacksmiths were great chemists, but they had certain heuristics they used to create their stuff. I think it's not a case of "we can't make steel / paint / buildings as good" but rather that we don't know what their process was. It's more of historical interest.

That was my reaction; the "Damascus" steel whose art has been lost isn't crucible steel, whose art was never lost; it's folded steel.

[Edit] Incidentally, I suppose the name "Damascus" was related to the patterned silk fabric named "damask", rather than to the material originating in the Damascus area.

The problem with that is they weren't great chemists. So we don't have the exact recipe, but neither did they. It isn't lost knowledge; it's knowledge that was never had in the first place. If someone really wanted to know, they could get a complete chemical analysis of a sample.

Can they reproduce this effect:


Have you actually read the article? The last few paragraphs address this:

> N.B: A brief note on the claim carbon nanotubes exist in crucible steel:

> The only article that "found" carbon nanotubes was published as a brief communication to Nature, i.e. not a peer reviewed article, not a full article. This was in 2006, and was only a few pages in length.

> It later found its way into a conference paper by the same authors, still not a peer reviewed article. This was 2 pages in length. These findings should be considered preliminary.

> The method used (dissolving crucible steel in acid and seeing what remains) revealed strands of carbon, but carbon dissolves VERY readily into steel. Crucible steel is typified by cementite spheroids, which often stretch into rods during forging as they are deformed. If you dissolve cementite in acid, removing the iron component, you are left with carbon.

> This does not mean there was an intact carbon nanotube in the core of the cementite rod - and even if it DID mean that, it would have negligible impact on performance because it is *encased* in cementite, which itself is in a soft matrix of pearlite or sorbite.

> But don't take my word for it. Other academics, including those who have been instrumental in understanding crucible steel (namely John Verhoeven) doubt the findings.

> " John Verhoeven, of Iowa State University in Ames, suggests Paufler is seeing something else. Cementite can itself exist as rods, he notes, so there might not be any carbon nanotubes in the rod-like structure."

> "Another potential problem is that TEM equipment sometimes contains nanotubes, says physicist Alex Zettl of the University of California"

> https://www.nature.com/news/2006/061113/full/news061113-11.h...
