

Is anyone in the AI/ML area finding success with anything other than conda, where installation of CUDA/cuDNN is required? Although I often have to pip install a lot of packages, I find conda's nvidia/pytorch/conda-forge channels are still by far the easiest way to get a deep learning stack up and running, and so I just stick with conda environments. I've tried poetry in the past, but getting the NVIDIA deep learning stack up and running was really tough.


For anything related to CUDA/cuDNN, use one of NVIDIA's base Docker images. Then whether you use Conda / Pip / Poetry / Pipenv doesn't matter much. Not at all a Conda fan myself; I avoid it like the plague.
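
For example, a minimal sketch of a Dockerfile on top of such an image (the tag is illustrative; pick whatever CUDA/cuDNN combination you need):

    # CUDA and cuDNN come from the base image; layer Python deps
    # on top with whichever tool you prefer.
    FROM nvidia/cuda:12.2.0-cudnn8-runtime-ubuntu22.04
    RUN apt-get update && apt-get install -y python3-pip
    RUN pip3 install torch torchvision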


What's surprising to me is that this isn't better known. The only reliable solution I've found is to go with the pytorch or deepstream images from NGC. Conda is probably a good idea for noobs who need CUDA installed for them on Windows, but otherwise I find it an endless source of finicky issues, especially for unsavvy ML scientists who are looking for a silver bullet for package management.

This link shows which package versions come in which Docker tag and is invaluable: https://docs.nvidia.com/deeplearning/frameworks/support-matr...
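
For instance, dropping into one of those NGC PyTorch images looks something like this (the tag is an example; the support matrix above maps tags to package versions):

    docker run --gpus all -it --rm nvcr.io/nvidia/pytorch:23.10-py3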


10 years ago, "Data Science" work past the experimental stage was performed by SWEs with a knack for applied maths. So investing in tooling to do things properly was a given.

Nowadays, most DS people only want to do ML at the experimental stage and get lost when things move to the engineering side. But in their defense, nowadays the bare minimum skills required include programming, containerization, CI/CD, etc. More experienced, swiss-army-knife SWEs/MLEs have to educate the willing.

It was already the same 10 years ago, with MATLAB dudes not wanting to get dirty with C/C++/ASM SIMD. History repeats itself, only at a faster pace.


Yes. I simply do

  python -m pip install torch torchvision
and it works. It used to not work, but it's been fine for me for about a year now.

There's a very good chance I've installed CUDA on my system before this, though. And usually cuDNN and some other packages, because this is part of my standard install. And then I also never run into the issue where a package is looking for nvcc.


I love poetry but have found it pretty hard once you move onto anything that doesn't manage to get wheels on PyPI.

We make extensive use of conda/mamba to solve this, and are pretty happy with it, especially with conda-forge.


I have successfully transitioned an ML/AI team of seasoned researchers away from conda and to poetry. Some also use pyenv, I suspect a lot don't bother but may get bitten eventually.

It's definitely a learning curve, but it turns out every conda user has been bitten by the irreproducible tendencies of conda quite often. Nobody uses the conda env file; they just start an env and pip install things into it. They don't realize the base env has stuff, too, and conda envs are hierarchical rather than isolated. I know it's possible to use conda in an isolated and reproducible way, but I have yet to meet someone who does so.

So it hasn't been hard to pitch poetry to these folks, and while many complain about the learning curve they appreciate the outcomes.

We're a pytorch shop, and torch mostly just works with pip or poetry these days, as long as you skip the versions the torch maintainers mispackaged. We rarely need anything higher-level that only conda could install.

We really like having more than two dependency groups, as this allows us to keep research and production in the same repository: main, dev, research. Researchers then contribute to the core library of a project, and research and production use the same code for running and evaluating models.
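
A minimal sketch of what those groups can look like in pyproject.toml (package names and versions are illustrative):

    [tool.poetry.group.dev.dependencies]
    pytest = "^8.0"

    [tool.poetry.group.research.dependencies]
    jupyter = "^1.0"
    matplotlib = "^3.8"

Production installs can then skip the research tooling with `poetry install --without research`.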


I use pipenv and I've found it to be much more usable than conda. For my use cases, it's generally faster and I've run into fewer dependency issues.


uv has been really awesome as a replacement for pip: https://github.com/astral-sh/uv

So fast it finally made virtual environments usable for me. But it's not (yet) a full replacement for conda, e.g. it won't install things outside of Python packages
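
For anyone curious, the basic pip-compatible workflow looks like this (torch/torchvision as example packages):

    uv venv                           # creates a .venv in the current directory
    uv pip install torch torchvision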


How about prefix then? https://prefix.dev/blog/uv_in_pixi


Pyenv just worked for me. I am actually using Fedora Silverblue and have GCC and the CUDA SDK available only inside a toolbox container. Therefore, I have to enter that toolbox to install things like FlashAttention.


Have you tried https://pixi.sh/ ? It brings Cargo/NPM/Poetry like commands and lock files to the Conda ecosystem, and now can manage and lock PyPI dependencies alongside by using uv under the hood.

I haven't been using anything CUDA, but the scientific geospatial stack is often a similar mess to install, and it's been handling it really well.


I use poetry and direnv. Coming from node/npm, it feels natural for me to just do this. I really have no trouble installing PyTorch with poetry.


How are you installing Pytorch with CUDA with Poetry? I stopped using Poetry because it wouldn't automatically get the CUDA version; instead, it would install the CPU version. I migrated to PDM, which does the right thing.


Before CUDA 12.0, you had to specify an explicit source in pyproject.toml, like this:

    [tool.poetry.dependencies]
    python = ">=3.10,<3.12.0"
    torch = {version = "^2.0.1+cu118", source = "torch118"}
    torchvision = {version = "^0.15.2+cu118", source = "torch118"}

    [[tool.poetry.source]]
    name = "torch118"
    url = "https://download.pytorch.org/whl/cu118"
    priority = "explicit"
However, since CUDA 12.0 and PyTorch 2.1.0, you can just install as normal:

    poetry add torch torchvision


I stand corrected. I was familiar with the first option, which coupled the dependencies with the platform, whereas I wanted a CUDA version on Linux and a Metal version on macOS.

However, this works perfectly with Poetry 1.8 and Pytorch 2.2. I suppose the only problem is what PDM also does, where the lock file is platform-dependent. I'm not sure whether Poetry allows you to select a specific lock file, however.


Was this before torch 2.0? With the very notable exception of a few mispackaged versions, torch now includes all the relevant NVIDIA libs, and I haven't seen it grab the CPU version on a GPU box yet, though I'm not sure what it looks for.

A notable open issue in poetry is that we can't currently specify one dependency on torch and have it grab the CPU version on some systems and the GPU version on others. Does PDM solve that?


I don't think PDM solves that directly. What I do is have different lock files for different platforms (e.g. Linux/CUDA and macOS/Metal), but pyproject.toml lists only "torch".


I was excited about JAX, but I think the developers missed a trick when they decided it should be entirely immutable. It sounds silly, but I think if I have an array `x` and want to set index 0 to 10, it's a big mistake if I can't do:

  x[0] = 10
And instead I have to do:

  y = x.at[0].set(10)
Of course this has advantages, and I know it sounds lame, but as someone whose brain works in numpy, this is really off-putting.


I think it is a feature from functional programming. At some level, I agree more with the JAX style.


But you can automatically convert mutable code into functional code if that makes things easier. That's what Haskell's `do` notation does, and PyTorch even has `torch.func.functionalize` for that. Immutable should be the default, but not compulsory.
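
For reference, a minimal sketch of that PyTorch API (the function and shapes here are made up for illustration):

    import torch
    from torch.func import functionalize

    def f(x):
        y = x.clone()
        y[0] = 10.0           # in-place mutation on an intermediate tensor
        return y

    g = functionalize(f)      # rewrites the mutation into out-of-place ops
    print(g(torch.zeros(3)))  # tensor([10., 0., 0.])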


Haskell's `do` still doesn't allow mutability; it just allows syntax that looks a bit more imperative than usual. The problem with all the "convert to pure function" magic is that, for example, this piece of code

    arr[1000000] = 1
has to clone the entire array if you want it to be pure, leading to very unpredictable performance. There are also some algorithms that are straight up impossible to implement efficiently without mutability. Often, it's exactly those algorithms that are hard to optimize for optimizers like JAX.

Specifically in JAX, code that is slow due to copying will often be optimized into mutable code before running, for performance reasons. But because JAX still has the guarantee of no mutability, it can do many optimizations such as caching or dead-code elimination.


Of course `do` itself is much more capable, but for some monads it has the effect of that conversion, which is what I wanted to say.

You are correct about in-depth mutations and the resulting complications, but that only strengthens my assertion: immutable should be the default, but not compulsory (because sometimes you absolutely need mutation). And mutability doesn't preclude caching or dead-code elimination; you just have to be more careful. Often you can convert mutable code into an immutable form only for the purpose of analysis, which is definitely harder than having immutable code in the first place, but not impossible. Scalar compilers have used SSA, an immutable description of mutable programs, for a long time after all.


JAX is a wrapper on top of XLA. Instead of writing pure Python, you're writing JAX abstractions.

For example, a simple loop in JAX:

  import jax

  def solve(i, v): return i+v
  x = jax.lax.fori_loop(0, 5, solve, 10)  # runs solve for i in 0..4; x == 20


I like OpenCV, but can there be a stretch goal where, if they raise $1M, they agree to move from Blue/Green/Red colour channel order to RGB by default in OpenCV 5, like everyone else on planet earth? I genuinely avoid OpenCV for this reason: I know I will occasionally forget to convert BGR->RGB on loading and RGB->BGR before saving, and so I'd rather stick to skimage/imageio/PIL. Or hell, add an `as_rgb=False` parameter to the load/save functions or something!
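
For reference, the conversion dance I mean (a minimal sketch; the file names are made up):

    import cv2

    img = cv2.imread("photo.jpg")               # OpenCV hands you BGR
    rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # convert for RGB-world libraries
    # ... do RGB things ...
    bgr = cv2.cvtColor(rgb, cv2.COLOR_RGB2BGR)  # and convert back before saving
    cv2.imwrite("out.jpg", bgr)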


> The reason the early developers at OpenCV chose BGR color format is that back then BGR color format was popular among camera manufacturers and software providers. E.g. in Windows, when specifying color value using COLORREF they use the BGR format 0x00bbggrr.

For anyone curious about why OpenCV uses BGR: https://learnopencv.com/why-does-opencv-use-bgr-color-format...


The problem is that the discussed results compare proportions of a relatively small number of questions: 67. If you model this as a binomial distribution, then the 62/67 that GPT4-turbo got gives a 95% confidence interval for the 'true' performance of 83.4% to 97.5%, i.e. it comfortably includes the proportion that GPT4 achieved (64/67 = 95.5%).

I think the evidence from these tests is not strong enough to draw conclusions from.
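
For anyone who wants to reproduce the interval, here's a sketch using scipy's exact (Clopper-Pearson) binomial interval, which matches the figures above:

    from scipy.stats import binomtest

    # 95% CI for 62 successes out of 67 trials
    ci = binomtest(62, 67).proportion_ci(confidence_level=0.95)
    print(ci.low, ci.high)  # roughly 0.834 and 0.975; 64/67 = 0.955 sits inside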


Yes. I see people make this mistake time and again when evaluating LLMs. For a proper comparison, it's not enough to simply throw fewer than a hundred questions at it and point to a single-digit difference. Not to mention that LLMs have some inherent randomness, so even if you passed the exact same tasks to the same model, you would expect some variance.

I see a lot of room for improvement in how we apply statistics to understanding LLM performance.


I’m not surprised, most people can’t even tell the median from the mean.


> I think the evidence from these tests are not strong enough to draw conclusions from.

I used GPT-4 Turbo for some coding problems yesterday. It was worse. That's enough to draw conclusions for me.


One thing I always wonder is how all these countries managed to find enough gold to run an (albeit tiny) economy off it. I've never heard of or seen a gold mine in the UK, and yet 2000 years ago they were mining enough to mint currency. Was it all relatively surface-level and rapidly mined out, and now all gone?


Phenomenal book that covers this exact topic: "Debt: The First 5000 Years." https://en.wikipedia.org/wiki/Debt:_The_First_5000_Years

    "The book argues that debt has typically retained its primacy, with cash and barter usually limited to situations of low trust involving strangers or those not considered credit-worthy"
It would make sense that cash would pop up once the Romans arrived, and would be in small amounts to facilitate spot transactions between Romans and the pre-Roman peoples of Britain, which would explain why there are such small amounts of cash.

Furthermore, I can imagine a scenario where Roman coins were melted down to make these coins (total conjecture).


The book is a combination of anecdata and speculation presented in a polemical style. Not really something that I would call “phenomenal”.


The book stimulates important discussions on economic history and the roots of our financial systems. While its style is assertive, its contribution to questioning established economic assumptions is undeniably valuable. It's also backed by anthropological and archaeological evidence.


Unfortunately, Graeber is not well-versed in economic theory, so his "questioning established economic assumptions" often resembles tilting at windmills, or not even that.

If you know any actual contributions to economics or the history of economics that were consequences of his stimulation, I would be glad to hear about them.


My feeling when I read Debt was that he was smart enough to understand conventional economic theory. My brother studied economics, and he was the one who actually recommended the book to me.


As a second nomination, I hold a PhD in economics and found the book interesting.


Graeber's background in economic anthropology offers a fresh lens through which to view economic history, highlighting the social and cultural dimensions that traditional economic theories sometimes overlook. His work has encouraged interdisciplinary dialogue, prompting economists and historians alike to incorporate broader socio-cultural understandings into their analyses. While his approach differs from conventional economic theorizing, it complements it by adding depth to our understanding of economic phenomena.


I don’t say this as an accusation, but your writing is remarkably similar to ChatGPT output.


Missing the giveaway "However..." clause that nearly all ChatGPT descriptions have in them.


I made a prediction that I was going to find out that Graeber was a communist thinker before I looked him up just now. I was not surprised.


There's a rather unusual strain of Marxism (not communism!) in anthropology. As I understand it, Marx argued that the conditions of the material economy would ultimately dictate what social structures appeared. Ideology was "downstream" of economic production, and less important.

Now consider archaeology (which is part of the anthropology department in the US). A high-profile dig may involve many specialist researchers: people who study seeds, people who study pollen, people who study abrasion in stone tools. If the evidence is sufficiently preserved, then a team like this can learn quite a bit about food production and trade patterns. Meanwhile, nobody can tell you much about ideology. Maybe you've got some burials, or some stone statues that might be religious. But you've got zero written records, and anything you say about religion or ideology is likely to be completely made up.

So in an anthropology department, "Marxist" may mean, "deeply interested in the means of production, which we have lots of concrete material evidence about, but much less interested in making unsubstantiated guesses about religion."

Or at least that's how my anthropology professors explained it.


> Or at least that's how my anthropology professors explained it.

Yeah, that's how they explain it, yet somehow it always ends up being about communism.


Graeber wasn't a Marxist or a communist. (He actually refutes Marx's theory of material conditions by pointing out several Native American tribes from the Pacific Northwest who had the same material conditions but organized their societies in radically different ways.)


It is uncharitable to Marx to simplify his whole work down to a "theory of material conditions". It would be hard to find a modern Western anthropologist or a sociologist who wouldn't be indebted (pun intended) to Marx.

There is a famous phrase attributed to Milton Friedman: "We are all Keynesians now". Even if many economists may not share his view, his mode of thinking has been deeply integrated into modern economics. Something similar can be said about Marx in relation to the kind of anthropology and sociology Graeber was doing.


Perhaps, but David Graeber wasn't a Marxist. Graeber was an anarchist who was very sympathetic to ideas around direct democracy.


Depends on what you mean by "Marxist", I guess. His anarchism is certainly way more Marxist than anarchism of Proudhon or Kropotkin.


Bikeshedding this conversation about culture, anthropology, economics, and philosophy into armchair analysis of scholars' opinions certainly shows how shallow it is to try to dismiss a massive book by a world-class academic as "anecdata" because you don't agree with some of its conclusions.


I don't understand why you are trying to accuse me of bikeshedding. Seems like you are just trying to dismiss me because I don't agree with you? I've asked numerous times for actual contributions to economics or the history of economic thought based on the book in question. But it seems that people enjoy pontificating about Marxism or "offering a fresh lens" more than a grounded discussion.


Sounds like bikeshedding to the rest of us. Anthropology has a lot to say about economic history, particularly when it comes to debunking some of the foundational myths like Smith's 'barter theory.' Graeber's take is a legit anthropological critique, not armchair econ. His lack of an Econ PhD doesn't negate the value of his work in the least; after all, a lot of what passes for economic 'common sense' is actually historical narrative, which is exactly where anthropology excels. Cross-pollination between disciplines is how we get past stale paradigms.

As for 'direct contributions': contributions to what? This is vacuous. If you're looking for direct policy changes, new economic theories, or shifts in economic practice explicitly derived from that book, the evidence might be less concrete given the book's recent publication and its cross-disciplinary nature.

If contributions are broadly understood as influencing the discourse, prompting reevaluation of economic history, or enriching economic thought with anthropological insights, then Graeber's work has clearly made an impact. The book has been widely discussed and cited in various academic and non-academic circles, suggesting that it has stimulated thought and conversation, although that may not be immediately quantifiable in economic terms.

It's worth noting that the impact of theoretical work often becomes more apparent over time as it permeates through discussion, critique, and successive scholarship.

I'd also like to point out that economic anthropology is an academic field in its own right, to which Graeber is considered a significant contributor. His work, in particular, has been pivotal in encouraging economists, historians, archaeologists, etc., to think more critically about the origins and functions of debt, money, and economic systems. Economies are complex, culturally rich phenomena, not just market transactions (something econ models often miss).

https://en.wikipedia.org/wiki/Economic_anthropology


> Anthropology has a lot to say about economic history, particularly when it comes to debunking some of the foundational myths like Smith's 'barter theory.'

It is not a foundational myth. It is certainly a topic of interest for some economists, but it is not something that you would get asked during your qualifying exams. Dynamic stochastic general equilibrium models are far more foundational.

> a lot of what passes for economic 'common sense' is actually historical narrative, which is exactly where anthropology excels

Everything is a text, therefore a literary studies professor is an expert on everything. Everything is a result of human action, therefore an economist is an expert on everything. Everything is a result of social interaction, therefore a sociologist is an expert on everything.

That’s a dangerous attitude that’s unfortunately common among such fields as sociology, psychology, literary studies and economics. It overstates the expertise of people in the field and mystifies the field itself.

Being an anthropologist doesn’t make you an expert on QM and GR just because the history of physics is a history of narratives.

> Cross-pollination between disciplines is how we get past stale paradigms.

There are lots of people who already do that out there. Economics is ripe with such examples, both past and recent. But being hostile and acting as if you know more than people who study the subject for a living leads nowhere.

> I'd also like to point out that economic anthropology is an academic field in it own right, for which Graber is considered a significant contributor.

And so is economics, for which Graeber is not considered a significant contributor. And that’s okay. The gift economy of Madagascar and the technicalities of the federal reserve system are very different topics. And it is possible to know a lot about one of them without knowing much about the other.


So we're moving goalposts; first it was "actual contributions to economics or the history of economics," now it's just "economics."

Cool.


I have simply responded to your comment that seemed to me to be full of errors and misunderstandings. I even tried to not be antagonistic, yet you seem to be hostile and view it in the framing of goals and goalposts.


You make my point perfectly by accusing me of Marxism out of the blue, when I never even manifested any opinion on anything even near it.


Grabber was an anarchist, neither a communist nor Marxist.


Autocorrect strikes!


Right, but are there _actual_ contributions to economics or the history of economics that were consequences of his “stimulation”?

Also, saying that he single-handedly prompted “economists and historians alike to incorporate broader socio-cultural understandings into their analyses” is a huge denigration of institutional economics, behavioural economics, Austrian economics, social economics, etc.


It's a relatively recent addition to the discourse, having been published just over a decade ago.

It's definitely a part of the heterodox tradition in economics (without diminishing what's already there), which often takes longer to be integrated into the mainstream.

So, I guess time will tell?


I would even take any contribution to heterodox economics. But what was actually contributed?



> Graeber is not well-versed in economic theory

David Graeber was a professor at the London School of Economics. It appears they believed he was well-versed in economic theory :)

https://blogs.lse.ac.uk/condolences/2020/09/03/professor-dav...


He was a professor of anthropology.


David Graeber was an anthropology professor at the London School of Economics, with a focus on Economic Anthropology. He was also an anarchist and activist. His most significant contribution to the field of economics is his critique of two important ontologies of orthodox economic common sense.

First, Graeber challenged Adam Smith's idea that the history of economics evolved from barter to money to credit. According to Graeber, this sequence is actually reversed: credit systems existed first in pre-money societies, where neighbors kept track of mutual aid. Barter only became prevalent when money was introduced and sometimes unavailable.

Second, he critiqued the socialist theory of primordial debt, which suggests that humans are born with an infinite debt to the cosmos. This theory often uses religious language to support its claims and argues that the government inherits this cosmic debt, leading to total control over money and markets. Graeber distinguishes between religious or moral debt and economic debt, arguing that they operate on different logics.


None of that has much to do with modern economics. Just like criticising alchemy doesn’t make you a contributor to chemistry.

Buy MWG’s Microeconomic Theory and study it. What does it have to do with Adam Smith’s theories on barter or someone’s (whose?) theory on primordial debt?


False equivalence. The history of economics to economics is not analogous to alchemy and chemistry. You're creating a straw man.

As for the myth every student of economics learns, that money grew out of barter: the idea is that monetary exchange solves the problem of the double coincidence of wants. Money makes trade much easier, so the story goes, and thus becomes a remarkable example of both human ingenuity and economic progress. But it isn't true; there's no evidence to support it (no evidence that money came from barter, etc.).

Pick up any Economics textbook and look up the definition of "Traditional Economy," here, I'll do that for you.

"A traditional economic system is based on customs, history, and time-honored beliefs. A traditional economy is an economic system in which traditions, customs, and beliefs help shape the goods and services the economy produces, as well as the rules and manners of their distribution. Countries that use this type of economic system are often rural and farm-based. Also known as a subsistence economy, a traditional economy is defined by bartering"

We know that's not true. We know there's no evidence to support it.

So what does modern economic theory have to say about Traditional Economies? Not much apparently.

And that's the point!

Adding to this, I'm familiar with MWG's Microeconomic Theory, and while it's an excellent resource for understanding the mathematical models used in economics (and by economics, I mean systems where there's a market, money, all actors have perfect access to information, there are property rights, etc.), it doesn't offer much insight into the historical or anthropological questions that Graeber raises.

Both approaches have their value, but they serve different purposes and answer different questions. Learning or applying MWG in no way subtracts from Graeber's insights.

You can't apply or generally model a "traditional economy" using MWG.


> False equivalence. The history of economics to economics is not analogous to alchemy and chemistry. You're creating a straw man.

Smith, Mill and Ricardo are to economics what alchemy is to chemistry.

> As per the myth every student of economics learns, that money grows out of barter.

I wasn’t taught that. I was taught game theory, the Arrow-Debreu model and statistics.

> Pick up any Economics textbook and look up the definition of "Traditional Economy," here, I'll do that for you.

I don’t remember my textbooks saying much about traditional economies.

> "A traditional economic system is based on customs, history, and time-honored beliefs. A traditional economy is an economic system in which traditions, customs, and beliefs help shape the goods and services the economy produces, as well as the rules and manners of their distribution. Countries that use this type of economic system are often rural and farm-based. Also known as a subsistence economy, a traditional economy is defined by bartering"

I don’t know which textbook it is from. It also doesn’t go much into detail what it means.

> We know that's not true. We know there's no evidence to support it.

We know that metallic money was the norm during those times. That probably was the intuition and evidence behind the barter idea.

> So what does modern economic theory have to say about Traditional Economies?

> Not much apparently.

> And that's the point!

The point you were making initially is that modern economic theory makes false claims about barter. In fact, it doesn’t concern itself with it much outside of niche subfields. That makes Graeber simply wrong.

> it doesn't offer much insight into the historical or anthropological questions that Graeber raises.

Yes. And it doesn’t claim to. So what is the problem? How can it be wrong about things it doesn’t assert or imply?

> Learning or applying MWG in no way subtracts from Graeber's insights.

Yeah, but learning about modern economics from Graeber would make you confused and mistaken. He should have had the courtesy not to speak about things he didn’t know.


> I don’t remember my textbooks saying much about traditional economies.

That's the point.

> Yeah, but learning about modern economics from Graeber would make you confused and mistaken. He should have had the courtesy not to speak about things he didn't know.

What is he speaking about that he doesn't know? Also, did you even read the book? I'm getting the sense you didn't.

He's not talking about modern economics, he's talking about the history of debt. This isn't hard to understand.


> thats the point.

> He's not talking about modern economics

I wish it were the point. I would jump on that bandwagon for a ride with him. But the point is “contribution to questioning established economic assumptions”.

So we get weird statements like "the Myth of Barter cannot go away because it is central to the entire discourse of economics" that people parrot on the Internet after reading Graeber, despite the fact that prehistoric barter, or its absence, is inconsequential for modern theory.


From my limited number of anthropology courses, I can assure you that the suspicion between the anthropology and economics departments is often mutual.

You know the joke about how physics is the study of spherical cows of uniform density in a frictionless vacuum? That's because intro level physics makes lots of simplifying assumptions. And if a physicist tried to use those assumptions to lecture a dairy farmer, the farmer might assume the physicist was a fool.

Anthropology tends to assume that too many economists study "spherical humans of uniform density in a frictionless market," basically. One of my anthropology professors actually covered these disputes, including specific cases where U Chicago economics professors attempted to advise governments around the world, and wound up totally misunderstanding particular situations.

Now, anthropology has its blind spots, too. Cultural anthropology has been a bit too willing to believe research describing exotic social structures. Archaeology is fairly sound on the nuts and bolts of pre-historical goods and food sources, but it is sometimes blind to how ideology shapes culture. (Which is a safely conservative stance to take when working with pre-historical cultures, to be fair.)

The factors Graeber describes aren't totally surprising. Lots of real world economies run on complicated webs of personal relationships and favors—just look at investors, for example. Or look at the pre-modern property rights described in Seeing Like a State. Land ownership and harvesting rights in a medieval village could be ridiculously complex. Or the customary payments and "gifts" that people made. For another modern example, consider office politics in a large corporation.

I think Graeber's basic case is plausible: complex debts and obligations seem to underlie many band-level and village-level societies, especially when central state power is weak. These arrangements can seem bizarre: I remember a video of an interview with a pig farmer, probably about 60 years old, who was organizing a gift of hundreds of pigs to a neighboring village. This was apparently some kind of competitive gesture designed to elevate the status of the giver. And the farmer was really into this. He was complaining that kids these days were shockingly lazy, and that they had no appetite for hard work, and that they had no hope of putting together a proper gift of pigs. And how can you get anywhere in life if you can't embarrass a neighboring village by giving them more pigs than they gave you? It was a status display, similar to throwing conspicuously expensive parties to outdo your social circle.

The anthropology literature contains a ton of odd behavior around debts, obligations, and complex traditional rights. In a pre-modern community of 60 to 5,000 people, only a fraction of the economy seems to involve currencies or direct barter. Currency is a fantastic simplifying technology. And as Graeber points out, complex traditional webs of debt can be pretty brutal towards people who don't fit in.


I think this whole discussion is related to epistemological attitudes. Anthropology and history tend to lean more heavily into the irreducible complexities of society, and to analyze and describe them in detail while usually avoiding deducing grand theories. Economics, on the other hand, is very used to a more reductionist, mechanistic view inherited from the political success of early 20th century physics in changing the outcome of wars. Economists saw a chance to be taken seriously in policy making by pretending they were doing social engineering.


Obviously his contribution is shattering the barter myth. You might consider that insignificant, but it is something.


I don't mean to be uncharitable but you need to read up on what actual experts say about things Graeber says. He's Joe Rogan level knowledgable, and that's not a compliment.


Where can I search for this kind of discourse? There's a lot of "actual" here without names.


I find that this is true about most non-fiction books I have read. Even when extensively cited, I often get the impression that the author is drawing large conclusions from small data.


Mortimer Adler says that history is closer to fiction than many other forms of books. It's not even data. Autobiographies in particular are rarely true, but not entirely false. It seems that the more data one has, the more likely they are to be biased in interpretation.


Read some actual history books (as in, academic historians, not pop historians).


Got any recommendations? I am pretty deep into the Hamilton biography which seems very extensively researched and widely praised. It's an excellent story if nothing else.



And very tendentious.


I've seen some work on this as well. It would make sense in small tight knit communities to help out your neighbor when you could and vice-versa. There would be a relatively small need for coinage until things became a lot more complex and urbanized.


That's still true nowadays if you limit 'cash' to just physical currency.


It's not quite the same; I explain why in the comments below this one. :)


As early as the Bronze Age, Britain was part of wide-spanning trade networks that funneled Cornwallish tin to the empires of the Near East. I would imagine that even before the invention of true coinage, various quantities of gold and other precious metals were circulating in Britain from those Mediterranean sources.


> As early as the Bronze Age

Note that while Great Britain didn't have a whole lot of gold, it did have a whole lot of tin.

You can't make bronze without tin. People who want to make bronze will give you gold for it.


>Cornwallish

Cornish would do


Interesting question - it seems like there were Welsh mines in Roman times:

https://en.wikipedia.org/wiki/Dolaucothi_Gold_Mines

> They are the only mines for Welsh gold outside those of the Dolgellau gold-belt, and are a Scheduled Ancient Monument. They are also the only known Roman gold mines in Britain, although it does not exclude the likelihood that they exploited other known sources in Devon in South West England, north Wales, Scotland and elsewhere.


Thanks.

The wiki on this mine is quite extensive.

https://en.m.wikipedia.org/wiki/Dolaucothi_Gold_Mines


To complement the answer "yes, there was surface-level gold which was simply mined, and early": 2000 years ago is also not that long ago. By that time there was commerce going on across all of Europe, and gold had been used in coins, jewelry, and cult items by most of these cultures for a long time BEFORE that. So any specific gold could actually have come from elsewhere.


I have been in a gold mine in Wales; I think it was this Roman one: https://www.visitwales.com/attraction/visitor-centre/dolauco...


Gold is relatively common; it's just that most mines were mined completely, or to the point where further extraction was not economically viable, so few were preserved. Many mines that have been used recently for other minerals did contain some gold in the past.

From the first Google link (https://www.bullionbypost.co.uk/index/gold/gold-mining-in-th...):

"Gold has been mined in Scotland for over 2,500 years. There was gold mining in Crawford from the early 1500s" - and that's just a few examples.

I recommend reading about or visiting Great Orme if you are interested in mining; it's a copper mine that has been in use since the Bronze Age.


What’s surprising to me is that cultures all across the world agreed that this shiny metal was valuable and could be readily exchanged anywhere for goods and services.

We know that gold is valuable today because of its distribution, availability and metallurgical properties. But random tribes who haven’t even seen an iron tool somehow decided that this shiny metal was scarce and valuable enough to hoard and desire.

Is it something in the metal itself?


It's shiny, very easy to work, and doesn't rust. The complete opposite of iron, which didn't really become that widespread until the Bronze Age trade routes collapsed (tin and copper aren't found in the same place). Bronze was also a much nicer metal than iron back in those days: much, much easier to work and about as strong.


Incorruptible - as in, it doesn't rust - and commonly found in a state that doesn't need to be refined.


> Is it something in the metal itself?

Yes. I never understood it myself, until the first time I held a heavy gold necklace. The feeling is hard to describe.


I've felt that too. But then I wonder if this is a true feeling, or if it's just cultural training: countless stories have consistently told me that gold is valuable.


> Is it something in the metal itself?

Gold can be easily melted to combine larger and smaller portions together or split them up. It is "inert", meaning it does not easily react with anything else, so you don't lose it too easily. It is a great store of value.

And think about why Bitcoin costs so much: because of its scarcity.


Scarcity is actually an incredibly valuable property. Control over gold production meant control over the money supply. Otherwise, anyone could inflate your currency into oblivion.


Just imagine what would happen if the money supply consisted of pieces of paper that the government could print any time it wanted more money!


Wait until you learn that banks can just make more money by increasing numeric values in databases. "Money supply" is very poorly understood in armchair denouncements of modern central banking.


> Wait until you learn that banks can just make more money by increasing numeric values in databases.

I know how fractional reserve banking works. That "more money" is represented by the collateral for the loan.


Unprecedented wealth and living standards?

It's not very evenly distributed wealth, but our enormous economies are not limited by supplies of an arbitrarily valuable metal.

All money is imaginary anyway.


You can't print wealth from a printing press.


Probably mostly because it was scarce. Additionally, it would've been the only metal that didn't tarnish, while other metals did.

I mean, gold has always been and will always be (until we start space mining, I guess) scarce; you can see similar effects for things like aluminium, for example here: https://clintonaluminum.com/aluminum-was-once-worth-more-tha....

So based on that I would say it was purely thanks to scarcity. Money is used to represent x human effort/time after all; it takes time to find the gold, then time to process it. But the same applies to various other weird currencies used in the world like seashells/beads: https://en.wikipedia.org/wiki/Shell_money#:~:text=Shell%20mo....

I feel like there's an intermediate step with these where seashells/beads/similar objects had cultural significance and therefore value because they had to be found (and sometimes worked into beads). It's a step from pure trading of useful items like a knife or an item of clothing to an intermediate currency like seashells, then to precious metals (which are still valuable even if melted down) to what we have now (currency as a symbol of trust that the currency is worth what it's worth as enforced by a government or some other system).


Have you never held or owned any gold? It's amazing. It's like really hard clay, but it's metal, AND it doesn't rust. It's amazing! It's hard to understand the question, to be honest. I mean, it's GOLD!


Gold is both easy to work into finely detailed jewellery and THE shiny metal - a gold brooch, or anything gold, would have shone out on your clothing far more than any other metal.

See this video of a Roman era coin being unearthed by a metal detectorist - untarnished or corroded after 2000 years:

https://www.youtube.com/watch?v=CsmF3p4jVV0&t=10s


How many products from the Roman empire does this buy you? None? It didn't store anything. It's not a store of value at all.


They didn't necessarily need gold. For example, in Bronze/Iron Age trading it was common to use relatively standardized small bars of other metals for payment. These were not coins, but they had approximately the same function.


In a few rare spots of the world, gold is literally just in the dirt and rocks.

There's certainly some gold in the UK - probably still thousands of extractable tonnes. Whether a deposit is economical to extract is a different question. Very few sites in the world can compete with the gold mines of Canada, China and Australia, with their very rich deposits.

> Was it all relatively surface level and rapidly mined out, and now all gone?

To some degree this is a factor. Copper and tin are other resources you'll find are already heavily extracted in Europe:

> The main mining district of the Kupferschiefer in Germany was Mansfeld Land, which operated from at least 1199 AD, and has provided 2,009,800 tonnes of copper and 11,111 tonnes of silver. The Mansfeld mining district was exhausted in 1990.

It's not so much that they literally ran out - there's still plenty of copper there. But it was only viable to run in the East German (Communist) economy. Now that most is extracted, there are diminishing returns: it takes more labour, processing, etc. than extracting from a deposit elsewhere would.

When Europeans came to North America and reached regions that had never had a particularly high population density and had never had much mining - like in parts of the Rocky Mountains - they sometimes literally found gold dust lying at the bottom of riverbeds and chunks of gold ore sticking out of the side of cliff-faces. (Cue up a gold rush.) Europe's first large-scale miners probably had a similar experience of abundance once, many thousands of years ago.


It came from trade. The exact provenance was not important.

Herodotus tried to figure out where all the “stuff” is coming from, but mostly found stories he admits are far-fetched.

Modern historians take pleasure in proving his “myths” to be fact.

“Apparently there is some place in Asia where gold is mined by ants!”


Most of the economy back then was non-monetary.

If you're interested in this area, look up the Inca Empire. It did not really have money at all.


I'm fairly sure that's not accurate. Source for "most" economies not having money?

It would appear, based on some simple googling, that "money" has existed in many cultures going back 30,000 years, in two forms: "money of account" and "money of exchange". Both of those have taken various forms. Minted coins did not appear until around 3,000 years ago.


"Monetary economy" means that it uses money for the exchanges, rather than barter. Most pre-industrial economies were not monetary.

A barter economy certainly existed, probably from before Homo sapiens. But _money_ is a relatively recent invention.


> Barter economy certainly existed

What is the best evidence for this historically? Anthropologists strongly dispute this idea, and believe barter was mostly used for trade between total strangers (e.g. traders from outside your society or "economy")

Graeber's Debt: the first 5000 years covers this topic


> What is the best evidence for this historically?

Mostly archeological. There are many burials that contain items that were clearly not locally sourced. In some cases, they had to be transported for thousands of kilometers.

And quite often this was done for non-functional items such as jewelry or dyes.


Er... money has existed for thousands of years, and has replaced barter in any society with even moderate amounts of specialization and a population size that gets into the thousands. In Roman times, this had already been the case for thousands of years. Money is one of the great enablers of trade and specialization, of empire building. A barter economy cannot sustain any of that, because a barter economy does not scale. Money is a relatively recent invention on the time scale of our species' existence, but that's still 3-4 thousand years of near-ubiquitous use, minimum.


Was there money in pre-1778 Hawaii? Not that I have been able to figure out. I believe it was a gift economy.

There certainly was specialization in Hawaii, and with a population of over 100,000, it would seem like a good counter-example.

> Barter economy cannot sustain any of that, because barter economy does not scale.

From https://en.wikipedia.org/wiki/History_of_money , "There is no evidence, historical or contemporary, of a society in which barter is the main mode of exchange;[23] instead, non-monetary societies operated largely along the principles of gift economy and debt."

https://en.wikipedia.org/wiki/Non-monetary_economy#Other_mon... list other money-less systems including "the Incas and possibly, also the empire of Majapahit". Both were empires.


Hawai'ians had a number of skilled trades, and a caste system where a working class served a ruling class. Taxes were levied by the higher classes, and were paid in an amount based on the unit of land that was worked via subsistence farming. Those taxes were paid in the form of material goods (textiles, livestock, agriculture, etc.). The material goods were the medium of exchange used to pay back a debt.

Therefore, Hawai'i did have money, in the form of commodity money (objects having intrinsic value in addition to value as a method of payment), which is distinct from barter in that there are specific recognizable units of exchange (specific amounts of commodity money used to pay a specific amount of debt). Material goods were also used as money for trade between islands.

[1] https://evols.library.manoa.hawaii.edu/server/api/core/bitst... [2] https://en.wikipedia.org/wiki/Ahupua%CA%BBa [3] https://www.nps.gov/parkhistory/online_books/kona/history1g.... [4] https://web.archive.org/web/20140605052446/http://www.hawaii...


When you say "taxes", I worry that your are imposing your view on the system.

Your [1] starts "The concept of private property was unknown to ancient Hawaiians" and says:

> Many Native Hawaiian scholars today make a distinction between the annual exchange before and after written tax law. Ho‘okupu, the term used for the exchange before written tax law, is similar to ‘auhau, the term used after written tax law was instituted. Both refer to the requirement to provide labor or a portion of an individual’s labor production to a governmental agent, but as noted earlier, ho’okupu literally means “to cause to grow.”

> Some Native Hawaiian scholars believe that ho‘o kupu and tax are antithetical ideas, because, they argue, ho‘okupu was generated by the person who gives, while taxes were demanded from the person or group that receives.

Your [3] points out "Actually because the chief upon whose lands they lived owned all the land and resources in an ahupua'a, in a sense the tenants were only giving these resources to the rightful owner, in a useful form and upon demand, on a gift-tax basis."

If you own everything, how do you tax it?

If you own no private property, what does it mean to tax it?

Animals exchange goods - does that make it a monetary system? Eg, "Reciprocal Trading of Different Commodities in Norway Rats" at https://www.cell.com/current-biology/pdf/S0960-9822(18)30003... .

Why does Wikipedia list non-monetary cultures?


Every article I found on the history of the practice stated they were taxes. They were called "tributes" at the time, but the practice of landed gentry getting payment in exchange for protecting (or not hurting) you is a form of tax, regardless of whether you're doing it willingly or not. Even paying tribute to the gods is a form of tax, because people are afraid that if they don't pay tribute, the gods will be angered and destroy their crops. Tax is really just materials the powerful use to rule or control.

Property ownership is not inherent to taxation; there are many forms of tax.

There's no reason animals can't have a monetary system. We are animals after all, even if some people like to pretend we're not. https://en.wikipedia.org/wiki/Prostitution_among_animals


See, this is why I think you are interpreting through a very specific lens, and I am not convinced it applies.

"Landed gentry", for example, is a particularly British was of looking at things. Your source [1] says "Using a feudal metaphor that many Native Hawaiian scholars reject today, Richards described the problems with several layers of chiefs, all of whom could demand ho‘okupu." (https://en.wikipedia.org/wiki/Feudalism notes issues in extending concepts from feudalism to other cultures).

I know in ethnology there was a long history of viewing everything through a Western European structure, even when it disagreed with the data. It's taken ethnologists a long time to pull back some of those blinders. From what I understand, ethnologists are often annoyed at economists who keep using outdated ethnology. (For a traditional example, the idea that before money there was only barter, when no culture has ever been shown to be based on a barter economy.)

Why then should I not trust the Native Hawaiian scholars who presumably have a better understanding of the topic and say this was neither a tax nor feudal?

> Even paying tribute to the gods is a form of tax, because people are afraid that if they don't pay tribute

Yes, squint hard enough and anything can be tax.

If a husband and wife decide to merge incomes, with the wife deciding how the money will be spent, that could be seen as a 100% tax on the man's income.

(Yes, either one could decide to not continue this arrangement. The materials you point to also highlight that Hawaiians were not bound to the land, and could move should the chief not be to their liking.)

If a skilled slave is sent to do work on another estate, and the slavemaster profits from it, giving the slave only room and board, that could also be seen as a tax, yes?

But it doesn't seem like a useful way to describe either relationship.

Which is why the "Prostitution among animals" gives alternatives, like "The researchers speculate about the possible genetic fitness advantages and disadvantages of the practice, and aren't altogether sure that the female copulates mainly in order to obtain a stone" and "females within the meat-sharing community tend to copulate with males of their own meat-sharing community. Direct exchange of meat for sex has not been observed", with only a single example of the latter exchange among capuchin monkeys.

Or from my link, using the phrase "reciprocal altruism" instead of "monetary system"?

Is "reciprocal altruism" always the same as "monetary system"?

FWIW, I entered this thread to respond to kspacewalk2's assertion about "money", at https://news.ycombinator.com/item?id=38060762 , not "monetary system". According to kspacewalk2, money is required to have trade and specialization for any culture beyond a few thousand people. I think you agree that Hawaiians did not have "money" before European contact, correct?


The Roman Empire was basically modern. It had currency, banks, loans with interest, etc.

At the same time, the Slavic countries up north still were pre-monetary. There was little to no currency, but there was extensive trade in fur, salt, and other goods.


> for thousands of years

It depends on how you define money. Coins didn't really exist until the 7th century BC. That doesn't mean long-range, wide-scale trade did not exist for 1000+ years prior to that, but they didn't generally use money (in the way we would understand it, at least), so the boundary between using money and barter wasn't really that clear.


This is highly dependent on the location. We certainly know there was a large interconnected monetary economy around the Middle East and the Mediterranean around 3000 years ago. The Roman Empire was largely a monetary economy as well, about 2000 years prior to what is commonly referred to as "pre-industrial", and they had quite an extensive banking system too.


Good paper from the progenitor of the blockchain, Nick Szabo, positing that the first moneys emerged up to 75,000 years ago and possibly enabled Homo sapiens sapiens to supersede Neanderthals:

https://www.fon.hum.uva.nl/rob/Courses/InformationInSpeech/C...


I seriously doubt that currency (a standardized medium of exchange) existed in prehistoric times. But a barter economy certainly did; we have plenty of archaeological evidence for it.

Still, even the barter economy was used mostly for "optional" activities. People were not dependent on it for survival; a tribe could live just fine on its own, without trade.


> currency (a standardized medium of exchange)

That's a pretty recent innovation; standardized coins didn't appear until the 600s BC, barely 100-150 years or so prior to the Greco-Persian wars. Wide-scale international trade existed for 1000+ years prior to that as far as we know; you don't necessarily need standardized money for that.


Large scale international trade relied on standardized forms of money for thousands of years before the first coins were minted:

https://www.pnas.org/doi/10.1073/pnas.2105873118


I'm not sure why you're describing 600s BC as "recent"?


Because we know that there were much older civilizations during the Bronze Age that relied on long-distance international trade, and at least their ruling classes were dependent on it for their survival.


> Most of the economy back then was non-monetary.

In 600 BCE, Lydia's King Alyattes minted what is believed to be the first official currency, the Lydian stater. The coins were made from electrum, a mixture of silver and gold that occurs naturally, and the coins were stamped with pictures that acted as denominations.

https://www.investopedia.com/articles/07/roots_of_money.asp

But also note that physical currency is not necessary for "money". Money has been around for about 5000 years, ridding us of barter.


They used money. The Incas had a system for accounting using knots https://en.m.wikipedia.org/wiki/Quipu

You don’t need physical coins to have money


Money and debt aren't exactly the same thing though, right? The quipu is a system of IOUs, IIRC - more like the English tally stick. With currency/money (gold, silver, copper, fiat notes), we make a transaction on the spot and we're done. There is no debt in the simple case. The poster is saying they didn't use money, and it sounds like they didn't. They used a system of tracking debts, which could likely be traded.

I know it's all tightly related, but I believe there is a difference.


An accounting system using knots isn't money, per se. These systems of credit were based on mutual trust and social relations, often without a physical representation of money as we know it today.

Comparing this to coinage, the innovation of coins introduced a standardized physical object that could represent value, which allowed for a different kind of economic activity not solely based on personal trust and relationships. Coinage enabled transactions with strangers and facilitated trade over larger distances and among larger groups of people, where personal credit relationships were not feasible.

Money has a specificity to it. In essence, while early credit systems were based on social relationships and trust within communities, coinage represented a more impersonal and widely accepted medium of exchange that did not necessarily rely on social bonds. This distinction is crucial because it allowed for the expansion of trade and the concept of money as an abstract unit of account, rather than a direct reflection of social debts and credits.


Money-as-knots-in-a-rope sounds closer to the modern money-as-bits-on-a-platter than does money-as-metal-disks.


Indeed, the comparison of quipu to modern digital money highlights the diversity of forms that 'money' can take. However, the fundamental difference lies in the functions and roles that these systems serve within their respective societies. The quipu was primarily an accounting tool, part of a complex system of record-keeping used by the Incas, which facilitated the administration of their economy, particularly in terms of tribute and state resources. It did not serve as a medium of exchange in the same way coins or modern digital money do.

Modern money, whether digital or physical, serves several key functions: it is a medium of exchange, a unit of account, and a store of value. While the quipu certainly functioned as a unit of account, it's not clear that it served as a medium of exchange or a store of value. These are essential characteristics that define 'money' in the economic sense.

The impersonal nature of coinage and modern digital money allows them to facilitate trade and economic activity on a scale and with a degree of anonymity that's not possible with a system like quipu, which is deeply embedded in the social and political fabric of the society that uses it.

The transition to coinage and later to digital transactions represents a move towards a more standardized, divisible, and portable form of money that can be used in a wide range of transactions, with or without a pre-existing relationship between the parties involved. This is quite different from the quipu, which was embedded in a specific cultural context and may not have been readily exchangeable or understood outside of that context.

So while it's tempting to draw parallels between ancient accounting systems and modern digital currencies, we must be careful not to conflate the two. Each serves its purpose within its particular economic and social milieu, with specific attributes and limitations that define its use as "money."


> It did not serve as a medium of exchange in the same way coins or modern digital money do.

Source for the confidence here? We know that a corvée economy existed, but I’m skeptical that we can rule out private quipu-based exchange. The evidence base is pretty thin; a lot of stuff didn’t survive Pizarro.


https://www.peruforless.com/blog/quipu/

https://en.wikipedia.org/wiki/Quipu?useskin=vector

They were more like ledgers or logs... not money. (Early databases, perhaps?)


> Coinage enabled transactions with strangers and facilitated trade over larger distances and among larger groups of people, where personal credit relationships were not feasible.

Extensive international trade networks existed during the entire Bronze Age and the preceding periods without any coins, though. Coins are useful as a standardized accounting unit and are easy to transport, but fundamentally are not that different from barter.


I think there is a lot of fantasy thinking that ancient times didn’t use money. Trade is evident from the earliest times as proven through goods at burial sites that originated thousands of miles away. Trade necessitated commoditized assets as intermediary value stores, and common ones included salt and furs in addition to hard metal coins and commoditized metal objects like swords.

Social relationships are still important the higher you go in finance - it’s much easier to get a $100 million loan for a new building with a strong relationship with a banker than as a stranger, regardless of collateral.

I think a pre-commercial time where people didn’t care about money is a fiction.


There's a lot of anthropological and archaeological evidence to the contrary. People indeed had trade and exchanges in ancient times, but these did not always necessitate a formalized system of money as we understand it today. The early forms of trade were often based on complex systems of credit and debt that were deeply intertwined with social relationships and trust within communities. David Graeber's work, "Debt: The First 5,000 Years," highlights that for more than 5,000 years before the invention of coins, humans extensively used such credit systems to buy and sell goods.

While it is true that trade is evident from ancient times, with goods found at burial sites that originated thousands of miles away, this does not automatically imply that all trade was facilitated by a commoditized asset serving as a universal medium of exchange. In many cases, goods like salt, furs, and metal objects were indeed used in trade, but they were part of a broader system of barter and reciprocal exchange, which could function effectively without a standardized form of money.

Regarding the role of social relationships in finance, while it's accurate that relationships remain crucial, especially for large transactions in modern times, this does not discount the fact that in the past, community trust and social bonds were often the primary means of securing credit, not collateral or commoditized money. This is evident in how competitive markets and the scarcity of trust can affect transactions, as Graeber notes through an anecdote where mutual aid within a community was a given, not a transaction requiring formal repayment.

The idea of a pre-commercial time where 'people didn't care about money' may indeed be fictional, but it's more nuanced than simply saying they used money in the way we do now. They cared about value and exchange, but these were frequently managed through social mechanisms rather than through impersonal, commoditized money. It's essential to understand that the concept of money has evolved and that early forms of trade and credit were valid economic systems in their own right, even if they don't match the monetary systems we are familiar with today.


My broader point is that certain people think that there is this utopian “pre money time” where capitalism didn’t exist. I believe capitalism is the default, free trade is the default, and the fundamental idea that people will engage in for-profit commerce is embedded into our psychologies.


Capitalism, as a system defined by profit-driven markets and private ownership, is a relatively modern concept and not the default economic state throughout human history. Earlier societies often operated on principles of reciprocity and communal sharing rather than for-profit trade. While the inclination to trade can be considered inherent, the forms and rules of trade have varied greatly across cultures and eras, shaped by differing social and political contexts.


> is a relatively modern concept and not the default economic state throughout human history

This is speculation.

> Earlier societies often operated on principles of reciprocity and communal sharing rather than for-profit trade.

Any society that had specialization of labor did more than that. Heck, the American Indian tribes measured their individual wealth via horse ownership. They certainly engaged in trade with the intent of profit.


Horse ownership? That was after European contact, right?


Yes. The first thing that happened with European contact was trade. I don't believe they were unfamiliar with it.


That’s not speculation. Capitalism is more than just trading for profit. https://en.wikipedia.org/wiki/Capitalism


From your link:

"Capitalism is an economic system based on the private ownership of the means of production and their operation for profit."


Exactly, "private ownership of the means of production...." That's an important distinction. It also says, "Central characteristics of capitalism include capital accumulation, competitive markets, price systems, private property, property rights recognition, voluntary exchange, and wage labor." Literally the next sentence. I'd rather not argue the very well-understood definition of capitalism.

You can't have wage labor unless you have money as a construct.


> Literally the next sentence

That sentence just clarifies what the first sentence implicitly requires, as you cannot trade for profit unless you have property rights, etc.


They paid people in ancient times. They knew what a typical day's wage was in Athens:

> Also during Pericles' tenure, pay for civic service was instituted. No single other reform furthered democracy as much as pay for service. Now many more people could afford to serve, and for some, serving became attractive financially. First, dicasts, or jurors, began to be paid. A low rate, but half a day's wages or so. That was introduced by Pericles while Cimon was still around, perhaps to counteract his liberality with his own wealth. Jurors were appointed by lot annually and could serve year after year. By 422 (Aristophanes Wasps 662), there were 6,000 jurors per year. Cleon increased the pay rate to 3 obols a day. Pericles also started the payment of soldiers and sailors 3 obols a day.

https://www.uvm.edu/~jbailly/courses/clas21/notes/atheniande....

You have a very formal, academic understanding of capitalism, while we are arguing that the fundamental ideas of capitalism exist naturally because that's logical for human psychology: owning things, trading, markets, paying people for labor. This is all stuff that goes back to the earliest written records. Cuneiform tablets are just endless accounting ledgers.


...so what did these soldiers, sailors, and jurors produce?


They provided services. What does a cashier "produce" at the grocery store? There were also artisans of all sorts that did produce finished or intermediary goods: brickmakers, pottery makers, blacksmiths, weavers, animal breeders, and on and on and on.


The Athenian system, as per the UVM article, doesn't align with modern capitalism. It had democratic features like paid public service roles, but was primarily an oligarchy, not a free market. Economic participation was broadened, but the structure remained class-based, with power concentrated among the elite.

Its economy wasn't a modern free-market system focused on capital accumulation and investment for profit. It was a mixed economy with significant state involvement and a variety of revenue sources that went beyond simple market transactions.

State actors paying their civil servants isn't evidence of capitalism, or wage labor.

( David Graeber does write about ancient Greek city-states and how their coinage came to be, according to the historical and archeological record, in "Debt: The First 5000 Years," by the way. )

Capitalism is a relatively modern phenomenon, with a pretty common, well-understood definition. This isn't a heretical or radical idea. You may find some societies prior to the 16th/17th century that fulfill some characteristics of capitalism, but they don't make the cut.

People naturally want to trade. Markets existed before capitalism and they will exist after capitalism.


> doesn't align with modern capitalism

This smells like "socialism works fine it just that all the failed attempts were doing it wrong".


False equivalence.


Capitalism is about accumulating resources privately, and then using those accumulated resources to invest in earning more resources.

If you believe that early man was capable of thinking, "This shit is my shit, that shit is your shit, and if you take my shit I will punish you", then you have the precursors for capitalism. Academics like to throw a bunch of other nonsense around but that's it.


Of course knots aren't "money", because money also needs scarcity and a way to prevent forgery, but we have plenty of other examples: Rai stones, cowrie shells, other rare things ...


So they invented the modern fiat monetary system before it was cool?


They certainly used _accounting_, but not money (currency). They were not assigning specific monetary values to items.


I think it’s highly likely that a system built for counting was used for counting loans, debts, and resources. The foundation of civilization is resource allocation.


I'm not an expert in this so maybe am just off base.

But the key difference (I have been told) is what you can do with that accounting.

Like, I can walk into a shop and buy anything on the wall with money, whereas that kind of accounting may have very different implications for what you can do with it.

Additionally, I can take money that I gathered from one source and use it somewhere else, and it's fungible in that I can use it anywhere else in the system. If I have a debt to one person in earlier systems that debt may be non-transferable.

If those two elements are true, it becomes very difficult to do a lot of the things that we think of as money, specifically interest and massive accumulation.


That assumes the Inca used formal loans and debts. They certainly used accounting for resources, though.


It is fun to think of knots as rudimentary Merkle trees!


> how did all these countries manage to find enough gold

They may not have. Some gold mixed with silver or other metals may have been common. In other words, counterfeiting, whether officially sanctioned or by thieves, was probably not uncommon.


I have a ‘silver’ Roman coin, but it’s a thin layer and it’s bronze underneath.


Rome certainly did that at different points when funding was needed.


> Was it all relatively surface level and rapidly mined out, and now all gone?

Pretty much. Elemental gold or relatively easy to refine alloys were stripped off the land over many thousands of years. Now we have to go deeper to find more.


There's plenty of gold, or at least there was, in the south-west of Britain back in the day. I'd assume pre-Roman gold was all surface-level stuff, but there were stories of far more industrial processing of silver by the Romans in Spain.

https://www.jstor.org/stable/296070#:~:text=The%20silver%20m....



Remember that issuing coins was also one way to demonstrate that you really were the sovereign. So getting a bunch of gold together and minting some coins was good for "show of authority" and marketing purposes.


Plus, if you run into financial trouble, you can debase the currency for a while before people catch on. I think I read somewhere that this eventually happened to most ancient-world currencies, except the Venetian currency (Venice being run by merchants).


There was gold mining in Britain (specifically Scotland) at least 2,500 years ago [1].

[1]: https://www.bullionbypost.co.uk/index/gold/gold-mining-in-th...


It may not be on the same scale of other countries, but the UK has gold deposits which have been mined for quite some time.


I would guess these currencies took over gradually, and the coins never really disappeared, so the total stock just kept growing.


More or less, yes. Everything that was easy to find and extract largely has been.


I've never heard of a gold mine in the UK, but Welsh gold is something you often see and hear of, whatever that tells you


lol, Age of Empires, man. There was gold just sticking out of the ground everywhere. You just needed some plebs to mine it and carry it back to your keep. Or was that Stronghold? I forget. XP


I think JAX is cool, but I do find it slightly disingenuous when it claims to be "numpy but on the GPU" (as opposed to PyTorch), when actually there's a fundamental difference: it's functional. So if I have an array `x` and want to set index 0 to 10, I can't do:

  x[0] = 10
Instead I have to do:

  y = x.at[0].set(10)
Of course this has advantages, but you can't then go and claim that JAX is a drop-in replacement for numpy, because this is such a fundamental change to how numpy developers think (and in this regard, PyTorch is closer to numpy than JAX).
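
For concreteness, a minimal runnable sketch of the difference (nothing here beyond standard JAX):

  import jax.numpy as jnp

  x = jnp.zeros(3)
  y = x.at[0].set(10)  # functional update: returns a new array
  print(x)             # [0. 0. 0.]  -- x itself is unchanged
  print(y)             # [10.  0.  0.]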


Agree, though I wouldn’t call PyTorch close to a drop-in for NumPy either; there are quite a few mismatches in their APIs. CuPy is the drop-in. Excepting some corner cases, you can use the same code for both. E.g. Thinc’s ops work with both NumPy and CuPy:

https://github.com/explosion/thinc/blob/master/thinc/backend...

Though I guess the question is why one would still use NumPy when there are good libraries for CPU and GPU. Maybe for interop with other libraries, but DLPack works pretty well for converting arrays.
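
To illustrate that drop-in property, a minimal sketch; `cupy.get_array_module` is CuPy's standard helper for writing backend-agnostic functions, and the fallback here is just for machines without CuPy installed:

  import numpy as np

  try:
      import cupy
      get_array_module = cupy.get_array_module  # returns numpy or cupy for the given arrays
  except ImportError:
      get_array_module = lambda *arrays: np     # CPU-only fallback

  def softmax(x):
      xp = get_array_module(x)  # same code runs on NumPy (CPU) and CuPy (GPU) arrays
      e = xp.exp(x - x.max())
      return e / e.sum()

  print(softmax(np.array([1.0, 2.0, 3.0])))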


Why is that? Why doesn't Jax just do something like

    class JaxWrapper:
        def __init__(self, arr):
            self.arr = arr
        def __setitem__(self, key, val):
            # delegate to JAX's functional .at[key].set(val) update
            return self.arr.at[key].set(val)
        ....


On the other hand, if I wanted some scientific NumPy code to run on the GPU, I think rewriting it in JAX would probably be a better choice than PyTorch.


In my experience, the answer comes down to "does your code use classes liberally?"

If not, and you're just passing things between functions, then go ahead with JAX! But converting larger codebases with classes is just significantly better in PyTorch, even if they use different method names etc.


I'm going to disagree here! Classes and functional programming can go very well together, just don't expect to do in-place mutation (i.e. OO-style programming).

You might like Equinox (https://github.com/patrick-kidger/equinox ; 1.4k GitHub stars) which deliberately offers a very PyTorch-like feel for JAX.

Regarding speed, I would strongly recommend JAX over PyTorch for SciComp. The XLA compiler seems to be much more effective for such use cases.
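
For a taste of that PyTorch-like feel, here's roughly the kind of module Equinox encourages (a sketch along the lines of its README example; check the docs for the current API):

  import equinox as eqx
  import jax

  class Linear(eqx.Module):
      weight: jax.Array
      bias: jax.Array

      def __init__(self, in_size, out_size, key):
          wkey, bkey = jax.random.split(key)
          self.weight = jax.random.normal(wkey, (out_size, in_size))
          self.bias = jax.random.normal(bkey, (out_size,))

      def __call__(self, x):
          return self.weight @ x + self.bias

  model = Linear(2, 3, jax.random.PRNGKey(0))
  print(model(jax.numpy.ones(2)))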


Sorry for my potentially VERY ignorant question; I only know functional programming at an average-Joe level.

Why can't you do the first in functional programming (not in this specific case because it's just how it is, but in general)?

And even if you can't do so for some reasonable reason in functional languages (again, in general), what stops us from just adding syntactic sugar to make the first equivalent to the second, to make programmers' lives easier?


There are 2 different aspects people mean when they call something functional programming:

- higher order functions (lambdas, currying, closures, etc.)

- pure functions, immutability by default, side effects are pushed to the top level and marked clearly

The first aspect of functional programming has been already accepted by most OOP languages (even C++ has lambdas and closures).

The second aspect of functional programming is what makes it useful on GPUs (because the GPU architecture that makes them so powerful requires no interaction between code fragments that run in parallel on 1000s of cores). So you can easily run pure functional code on a GPU, but you can't easily run imperative code on a GPU.

You can introduce side effects to functional programming, but then it ceases to be any more useful for GPU (and other parallel programming) than imperative/OOP.
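
A minimal JAX illustration of that second aspect: the jitted function is pure, so the compiler is free to parallelize it, while in-place mutation is simply rejected:

  import jax
  import jax.numpy as jnp

  @jax.jit
  def scale(x):
      return x * 2.0  # pure: output depends only on input, no side effects

  x = jnp.arange(4.0)
  print(scale(x))     # [0. 2. 4. 6.]
  # x[0] = 10         # would raise TypeError: JAX arrays are immutable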


The fundamental reason why many functional languages won't allow you to do the first is that they use immutable data structures.

We could indeed introduce syntactic sugar (`y = (x[0] := 10)`, maybe), but you'll still need to introduce a new variable to hold the modified list.


It's my understanding that, at least in Python, you can't change an immutable data type, but you can just assign new data to the same variable and thereby overwrite it, right? So even if JAX makes its array type immutable, you can still just re-use `x` to hold the new modified array.
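
E.g., something like this (a minimal JAX sketch):

  import jax.numpy as jnp

  x = jnp.arange(3)
  x = x.at[0].set(10)  # old array untouched; the name x now refers to the new array
  print(x)             # [10  1  2]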


Doesn't `[] =` just call a method on the object in Python?

e.g., `x[0] = 10` is the same as `x.__setitem__(0, 10)`, so there shouldn't be any technical limitation to using `x[0] = 10` (says the guy who never even imported jax)


You could do `y = x.__setitem__(0, 10)`, but you cannot assign `x[0] = 10` to a new variable. If `__setitem__` were overridden, you would not be able to distinguish between these cases and raise an error in the second one.
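
A quick plain-Python demonstration (hypothetical `Demo` class, nothing JAX-specific):

  class Demo:
      def __setitem__(self, key, val):
          return "a new object"  # Python silently discards this return value

  d = Demo()
  y = d[0] = 10  # chained assignment: y is bound to 10, never to __setitem__'s return
  print(y)       # 10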


Yes, that makes perfect sense.

I somehow completely missed the assignment part of the second example.

Thank you for the clarification.


Also, conditionals can be tricky (comparisons, if/else) and often need rewriting.
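
E.g. a Python `if` on a traced array fails under `jax.jit`; the usual rewrite uses `jnp.where` (a minimal sketch):

  import jax
  import jax.numpy as jnp

  @jax.jit
  def relu(x):
      # `if x > 0:` would fail here on a traced array;
      # jnp.where expresses the branch functionally instead
      return jnp.where(x > 0, x, 0.0)

  print(relu(jnp.array([-1.0, 2.0])))  # [0. 2.]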


You see this regularly in practice where aggressive data augmentation is used, which obviously is only applied to the training data. But, of course, you'd still look 'overfit' if you fed in unaugmented training data.


It's a problem when it happens suddenly, because you end up with a 'top heavy' age distribution. The elderly-pensioners-to-young-earners ratio explodes, and social security/national insurance cannot afford healthcare/pensions, etc.

This is compounded by what was (until recently) a steady increase in life expectancy.

The really frustrating aspect of this is that pensioners, who traditionally have huge voter turnout during elections, are unlikely to have their benefits cut, because it would be political suicide for a party. The elderly believe they have "paid in" to their plans and deserve them, but outside of private pensions this is usually a significant over-simplification. Instead, what will happen is that the working young will need to pay for the generous promised benefits the elderly are receiving, whilst 'paying in' for much lower benefits for themselves.


This is only disturbing for people who think human work creates the economic value we need, while in reality most of our economic value is created by machines. We don't feel it because it's mostly captured by billionaires, so it seems that we still need to work as much as we used to.

So a future with fewer workers is just a future with poorer billionaires, and they don't like that, which explains why all the media they own are trying to scare us with the aging-population scenario.


I love Nim so much. I really hope some big company decides to use it as their workhorse language - it feels like the only thing holding it back is the lack of corporate 'buy in'. Most similarly good languages (except F# maybe?) seem to get lucky with becoming a poster boy for at least one company.


>Most similarly good languages (except F# maybe?) seem to get lucky with becoming a poster boy for at least one company.

This one I still can't wrap my head around. F# was basically the source of almost all innovation in C#, and C# is still missing some features and was not designed in the same way as F#, so a lot of the features they "stole" feel tacked on.

Microsoft should have just made F# the C# successor.


> Microsoft should have just made F# the C# successor.

You underestimate the importance of compatibility and familiarity for existing developers. You can show C# code to any C++, Java or even JS programmer and expect them to grasp the idea of what's going on very quickly. This is a big deal and should not be dismissed lightly.


In terms of looks, it mostly just has fewer type annotations and fewer parentheses. Shouldn't be an issue if you know Python.


In my experience, algebraic data types alone take some work to get used to using competently, and that's just one of F#'s distinctive features; there are others. Learning Elm (a similar language) definitely made me a better engineer, but I really don't think learning F# is the same as learning Python for someone with experience in imperative/OO languages.

I think the reason it sometimes seems like it is that easy is that, as we gain experience with tools, we tend to forget the difficulties we had learning them.


I had difficulties learning OOP; I always preferred a more functional style, despite the "consensus" needing 20 years to catch on that functional programming alleviates a lot of the issues we have with OOP.

Learning F#, it instantly clicked.

OOP's learning issues are IMHO intrinsic to the style, because it's just so seldom helpful in making programming easier.

Design patterns are just one symptom of this issue; that OOP is largely a misunderstanding is another.

"Real" OOP as practiced in Erlang/Elixir is quite useful.


I think they are making F# the successor, but very slowly, by evolving C#. I would also have preferred the direct path.


I had a brief look at F#; it didn't fit my mental model, and I looked away. Meanwhile, I do like using the tacked-on functional features in traditional languages, because they let me get the benefits when they fit what I'm doing, instead of having to fully buy in.


You're just using the tacked-on features as syntactic sugar. You haven't "bought in" at all.

I was traditionally trained like you under OOP and procedural methodologies.

After encountering FP, I fully bought in and developed two mental models. The FP model lives side by side with the OOP/imperative model.

With equal knowledge of both, one can make a more unbiased judgement. The FP model is actually superior, IMO.

I largely have the opposite strategy now when programming. My mental model is largely FP, and I occasionally cheat and sprinkle in procedural or OOP syntax here and there as syntactic sugar.

The basic realization here should be that mutating shared state should be avoided and segregated as much as possible.


> syntactic sugar

To be fair, all languages are syntactic sugar over asm, which is syntactic sugar over CPU machine code. A mental model is just that: a model, in your head, for something physical. FP and OOP are both mental models for programming in nearly any language. Some languages have first-class FP or OOP features, and some don't, but you can tack either of these (and others) onto nearly any language you want to. It's really dumb to look down on others' choice of mental model if it works for them. I have seen successful projects that were designed around both OOP and FP models.


Garbage collectors, type systems, modules, and polymorphic functions are not syntactic sugar. Asm also isn’t syntactic sugar, if only for the polymorphism (how many different CPU instructions are MOV in asm again?), or just the fact that it’s text while machine code is binary, whereas desugaring code doesn’t change its language.


F# is such a nice language: a great modern ML language on the .NET runtime with full MS package support, which is pretty awesome. But unlike something like Scala on the JVM, F# is simple and straightforward; it has a lot of features without feeling bloated. The fact that MS has just abandoned F# is disappointing.


> Microsoft should have just made F# the C# successor.

Their target audience does not want this.



He said he works on some internal tool, but it's not clear how wide adoption is outside of that.


Yup, corporate buy-in is a big deal. Years ago I tried using Nim for a network daemon, but the sheer lack of even the most basic HTTP libraries prevented me from doing it, and I went with Go.

The library situation is a bit better now. But without a big company contributing in terms of libraries, the language usage will just be very low.


Nowadays Nim at least has multiple competent HTTP libraries; see https://nimble.directory/search?query=http


Not in that list, but nim-chronos also provides a solid http server/client:

https://github.com/status-im/nim-chronos


Ditto! Nim is just easy and incredibly fast (to develop in, and the code itself!). Our code base is Nim-based; even normal bash scripts have been replaced!


What app are you developing with it?


A full project management platform for construction projects. Webserver, websocket, microservices, etc.: https://cxplanner.com


While we’re not a big company, my work uses it as our secret weapon for embedded firmware development.


If only it supported RV32IMAC and/or Xtensa, one could use it on one of the ESP32 variants.


Nim supports both, since it compiles to C that pretty much any C89 compiler can build. Also https://github.com/elcritch/nesper :)


Oh nice! This looks very tempting…


Feel free to drop by Discord #embedded if you're tempted enough.. ;) I've had production devices running on it for years.


We use the ESP32-S3 as our main microcontroller!


Do you do FPGA designs with it, or just bare metal software?


Software on microcontrollers :)


Programming language adoption -- like the adoption of many technological products -- is usually really fast. With the possible sole exception of Python, all programming languages, popular and unpopular alike, have reached the general ballpark (i.e. high/mid/low) of their all-time peak market share within 5-10 years. Dark horses are very, very rare.


C++ was designed in the late '70s-early '80s and reached peak popularity in the mid-to-late '90s. JavaScript appeared in the mid-'90s and didn't see broad adoption until about the '10s. I would say that the norm is for peak adoption to happen at around the 10-15 year mark from original release. Languages that get adopted faster than 7 years are unusual, and it indicates that they came out at highly opportune times to address specific needs. I'm thinking in particular of Java and Rust, for example, which both reached critical masses of adoption very quickly.


C++ was released in 1985, and it came much closer than just the ballpark to its peak market share within its first decade. JS did the same.

> Languages that get adopted faster than 7 years are unusual, and it indicates that they came out at highly opportune times to address specific needs.

Not only is it not unusual, over hundreds of languages I think there has really been one exception (for 10 years; maybe not 7). At age 10 how well a language does is more or less how well it's ever going to do. I am not saying this is a prediction, but it has been the case historically with almost no exceptions. Any language has the chance to buck this trend, but it has been the trend.


JS took well over 10 years to reach massive popularity. C++ took a bit over 10.


That's a definite no on either one. JS was very popular (top 5-10) in 2006 and C++ was very close to its peak popularity in 1995 (and already declining five years later). Maybe Ruby took 11 years, but that's close enough. Over scores or even hundreds of languages, I think Python is the only clear exception. It wasn't unpopular in 2005 (it was in the top 20), but it certainly moved up a rank from middling popularity to super-popularity well after 10 years old.


I can't see how there could be a reliable pattern in language usage like that. JavaScript is more used than ever (who knows where the all-time market-share peak is), but we're well beyond your 5-10 year time frame. Could you name any languages that actually show this pattern? Julia, maybe.


All languages, except possibly Python, have followed this pattern. Read my comment carefully: I said that virtually all languages reach their ballpark market share within 10 years. JS did reach its high-market-share ballpark within a decade.


Essentially nobody was a "JavaScript Developer" in 2005, 10 years after JS's release. JQuery didn't exist yet, the term "AJAX" hadn't been coined, and NodeJS was years away. Web developers were using JS to enhance pages, but it wasn't a language that was widely used to write applications. I don't know what the popularity numbers looked like so I'm not saying you're wrong, but it was definitely closer to 15 years before people started to take JS seriously.

JS aside, I think there's probably some survivorship bias going on here. I don't think there are a lot of 15 year old, relatively unpopular languages that are still under active development. Maybe it's not that languages that don't become popular in 10 years never will, but rather that languages that don't become popular in 10 years tend to be abandoned by their developers, thus sealing their fate.


JS was a very popular language in 2006 (at least in the top 10 if not top 5) -- a lot of people were doing at least some JS development -- even though it grew bigger later. Nevertheless, JS (like Objective-C and Swift) is indeed special in the sense that it's the "monopoly language" for a particular platform.

> I don't think there are a lot of 15 year old, relatively unpopular languages that are still under active development.

There are quite a few. If you look at virtually any language ranking, places 10-40 have many 10-year-old and even 15-year-old languages. Here are some continuously developed >=15yo languages that are less-than-middling-popular and have always been so: Common Lisp, Racket, Clojure, OCaml, SML/NJ, F#, Tcl, Haskell, Idris, Groovy, Squeak, Erlang, D, Ada, Nim. There are more of these than there are super-popular languages. (I didn't include any language that was, at one time, at least somewhat popular but is no longer, such as Visual Basic, Delphi, and Perl 5.)

There might still be survivorship bias, but I am not saying all that is fate, just a clear historical observation.


I understand what you are writing. But it only holds for you because your idea of a ballpark is exceedingly wide. It doesn't hold at all for the usage numbers I look at, unless your so-called ballpark spans several orders of magnitude.


JS's market share has not grown by even one order of magnitude since ~2006, nor anywhere close to it. There aren't that many orders of magnitude for a share to grow if you start counting at ~1%.


In general you are correct. However, language adoption can skyrocket when some new library/framework arrives which has a compelling reason to switch (e.g. Rails for Ruby)


OCaml managed, but it took a significant effort to improve the general tooling ecosystem.


Unfortunately, it's too good of a language for serious corporate adoption...


Companies base their choice of tools on how easy it is to find developers to hire. Therefore Nim and other similarly young but very powerful/optimized languages (Crystal, Zig, ...) need many independent developers, who can base their choice purely on technical merit and personal taste, to use them and put them on their resumes before the corporate world will notice and add them to its list of accepted tools. It's a quite slow process, though.


The main problem is that people are afraid of the style insensitivity it enables by default.


It's set to produce a compiler "Warning" by default now.


(for inconsistent usage)

As background, style insensitivity was introduced so that codebases can use a consistent camelCase or snake_case regardless of the style used by upstream libraries.
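
Roughly, the identifier-equality rule can be modelled in a few lines of Python (a sketch of the rule, not the compiler's actual implementation): the first character is compared case-sensitively, and the rest case-insensitively with underscores ignored.

  def nim_ident_eq(a: str, b: str) -> bool:
      # first char is case-sensitive; the rest ignores case and underscores
      norm = lambda s: s[0] + s[1:].replace("_", "").lower()
      return norm(a) == norm(b)

  assert nim_ident_eq("fooBar", "foo_bar")     # same identifier to Nim
  assert not nim_ident_eq("FooBar", "fooBar")  # first character differs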


Which as an aside, makes writing and using C bindings so lovely.


Well, maybe not writing. Occasionally low-level C libraries, especially those that deal with keyboard input, decide to provide identifiers differing only in case... There's a WIP RFC for providing a way to deal with identifiers that need to be matched verbatim by surrounding them with backticks, though.

https://github.com/nim-lang/RFCs/issues/477


I am in a position to do this, and I picked Kotlin instead, as the tooling is not good enough yet. Without a good IDE experience you lose some of the productivity that Nim potentially offers.

