A ceasefire without security guarantees is just a temporary pause for Russia to rearm and continue. None of the justifications for Russia's invasion have changed, and this would mean the US will roll over for anything.
Sure; so maybe his focus should be on attaining those security guarantees. If not from the United States, then from other countries; is the United States the only World Police on the globe? Why is the Moral Onus on the United States to be the only ones willing to say "Yeah, if you invade Ukraine again we'll put boots on the ground"? Why isn't anyone asking Germany, or the UK, or Poland, to make those same guarantees to Ukraine? I mean, sure, there are genuine concerns about Article 5 if anyone in NATO makes those guarantees, and I don't know what the impact of that would be, but looking past that for a second: it's utterly insane to me how much of this conversation involves placing some kind of Moral Responsibility on a country half the world away from the conflict.
> and this would mean the US will roll over for anything.
This is a war between Ukraine and Russia, not the US. Nobody should want western powers to enter this war because of the potential consequences. But if western powers make security guarantees, isn't that what you risk?
I think it doesn't need to be specifically text based, but given that LLMs are usually trained on primarily text (at least currently), I'm not sure they'd be meaningfully able to generate binary directly.
As for using DBs, that's certainly an option (e.g. LangChain and such), but at some point you still need to bring the data into the context, so I'd say it's still interesting to consider what an efficient way to represent that data as text would be.
I ran a small test comparing different data serialization formats for use with GPT models (and possibly other LLMs). This is obviously very limited, but it was striking how much of a difference switching from JSON to something like YAML could make.
I wonder if we might also see LLM-specific data serialisation formats in the future, designed to use tokenization as efficiently as possible and enhance the generative capability of the models.
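To make the JSON-vs-YAML point concrete, here's a minimal sketch. The sample records and the hand-rolled YAML-style serializer are purely illustrative (not from the original test), and character count is only a crude proxy for token count; a real comparison would run both strings through an actual tokenizer such as tiktoken.

```python
import json

# Hypothetical sample records, for illustration only.
records = [
    {"id": 1, "name": "alpha", "price": 9.99},
    {"id": 2, "name": "beta", "price": 4.50},
]

def to_yaml_like(rows):
    """Serialize a list of flat dicts in a minimal YAML-style layout.

    Most of the savings over JSON come from dropping the quotes,
    braces, brackets, and commas that JSON requires around every
    key and value.
    """
    lines = []
    for row in rows:
        first = True
        for key, value in row.items():
            prefix = "- " if first else "  "
            lines.append(f"{prefix}{key}: {value}")
            first = False
    return "\n".join(lines)

json_text = json.dumps(records)
yaml_text = to_yaml_like(records)

# The YAML-style rendering is noticeably shorter in characters,
# which tends to (but does not always) translate into fewer tokens.
print(len(json_text), len(yaml_text))
```

Whether the character savings actually carry over to token savings depends on the tokenizer, since common JSON punctuation sequences often merge into single tokens, so measuring with the target model's tokenizer is the only reliable check.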
How does this compare with https://www.tabnine.com/enterprise, which is also self-hosted, trained on permissively licensed repositories and supports training on private repos?
I use it for 3D printing mainly. It's pretty good, but 0.18.x had some annoying bugs (e.g. it would occasionally just crash, wiping everything since the last time you saved). Nothing show-stopping if you save often and don't do anything too crazy.
I personally prefer Fusion 360 (it feels sleeker and is way more usable with a touchpad), but it's definitely a viable choice.