Hacker News | xyc's comments


The fact that there's no alternative implementation of SQLite also seems to have played a part in preventing the standardization of WebSQL.

https://www.w3.org/TR/webdatabase/

"The specification reached an impasse: all interested implementors have used the same SQL backend (Sqlite), but we need multiple independent implementations to proceed along a standardisation path."


I was completely unaware of that! How old is that document? I should reach out.


That effort died about 14 years ago.


damn =(


an opportunity for all of us to celebrate the astonishing power of necromancy


Indeed! This sort of thing is a problem. It's the same with Internet protocols: you need at least two implementations to get to Standard.


If anyone on macOS wants to use llama.cpp with ease, check out https://recurse.chat/. It supports importing ChatGPT history and continuing chats offline using llama.cpp. I built this so I could use local AI as a daily driver.


You can get a release binary from https://github.com/ggerganov/llama.cpp/releases too.


Does it autoupdate? I get it from github so I just have to pull and build again every time I want to update it.


Yes, but that's not building it for your system, that's a relatively generic build.


Just tried out the Puppeteer server example, if anyone is interested in seeing a demo: https://x.com/chxy/status/1861302909402861905. (Todo: add tool use - the prompt would be something like "go to this website and screenshot".)

I appreciate that the design leaves server implementations to the community, which doesn't lock you into any particular one; the protocol seems to be aiming primarily to solve the RPC layer.

One major value-add of MCP, I think, is that it extends the capabilities of a vast number of AI apps.


Made tool use work! Check out the demo here: https://x.com/chxy/status/1861684254297727299


Sharing the messy code here just for funsies: https://gist.github.com/xyc/274394031b41ac7e8d7d3aa7f4f7bed9


Superb work and super promising! I had wished for a protocol like this.

Is there a recommended resource for building an MCP client? From what I've seen, the docs just mention that Claude Desktop & co. are clients. The SDK readme seems to cover it a bit, but some examples would be great.


We are still a bit light on documentation on how to integrate MCP into an application.

The best starting points are the respective client parts of the SDKs: https://github.com/modelcontextprotocol/typescript-sdk/tree/... and https://github.com/modelcontextprotocol/python-sdk/tree/main..., as well as the official specification documentation at https://spec.modelcontextprotocol.io.

If you run into issues, feel free to open a discussion in the respective SDK repository, and we'll be happy to help.

(I've been fairly successful in taking the spec documentation in markdown, an SDK and giving both to Claude and asking questions, but of course that requires a Claude account, which I don't want to assume)


Thanks for the pointers! Will do. I've fired up https://github.com/modelcontextprotocol/inspector and the code looks helpful too.

I'm looking at integrating MCP with a desktop app. The spec (https://spec.modelcontextprotocol.io/specification/basic/tra...) mentions "Clients SHOULD support stdio whenever possible." The server examples seem to be mostly stdio as well. In the context of a sandboxed desktop app, it's often not practical to launch a server as a subprocess because:

- sandbox restrictions on executing binaries

- needing to bundle the binary leads to a larger installation size

Would it be reasonable to relax this restriction and provide both SSE and stdio transports for the default server examples?
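For context, the stdio transport per the spec is just newline-delimited JSON-RPC 2.0, so the client-side framing itself is tiny. A rough sketch (helper names are mine, not from any SDK; assumes one message per line):

```python
import json

# Rough sketch of stdio-transport framing, assuming newline-delimited
# JSON-RPC 2.0 messages; helper names here are hypothetical.

def encode_request(msg_id, method, params):
    # One request per line, terminated by a newline.
    return json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params}) + "\n"

def decode_message(line):
    # Parse one received line back into a message dict.
    msg = json.loads(line)
    if msg.get("jsonrpc") != "2.0":
        raise ValueError("not a JSON-RPC 2.0 message")
    return msg
```

The hard part for a sandboxed app isn't this framing, it's being allowed to spawn the process that sits on the other end of the pipe.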


Having broader support for SSE in the servers repository would be great. Maybe I can encourage you to open a PR or at least an issue.

I can totally see your concern about sandboxed apps, particularly for Flatpak or similar distribution methods. I see you already opened a discussion https://github.com/modelcontextprotocol/specification/discus..., so let's follow up there. I really appreciate the input.


A possible cheap win for servers would be to support the systemd "here's an fd number you get exec'ed with" model - that way server code that's only written to do read/write on a normal fd should be trivial to wire up to unix sockets, TCP sockets, etc.
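Concretely, that protocol is tiny: systemd sets LISTEN_PID/LISTEN_FDS in the environment and the passed sockets start at fd 3. A rough Python sketch (the function name is mine):

```python
import os
import socket

SD_LISTEN_FDS_START = 3  # systemd passes inherited sockets starting at fd 3

def inherited_fds(environ, pid):
    # Return the fds handed over via the systemd socket-activation
    # protocol (LISTEN_PID / LISTEN_FDS), or [] if not socket-activated.
    if environ.get("LISTEN_PID") != str(pid):
        return []
    count = int(environ.get("LISTEN_FDS", "0"))
    return list(range(SD_LISTEN_FDS_START, SD_LISTEN_FDS_START + count))

# Server code written against a plain fd can then just do:
#   fds = inherited_fds(os.environ, os.getpid())
#   if fds:
#       sock = socket.socket(fileno=fds[0])  # unix socket, TCP, whatever
```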

(and then having a smol node/bun/go/whatever app that can sit in front of any server that handles stdio - or a listening socket for a server that can handle multiple clients - and translates the protocol over to SSE or websockets or [pick thing you want here] lets you support all such servers with a single binary to install)

Not that there aren't advantages to having such things baked into the server proper, but making 'writing a new connector that works at all' as simple as possible while still having access to multiple approaches to talk to it seems like something worthy of consideration.

[possibly I should've put this into the discussion, but I have to head out in a minute or two; anybody who's reading this and engaging over there should feel free to copy+paste anything I've said they think is relevant]



With Claude you barely have to learn the language these days, as you just need to prompt, but the SQLite column is an interesting idea.


If you are interested in a no-config setup for a local LLM, give https://recurse.chat/ a try (I'm the dev). The app is designed to be self-contained and as simple as you can imagine.


Shameless plug: If you are on a Mac, check out RecurseChat: https://recurse.chat/

A few outstanding features:

- Fast: imports ChatGPT history and loads thousands of conversations at once.

- Floating chat: a Spotlight / ChatGPT-desktop-app-style floating window.

- Customization: you can add any OpenAI-compatible API (including X.ai) as a model by just editing the URL/model ID.

- Chat with files: not as complete as a full RAG solution for now, but we favor simplicity. You can drag and drop PDF files onto a session, or add files/folders to a model (like a custom GPT) to start chatting.

- And yes, we support light mode! Several light-mode code themes as well.
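On the customization point: "OpenAI-compatible" just means the request shape is fixed and only the base URL and model ID change per provider. A sketch of what such a client effectively builds (the helper name is mine, and the x.ai URL/model ID below are illustrative - check the provider's docs):

```python
def chat_request(base_url, model, messages):
    # Build the endpoint URL and JSON body for an OpenAI-compatible
    # /chat/completions call; only base_url and model vary by provider.
    url = base_url.rstrip("/") + "/chat/completions"
    body = {"model": model, "messages": messages}
    return url, body

# e.g. chat_request("https://api.x.ai/v1", "grok-beta",
#                   [{"role": "user", "content": "hi"}])
# — swap in any compatible provider's URL and model ID.
```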


Give https://recurse.chat/ a try - I'm the developer. One particular advantage over alternative apps is ChatGPT history import and the speed of the app, including full-text search. You can import thousands of conversations, and every chat loads instantly.

We also recently added a floating chat feature. Check out the demo: https://x.com/recursechat/status/1846309980091330815


Regarding chat history, I've been thinking that people should have ownership over their chat history. We are migrating chats to SQLite, so your data will be in a timeless format, like files* - SQLite has pledged long-term support through the year 2050. https://www.sqlite.org/lts.html

* See [File over app](https://stephango.com/file-over-app).
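As an illustration of why SQLite is a good fit here (this schema is a made-up sketch, not our actual one): its built-in FTS5 extension gives you full-text search over chat messages essentially for free.

```python
import sqlite3

# Made-up sketch of full-text-searchable chat storage using SQLite's
# FTS5 extension (not the app's actual schema).
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE messages USING fts5(session, body)")
con.executemany(
    "INSERT INTO messages(session, body) VALUES (?, ?)",
    [("imported-1", "how do I run llama.cpp offline?"),
     ("imported-2", "thoughts on light mode themes")])

# FTS5's default tokenizer splits on punctuation, so "llama" matches
# "llama.cpp".
hits = con.execute(
    "SELECT body FROM messages WHERE messages MATCH ?", ("llama",)
).fetchall()
```

And because it's plain SQLite, any other tool can open the same file long after the app is gone, which is the whole "file over app" point.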

