
Would it be possible to run something like vLLM or TensorRT-LLM with Tinfoil?


We’re already using vLLM as our inference server for our standard models. For custom deployments, we can run whatever inference server is needed.
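
For anyone unfamiliar with this setup: vLLM speaks the OpenAI-compatible HTTP API, so swapping inference servers is mostly an endpoint change for the client. A minimal sketch; the base_url, key, and model name below are illustrative placeholders, not actual deployment details:

    # vLLM exposes an OpenAI-compatible API, so any OpenAI client works
    # against it. All values below are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # wherever the vLLM server listens
        api_key="unused",                     # vLLM accepts any key unless configured otherwise
    )

    resp = client.chat.completions.create(
        model="meta-llama/Llama-3.1-8B-Instruct",  # whatever model the server loaded
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(resp.choices[0].message.content)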


As a user, can I host the attestation server myself?


All attestation verification happens client side, so there is no attestation server to host. We have verifiers in Python [1] and Go [2] (the Go verifier is exposed over FFI to our other SDKs, such as WASM and Swift). We push all the verification logic to the client so the verification process is entirely transparent and auditable; a toy sketch of this flow follows the links below.

[1] https://github.com/tinfoilsh/tinfoil-python

[2] https://github.com/tinfoilsh/verifier
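
To make the client-side flow concrete, here is a toy sketch of the shape of such a verifier. The function name and digest comparison are illustrative placeholders, not the tinfoil-python API (see [1] for the real interface), and a production verifier also validates the hardware vendor's certificate chain rather than a bare hash:

    import hashlib

    def verify_enclave(attestation_doc: bytes, expected_measurement: str) -> bool:
        """Toy stand-in for a client-side verifier.

        A real verifier (e.g. the Go implementation [2]) first validates the
        hardware vendor's signature over the attestation document; shown here
        is only the final step: the client compares the enclave measurement
        against a published, auditable value, so no trust in the server (or
        in a hosted attestation service) is required.
        """
        measurement = hashlib.sha256(attestation_doc).hexdigest()
        return measurement == expected_measurement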

