Show HN: A web-app to explore topics using LLM (github.com/charstorm)
20 points by graphitout on Dec 26, 2023 | 3 comments
Lately, I've been tinkering with llama.cpp and the ollama server. The speed of these tools caught my attention, even on my modest 4060 setup. I was quite impressed with the generation quality of models like Mistral.

At the same time, I was a bit unhappy, because whenever I explore a topic there is a lot of typing involved in a chat interface. So I wanted a tool that not only gives a response but also generates a set of "suggestions" that can be explored further just by clicking.

My experience in front-end development is limited. Nonetheless, I hacked together a small web app to achieve this. It is built with Vue 3 + Vuetify.

Code: https://github.com/charstorm/llmbinge/



Interesting concept. Can you share some more detail about the implementation? How are you generating the different portions of the interface? It seems like you have a couple of canned prompts that trigger a few exploratory ideas in addition to a primary response.



You are right.

- llm_generate() - the core function, which calls the ollama API

- get_related() - returns a list of related topics for a given topic and description

- llm_get_aspect_query() - this is the tricky one. The interface shows a fixed set of aspects for every response, like history, related ideas, people, etc. I wanted a way to create a new query based on an existing query and a given aspect. This function rephrases the query accordingly.

- App.vue: handle_related() - there is a tiny bit of logic here: when going to another "suggested" topic, it seems we have to give the model some context from the parent topic. A rough sketch of how these pieces fit together follows below.
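
Here is a simplified TypeScript sketch, assuming Ollama's default /api/generate endpoint on port 11434 and a Mistral model. The function names mirror the ones above, but the prompts and details are illustrative, not the exact repo code:

    // Hypothetical sketch (not the repo's actual code).
    const OLLAMA_URL = "http://localhost:11434/api/generate"; // default Ollama port

    // Core call: send a prompt to Ollama and return the completed text.
    async function llmGenerate(prompt: string): Promise<string> {
      const res = await fetch(OLLAMA_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        // stream: false makes Ollama return a single JSON object
        body: JSON.stringify({ model: "mistral", prompt, stream: false }),
      });
      const data = await res.json();
      return data.response as string;
    }

    // Ask the model for related topics, one per line, and split them into
    // clickable suggestions. The prompt wording is illustrative.
    async function getRelated(topic: string, description: string): Promise<string[]> {
      const prompt =
        `Topic: ${topic}\nDescription: ${description}\n` +
        `List 5 closely related topics, one per line, with no numbering.`;
      const raw = await llmGenerate(prompt);
      return raw.split("\n").map((s) => s.trim()).filter(Boolean);
    }

    // Rephrase an existing query to focus on one of the fixed aspects
    // (e.g. "history", "people", "related ideas").
    async function llmGetAspectQuery(query: string, aspect: string): Promise<string> {
      const prompt =
        `Rephrase the query "${query}" so that it asks specifically about ` +
        `its "${aspect}". Reply with the rephrased query only.`;
      return (await llmGenerate(prompt)).trim();
    }

    // When the user clicks a suggestion, prefix a little parent context so
    // the model knows where the suggestion came from.
    function relatedQuery(parentTopic: string, suggestion: string): string {
      return `In the context of "${parentTopic}": ${suggestion}`;
    }

The idea is that clicking an aspect or a suggestion just composes one of these prompts and feeds it back through llmGenerate(), so exploration needs no typing from the user.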





