(tl;dr: live demo here, git repo here)
`yt-chat` is a tool designed to help you summarize any YouTube video.
I am very happy to present the result of the past couple of days of work: finally, a first release of `yt-chat`!
`yt-chat` is an LLM-based web app built with chainlit that helps you summarize any YouTube video. More than that, you can also chat with it and ask any question you want about the video.
`yt-chat` is built as a Retrieval-Augmented Generation (RAG) LLM app. It is very lightweight in the sense that it does not require any typical external RAG library such as llamaindex or langchain.
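To give an idea of what "RAG without a RAG library" means, here is a minimal sketch of the pattern: chunk the transcript, embed the chunks, retrieve the ones most relevant to a question, and hand them to the model. This is only an illustration, not `yt-chat`'s actual code; the chunk size, the embedding model, and the helper names are assumptions.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def chunk(text: str, size: int = 1000) -> list[str]:
    # Naive fixed-size chunking of the transcript (real chunking may differ).
    return [text[i:i + size] for i in range(0, len(text), size)]


def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])


def answer(question: str, transcript: str, k: int = 3) -> str:
    chunks = chunk(transcript)
    chunk_vecs = embed(chunks)
    q_vec = embed([question])[0]
    # Cosine similarity between the question and every chunk; keep the top k.
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec)
    )
    context = "\n\n".join(chunks[i] for i in np.argsort(sims)[-k:])
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Answer using only the transcript excerpts provided."},
            {"role": "user", "content": f"Excerpts:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```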
You can either use `yt-chat` with your own OpenAI API key (if you wish to use `gpt-3.5`, for example)…
…or you can run it locally on your machine, which allows you to leverage local models such as `mistral-7b`, using the powerful `ollama` runtime library, which is a great alternative to `llama.cpp` (which we already discussed in a previous post). Docker support is offered (check our git repo!).
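If you go the local route, the `ollama` Python client makes querying a model like `mistral-7b` very simple. A rough sketch, assuming the ollama server is running and the model has already been pulled with `ollama pull mistral` (this is not necessarily how `yt-chat` wires it up internally):

```python
import ollama  # pip install ollama; requires a running ollama server (ollama serve)

# Assumption: the mistral model has been pulled beforehand with `ollama pull mistral`.
response = ollama.chat(
    model="mistral",
    messages=[
        {"role": "user", "content": "Summarize the following transcript:\n..."},
    ],
)
print(response["message"]["content"])
```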
If you don’t want to bother with a local setup, you can also have a look at our live demo! (You will need to provide your personal OpenAI API key.)
Enjoy the time saved summarizing long videos :)