I built a free, zero-knowledge GUI for your LLM API keys - API2CHAT

Discussion in 'Programming' started by pesst, Apr 3, 2026 at 2:18 AM.

  1. #1
    Hi guys,

I’ve been working on a lightweight (under 9 KB) side project called API2CHAT and would love to get some feedback from the developers and server admins here.

I wanted to create a universal chat GUI for LLM APIs (OpenAI, OpenRouter, DeepSeek...) that requires absolutely zero backend architecture and stores no data. It's vanilla HTML/JS/CSS. Because there's no server-side code (no PHP, Python, Node.js, etc. required), it's totally private: keys and conversations stay in the browser, so it works on any OS (Windows, Linux, Android, iOS...). Optionally, you can click "Flush session" to remove any trace and start a clean session.

It has a clean, dark-themed layout:

[screenshots]
It can be deployed locally by simply extracting a zip on any device with a browser and internet access, and it will even run on low-end shared hosting (e.g. Namecheap) with no configuration.

    - Repo: https://github.com/PacifAIst/API2CHAT
    - Live Demo: https://pacifaist.github.io/API2CHAT/
    - License: Apache 2.0

    Appreciate any feedback! :)
     