WeKnora is a Tencent-backed open-source framework for deep document understanding and semantic retrieval, built to handle complex, heterogeneous documents through the Retrieval-Augmented Generation (RAG) paradigm. The platform combines multimodal preprocessing, semantic vector indexing, retrieval, and large language model inference in a modular architecture aimed at context-aware question answering. Designed for FAQ and document knowledge bases, it supports structured extraction from PDFs, Word files, images, and other formats, and includes cross-knowledge-base retrieval, configurable prompts, multi-turn conversation controls, and agent workflows that can call built-in tools, MCP tools, and web search services.
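The stages described above (preprocess, index, retrieve, generate) can be sketched in a few lines. This is an illustrative toy, not WeKnora's code: it stands in a bag-of-words cosine similarity for a real embedding model, and it stops at prompt assembly instead of calling an LLM.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real deployment uses a vector model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank indexed chunks by similarity to the query and keep the top k.
    qv = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(qv, embed(c)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, chunks: list[str]) -> str:
    # Assemble the retrieved context into a grounded prompt for the LLM.
    context = "\n".join(f"- {c}" for c in retrieve(query, chunks))
    return f"Answer using only this context:\n{context}\nQuestion: {query}"

chunks = [
    "WeKnora parses PDFs and Word files into chunks.",
    "Retrieved chunks are passed to a large language model.",
    "The web UI runs on localhost by default.",
]
print(build_prompt("How are PDFs handled?", chunks))
```

In the full system each stage is pluggable: the parser, the vector index, and the model backend can all be swapped by configuration.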
Recent releases focus heavily on integration breadth and operational flexibility. v0.3.5 adds IM integrations for Telegram (webhook or long-polling, with streaming replies via editMessageText), DingTalk (webhook or Stream mode, with AI Card streaming), and Mattermost, bringing IM channel coverage to WeCom, Feishu, Slack, Telegram, DingTalk, and Mattermost. The same release introduces suggested questions, VLM auto-description of MCP tool images, a Novita AI provider with OpenAI-compatible APIs, and channel tracking for traceability, along with bug fixes for empty responses, UTF-8 truncation, API key encryption loss, vLLM streaming reasoning propagation, and rerank errors on empty passages. v0.3.4 added support for the NVIDIA chat model API with custom endpoint and VLM model configuration, and introduced Weaviate as a new vector database backend for knowledge retrieval; it also brought AWS S3 storage, AES-256-GCM encryption at rest, a built-in MCP service, hybrid search optimization, and a final_answer tool for agent workflows.
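An OpenAI-compatible provider like the one mentioned above is typically wired in by issuing a standard /chat/completions request against the provider's base URL. The sketch below only constructs such a request; the base URL, model name, and key are placeholders for illustration, not Novita's or WeKnora's actual values.

```python
import json

def chat_completion_request(base_url: str, api_key: str, model: str,
                            messages: list[dict],
                            stream: bool = False) -> tuple[str, dict, bytes]:
    """Build (url, headers, body) for an OpenAI-compatible /chat/completions call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages, "stream": stream}).encode()
    return url, headers, body

# Placeholder endpoint and model name, for illustration only.
url, headers, body = chat_completion_request(
    "https://api.example.com/v1", "placeholder-key", "example-chat-model",
    [{"role": "user", "content": "Summarize this document."}], stream=True,
)
print(url)
```

Because the wire format is the same across such providers, switching between them usually means changing only the base URL, model name, and key.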
Earlier versions broadened the system across retrieval, storage, deployment, and collaboration. v0.3.3 added parent-child chunking, knowledge base pinning, fallback responses, and Milvus support. v0.3.2 introduced Knowledge Search, parser and storage engine configuration, document preview, Mermaid rendering, remote-URL knowledge ingestion, and async re-parse APIs. v0.3.0 delivered shared spaces, custom agents, a Data Analyst agent, Helm deployment support, Korean language support, and security hardening, including SSRF-safe HTTP handling and sandboxed execution. v0.2.0 established the ReAct agent mode, multi-type knowledge bases, configurable conversation strategy, extensible web search, MCP tool integration, and asynchronous task management.
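Parent-child chunking, mentioned above, indexes small child chunks for precise matching but hands the larger parent chunk to the LLM as context. A minimal character-based sketch of the idea (not WeKnora's implementation; real chunkers split on semantic boundaries, not fixed offsets):

```python
def parent_child_chunks(text: str, parent_size: int = 120, child_size: int = 40):
    """Split text into parent chunks, then split each parent into children.
    Each child records the index of its parent."""
    parents = [text[i:i + parent_size] for i in range(0, len(text), parent_size)]
    children = []
    for pi, parent in enumerate(parents):
        for j in range(0, len(parent), child_size):
            children.append({"parent": pi, "text": parent[j:j + child_size]})
    return parents, children

def retrieve_parent(query_term: str, parents, children):
    # Match on the fine-grained child, but return the whole parent as context.
    for child in children:
        if query_term.lower() in child["text"].lower():
            return parents[child["parent"]]
    return None
```

The trade-off: small children improve retrieval precision, while returning the parent preserves enough surrounding context for the model to answer accurately.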
Deployment and interface options are central to the project. The platform supports local, Docker, and Kubernetes setups, with hot-reload development workflows for both the backend and the frontend. By default, the web UI is served at http://localhost, the backend API at http://localhost:8080, and Jaeger tracing at http://localhost:16686. WeKnora can also be accessed through an MCP server, and Tencent positions it as the core framework behind the WeChat Dialog Open Platform for zero-code intelligent Q&A deployments inside the WeChat ecosystem. The repository shows strong community traction: 13.6k stars, 1.6k forks, and 26 releases, with v0.3.5 listed as the latest release on Mar 27, 2026.
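A quick way to sanity-check a local deployment is to probe the default ports listed above (80 for the web UI, 8080 for the backend API, 16686 for Jaeger). A small stdlib sketch, assuming the services are bound to their defaults:

```python
import socket

def is_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP port accepts connections, e.g. the backend on 8080."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default local endpoints from the deployment setup described above.
for name, port in [("web_ui", 80), ("backend_api", 8080), ("jaeger", 16686)]:
    print(name, is_listening("localhost", port))
```

If a probe fails, check that the corresponding container or process is running and that nothing else occupies the port.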
