Tether Data launches QVAC Fabric LLM for edge-first Artificial Intelligence inference and fine-tuning

Tether Data on December 2, 2025 released QVAC Fabric LLM, an edge-first LLM inference runtime and fine-tuning framework that runs and personalizes models on consumer GPUs, laptops, and smartphones. The open-source platform enables on-device Artificial Intelligence training and inference across iOS, Android, Windows, macOS, and Linux while avoiding cloud dependency and vendor lock-in.

On December 2, 2025, Tether Data announced QVAC Fabric LLM, a unified LLM inference runtime and generalized LoRA fine-tuning framework designed to run, train, and personalize large language models directly on everyday hardware. The release positions consumer GPUs, laptops, servers, and even smartphones as viable platforms for full LLM execution and instruction tuning, with the stated aims of eliminating cloud dependency, avoiding vendor lock-in, and keeping sensitive data on device. Tether Data describes the system as cross-platform and portable across mobile operating systems as well as traditional desktop and server environments.

A notable technical advance in QVAC Fabric LLM is support for fine-tuning on smartphone-class GPUs, including Qualcomm Adreno and ARM Mali, marking what the company calls the first production-ready framework for modern LLM training on mobile chips. The runtime extends the llama.cpp ecosystem with fine-tuning workflows for contemporary models such as Llama 3, Qwen3, and Gemma 3, and enables training across AMD, Intel, NVIDIA, Apple Silicon, and various mobile accelerators. That broadened hardware compatibility is presented as a way to diversify the compute base available for Artificial Intelligence development and to treat consumer devices as legitimate training platforms.
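LoRA, the technique QVAC Fabric LLM generalizes, makes training feasible on constrained hardware by freezing the pretrained weights and learning only a pair of low-rank matrices whose product is added to each weight matrix. The sketch below illustrates the generic LoRA math in NumPy; it is not QVAC's actual API, and all dimensions and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, rank = 64, 64, 4  # rank << min(d_out, d_in)

# Frozen pretrained weight: never updated during fine-tuning.
W = rng.standard_normal((d_out, d_in)) * 0.02

# Trainable low-rank adapter. B starts at zero so the adapted
# model initially behaves exactly like the base model.
A = rng.standard_normal((rank, d_in)) * 0.01
B = np.zeros((d_out, rank))

def adapted_forward(x):
    # Effective weight is W + B @ A, but the sum is never
    # materialized; only the small A and B are trained.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
# With B = 0 the adapter is a no-op, matching the base model.
assert np.allclose(adapted_forward(x), W @ x)

# Parameter savings: full fine-tune vs. LoRA adapter.
full = d_out * d_in           # every entry of W
lora = rank * (d_out + d_in)  # entries of A plus entries of B
print(f"trainable params: full={full}, lora={lora}")
```

Because only `rank * (d_out + d_in)` parameters receive gradients per layer, memory for optimizer state and activations shrinks dramatically, which is what makes training on mobile-class GPUs plausible.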

For enterprises, QVAC Fabric LLM lets organizations fine-tune models in-house on secure hardware to meet privacy, regulatory, and cost requirements while decentralizing model customization away from centralized GPU clusters. Paolo Ardoino, CEO of Tether, said the product reflects a commitment to making Artificial Intelligence accessible and resilient, and that QVAC Fabric LLM allows people and companies to execute inference and fine-tune models “on their own terms, on their own hardware, with full control of their data.” The project is released under the Apache 2.0 license and includes multi-platform binaries and adapters on Hugging Face so developers can begin fine-tuning with a few commands.

QVAC Fabric LLM is framed as part of Tether Data’s broader effort to enable localized, privacy-first Artificial Intelligence through its QVAC initiative, which aims to make advanced personalization available on edge hardware for greater resilience and operational continuity in high-latency or emerging market environments.

Impact Score: 68

French Artificial Intelligence startup Mistral unveils Mistral 3 open-source models

French Artificial Intelligence startup Mistral unveiled Mistral 3, a next-generation family of open-source models that includes small dense models at 14B, 8B, and 3B parameters and a larger sparse mixture-of-experts model, Mistral Large 3. The company said the release represents its most capable model to date and noted Microsoft backing.

Artificial Intelligence newsroom: Anthropic’s new model redefines coding

Anthropic released Claude Opus 4.5, a new large language model that scored 80% on the SWE-bench Verified benchmark and took the No. 1 spot on the ARC-AGI test. Enterprise Artificial Intelligence adoption is accelerating, with full implementation up 282%, while the U.S. Genesis Mission opens petabytes of lab data to foundation model teams.
