Tether Data launches QVAC Fabric LLM for edge-first Artificial Intelligence inference and fine-tuning

Tether Data on December 2, 2025 released QVAC Fabric LLM, an edge-first LLM inference runtime and fine-tuning framework that runs and personalizes models on consumer GPUs, laptops, and smartphones. The open-source platform enables on-device Artificial Intelligence training and inference across iOS, Android, Windows, macOS, and Linux while avoiding cloud dependency and vendor lock-in.

On December 2, 2025, Tether Data announced QVAC Fabric LLM, a unified LLM inference runtime and generalized LoRA fine-tuning framework designed to run, train, and personalize large language models directly on everyday hardware. The release positions consumer GPUs, laptops, servers, and even smartphones as viable platforms for full LLM execution and instruction tuning, aiming to eliminate cloud dependency, avoid vendor lock-in, and keep sensitive data on device. Tether Data describes the system as cross-platform and portable across mobile operating systems as well as traditional desktop and server environments.

A notable technical advance in QVAC Fabric LLM is support for fine-tuning on smartphone-class GPUs, including Qualcomm Adreno and ARM Mali, which the company calls the first production-ready framework for modern LLM training on mobile chips. The runtime extends the llama.cpp ecosystem with fine-tuning workflows for contemporary models such as Llama 3, Qwen3, and Gemma 3, and enables training across AMD, Intel, NVIDIA, and Apple Silicon hardware as well as various mobile accelerators. That broadened hardware compatibility is presented as a way to diversify the compute base available for Artificial Intelligence development and to treat consumer devices as legitimate training platforms.
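To see why LoRA fine-tuning is feasible on constrained hardware, consider the core idea of the technique: the large base weight matrix stays frozen, and only a small low-rank correction is trained. The sketch below illustrates that general mechanism in NumPy; all names, shapes, and the scaling convention are illustrative assumptions about LoRA in general, not QVAC Fabric LLM's actual API.

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16):
    """Forward pass through a linear layer with a LoRA adapter.

    W: frozen base weight, shape (d_out, d_in) -- never updated
    A: trainable down-projection, shape (r, d_in), r << min(d_out, d_in)
    B: trainable up-projection, shape (d_out, r), initialized to zeros
    """
    r = A.shape[0]
    scale = alpha / r  # common LoRA scaling convention
    # Base output plus the low-rank correction B @ A, applied to x.
    return W @ x + scale * (B @ (A @ x))

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 4, 2
W = rng.normal(size=(d_out, d_in))       # frozen base layer
A = rng.normal(size=(r, d_in)) * 0.01    # small random init
B = np.zeros((d_out, r))                 # zero init: adapter starts as a no-op
x = rng.normal(size=d_in)

# With B = 0 the adapted layer matches the frozen base layer exactly,
# so training starts from the pretrained model's behavior.
assert np.allclose(lora_forward(x, W, A, B), W @ x)

# Only A and B are trained: 2 * r * max(d_in, d_out) parameters versus
# d_out * d_in for W -- here 24 trainable values against 32 frozen ones,
# and the gap widens dramatically at real model sizes.
print(A.size + B.size, "trainable vs", W.size, "frozen")
```

Because gradients and optimizer state are needed only for the small A and B matrices, memory and compute requirements drop by orders of magnitude compared with full fine-tuning, which is what makes training on laptop and smartphone GPUs plausible.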

For enterprises, QVAC Fabric LLM lets organizations fine-tune models in-house on secure hardware to meet privacy, regulatory, and cost requirements while decentralizing model customization away from centralized GPU clusters. Paolo Ardoino, CEO of Tether, said the product reflects a commitment to making Artificial Intelligence accessible and resilient, and that QVAC Fabric LLM allows people and companies to execute inference and fine-tune models “on their own terms, on their own hardware, with full control of their data.” The project is released under the Apache 2.0 license and includes multi-platform binaries and adapters on Hugging Face so developers can begin fine-tuning with a few commands.

QVAC Fabric LLM is framed as part of Tether Data’s broader effort to enable localized, privacy-first Artificial Intelligence through its QVAC initiative, which aims to make advanced personalization available on edge hardware for greater resilience and operational continuity in high-latency or emerging market environments.

