Phison expands local Artificial Intelligence inference with flash memory

Phison is using its GTC showcase to position flash storage as an added memory tier for local Artificial Intelligence systems. The company says its aiDAPTIV technology extends working memory across GPU memory, system RAM and flash to support larger models and long-context inference.

At GTC booth 119, Phison Electronics announced a showcase focused on multi-tier memory architecture for NVIDIA-powered local Artificial Intelligence platforms. The company is targeting a growing memory constraint as demand for Artificial Intelligence-ready platforms continues to rise, particularly for workloads involving larger models and long-context inference.

Fine-tuning and inference on proprietary data require massive compute and memory resources, creating investment challenges for organizations. Rising solution costs and workflow bottlenecks are slowing time-to-market for revenue-generating innovation. In response, Phison introduced aiDAPTIV technology for local and edge Artificial Intelligence use cases, using Pascari SSDs as a new Artificial Intelligence memory tier.

Phison says aiDAPTIV intelligently extends and manages Artificial Intelligence working memory across GPU memory, system RAM and flash. The company presented the technology as a way to apply multi-tier memory architecture principles to local Artificial Intelligence systems as NVIDIA infrastructure advances GPU memory capabilities for inference workloads in data center environments.

Built on high-endurance flash optimized for sustained paging and context retention, aiDAPTIV is designed to support memory-intensive inference and fine-tuning workloads under fixed hardware configurations. Phison says the flash-based memory tier enables organizations to support evolving workloads on local systems while maintaining data privacy and improving long-term infrastructure efficiency.
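The tiering idea described above (spilling working memory from GPU memory to system RAM to flash, while keeping hot data in the fastest tier) can be illustrated with a toy cache. This is a hypothetical sketch only; the tier names, capacities, and LRU eviction policy are illustrative assumptions, not Phison's actual aiDAPTIV design.

```python
# Toy multi-tier memory: when a faster tier fills, its least-recently-used
# entry is demoted to the next, larger tier. Tier names/capacities are
# illustrative assumptions, not a real aiDAPTIV implementation.
from collections import OrderedDict

class TieredMemory:
    def __init__(self, capacities):
        # capacities: ordered {tier_name: max_entries}, fastest tier first
        self.tiers = [(name, cap, OrderedDict()) for name, cap in capacities.items()]

    def put(self, key, value):
        self._insert(0, key, value)

    def _insert(self, level, key, value):
        name, cap, store = self.tiers[level]
        store[key] = value
        store.move_to_end(key)
        # Over capacity: demote the LRU entry to the next (slower) tier.
        if len(store) > cap and level + 1 < len(self.tiers):
            old_key, old_val = store.popitem(last=False)
            self._insert(level + 1, old_key, old_val)

    def get(self, key):
        # Search tiers fastest-first; promote a hit back to the top tier.
        for level, (name, cap, store) in enumerate(self.tiers):
            if key in store:
                value = store.pop(key)
                if level > 0:
                    self._insert(0, key, value)
                else:
                    store[key] = value  # refresh recency in the top tier
                return value, name
        return None, None

# Example: tiny "gpu" and "ram" tiers, large "flash" tier.
mem = TieredMemory({"gpu": 2, "ram": 3, "flash": 100})
for i in range(6):
    mem.put(f"kv{i}", i)
# The oldest entry (kv0) has by now been demoted twice, landing in flash.
```

The point of the sketch is the access pattern the article implies: recently used context stays in fast memory, while colder context is paged out to the flash tier and promoted back on access.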


What businesses need to know about the EU cyber resilience act

The EU cyber resilience act is turning product cybersecurity into a legal requirement for companies that sell digital products into the European Union. A key compliance milestone arrives in September 2026, well before the full regulation takes effect in 2027.

Claude Mythos and cyber insurance’s next inflection point

Claude Mythos is being treated by governments and regulators as a potential systemic cyber risk with implications for financial stability and insurance markets. Its emergence is intensifying pressure on insurers to clarify whether Artificial Intelligence-enabled cyber losses are covered or excluded, or whether they require new stand-alone products.

OpenAI expands ChatGPT ads with self-serve manager

OpenAI is widening its ChatGPT ads pilot with a beta self-serve Ads Manager, new bidding options and broader measurement tools. The push signals a deeper move into advertising as the company expands the program into several international markets.

OpenAI launches Artificial Intelligence deployment consulting unit

OpenAI has created a new consulting and deployment business aimed at helping enterprises build and roll out Artificial Intelligence systems. The move mirrors a similar push by Anthropic and signals a broader effort by model providers to capture more of the enterprise services market.

SK Group warns DRAM shortages could curb memory use

SK Group chairman Chey Tae-won warned that customers may reduce memory consumption through infrastructure and software optimization if DRAM suppliers fail to raise output. Demand from Artificial Intelligence data centers is keeping the market tight as memory makers weigh expansion against the long timelines for new fabs.
