Jorjin Technologies chairman Tom Liang frames AR glasses as a transformative milestone for the display industry, describing the category as a revolution built on the integration of several core technologies. According to Liang, the new generation of devices brings together near-eye displays, microLED, and waveguides, a technology stack that underpins the next phase of wearable visual computing. He emphasizes that this momentum is driven by advances in both chips and display components, which are converging to enable more capable and compact systems.
Liang’s view places near-eye displays, microLED, and waveguides at the center of AR glasses design, identifying them as the key technologies needed to deliver on the category’s promise. Each component plays a distinct role, with microLED generating the image and waveguides carrying that light to the eye, and their combination determines how images are produced, guided, and presented close to the eye in lightweight form factors. This integration points to a clear direction for display system engineering, one tightly linked to progress in semiconductor performance and power efficiency.
Crucially, Liang argues that Artificial Intelligence and AR are now developing in a symbiotic relationship. As he describes it, the trajectory of AR hardware is intertwined with the evolution of compute, with chip-level breakthroughs and display innovations reinforcing one another. The result, he suggests, is a feedback loop: better processing unlocks richer experiences for AR glasses, and the demands of those experiences in turn push further improvements in silicon and optics. In this framing, AR glasses become a focal point where display technologies and semiconductor advances align, and where Artificial Intelligence increasingly shapes how users interact with information in real time.