PaTH Attention boosts large language model positional reasoning

Researchers at MIT and the MIT-IBM Watson AI Lab have introduced PaTH Attention, a new positional encoding method that makes transformers more context-aware and better at tracking state over long sequences. Rather than assigning fixed positions, the technique adapts positional information to token content, and it can be combined with forgetting mechanisms to improve long-context reasoning and efficiency. A hedged sketch of the idea follows below.
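The minimal sketch below illustrates what "position information that depends on token content" can mean: each token contributes a small, content-derived transformation, and a key is carried to a query through the product of the transformations of every token on the path between them. The function names, the use of Householder-style (identity minus rank-one) transforms, the product ordering, and the way `w` and `beta` are produced are all assumptions for illustration, not details taken from the paper; real implementations would also use blocked kernels rather than this naive loop.

```python
import numpy as np

def householder(w, beta):
    """H = I - beta * w w^T: an identity-minus-rank-one transform (assumed form)."""
    return np.eye(w.shape[0]) - beta * np.outer(w, w)

def path_attention_scores(q, k, w, beta):
    """
    Illustrative content-dependent positional attention scores (causal).

    q, k : (T, d) per-token queries and keys
    w    : (T, d) per-token directions, assumed to be projections of each
           token's hidden state, so the transform depends on content
    beta : (T,)   per-token scalars controlling how far each H_t departs
           from the identity

    For j <= i the score is q_i^T (H_i H_{i-1} ... H_{j+1}) k_j: the key is
    pushed through the transformation of every intervening token. The
    ordering convention here is an assumption.
    """
    T, d = q.shape
    scores = np.full((T, T), -np.inf)        # -inf masks non-causal pairs
    for i in range(T):
        carried = k[: i + 1].copy()          # keys j = 0..i, transformed so far
        for t in range(1, i + 1):            # apply H_t to keys with j < t
            H = householder(w[t], beta[t])
            carried[:t] = carried[:t] @ H.T
        scores[i, : i + 1] = carried @ q[i]
    return scores

# Toy usage: random content-derived directions and gates.
rng = np.random.default_rng(0)
T, d = 6, 8
q, k, w = (rng.standard_normal((T, d)) for _ in range(3))
w /= np.linalg.norm(w, axis=-1, keepdims=True)   # unit directions
beta = rng.uniform(0.0, 2.0, size=T)
print(path_attention_scores(q, k, w, beta).shape)  # (6, 6)
```

Because each transformation is built from the token itself, two tokens the same distance apart can relate differently depending on what lies between them, which is the contrast with fixed positional schemes that the summary above describes.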

China reportedly tests domestically built EUV lithography prototype

China has reportedly built and begun testing a domestically developed EUV lithography prototype assembled from second-hand components and reverse-engineered designs. Huawei is leading a broader effort to create a fully domestic artificial intelligence semiconductor supply chain, spanning everything from chip design to advanced manufacturing tools.