Integrated photonic millimeter-wave radar chip developed for next-generation 6G networks
In a breakthrough in radar technology, researchers from the City University of Hong Kong (CityUHK) have developed the world's first integrated photonic millimeter-wave radar chip, achieving unprecedented precision in a remarkably compact device. The advance represents a significant step forward in the development of Integrated Sensing and Communication (ISAC) networks...
A new route to optimize AI hardware: Homodyne gradient extraction
A team led by the BRAINS Center for Brain-Inspired Computing at the University of Twente has demonstrated a new way to make electronic materials adapt in a manner comparable to machine learning. Their study, published in Nature Communications, introduces a method for physical learning that does not require software algorithms...
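As an aside, the principle behind homodyne gradient extraction is compact enough to sketch numerically: dither each parameter with a small sinusoid at its own frequency, then recover each partial derivative by demodulating the system's output at that frequency, lock-in style. The sketch below is a digital emulation of that idea under assumed conditions, not the team's physical implementation; the loss function, amplitudes, and frequencies are all illustrative.

```python
import numpy as np

def loss(theta):
    # Hypothetical stand-in for the measurable output of a physical system.
    return np.sum((theta - np.array([1.0, -2.0, 0.5])) ** 2)

theta = np.zeros(3)
freqs = np.array([7.0, 11.0, 13.0])  # distinct dither frequencies (Hz)
amp = 1e-3                           # small dither amplitude
t = np.linspace(0.0, 1.0, 2000, endpoint=False)

# Record the output while all parameters oscillate simultaneously.
dither = amp * np.sin(2 * np.pi * np.outer(t, freqs))  # shape (T, 3)
y = np.array([loss(theta + d) for d in dither])

# Homodyne demodulation: multiplying by each reference sinusoid and
# averaging isolates that parameter's first-order response, which is
# its partial derivative scaled by amp/2.
grad = np.array([2.0 * np.mean(y * np.sin(2 * np.pi * f * t)) / amp
                 for f in freqs])
print(grad)  # close to the true gradient [-2., 4., -1.] at theta = 0
```

Because every parameter is read out in parallel from a single output signal, no backpropagation pass, and in principle no software at all, is needed once the demodulation is done in hardware.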
Smarter AI processing for cleaner air: A blueprint to cut pollution and prolong server life
As artificial intelligence becomes more powerful and widespread, so does the environmental cost of running it. Behind every chatbot, image generator, and television streaming recommendation are massive banks of millions of computers housed in an increasing number of data centers that consume staggering amounts of electricity and water to keep...
A brain-like chip interprets neural network connectivity in real time
The ability to analyze the brain's neural connectivity is emerging as a key foundation for brain-computer interface (BCI) technologies, such as controlling artificial limbs and enhancing human intelligence. To make these analyses more precise, it is critical to quickly and accurately interpret the complex signals from many neurons in the...
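For a rough sense of what connectivity analysis involves, a common software baseline estimates functional connectivity as pairwise correlations between neurons' activity traces; a chip like the one described would compute something of this flavor continuously in hardware. A minimal sketch on synthetic data (all sizes and numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binned spike counts: 8 neurons x 500 time bins, with
# neuron 1 loosely following neuron 0 to plant one "connection."
counts = rng.poisson(2.0, size=(8, 500)).astype(float)
counts[1] += 0.8 * counts[0]

# Functional connectivity estimate: pairwise Pearson correlations.
connectivity = np.corrcoef(counts)
np.fill_diagonal(connectivity, 0.0)

i, j = np.unravel_index(np.argmax(connectivity), connectivity.shape)
print(f"strongest link: neurons {i} and {j} (r = {connectivity[i, j]:.2f})")
```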
Germany hopes new data center can help bring 'digital sovereignty'
A new mega data center is slated to rise in a rural stretch of eastern Germany in what backers hope is a starting point for a European AI sector that can compete with the United States and China.
Artificial neuron can mimic different parts of the brain—a major step toward human-like robotics
Robots that can sense and respond to the world like humans may soon be a reality as scientists have created an artificial neuron capable of mimicking different parts of the brain.
Novel memristor wafer integration technology paves the way for brain-like AI chips
A research team led by Professor Sanghyeon Choi from the Department of Electrical Engineering and Computer Science at DGIST has successfully developed a memristor, a device gaining recognition as a next-generation semiconductor, through mass integration at the wafer scale.
Scalable memtransistor arrays show potential for energy-efficient artificial neural networks
Researchers at the National University of Singapore (NUS) have fabricated ultra-thin memtransistor arrays from two-dimensional (2D) transition metal dichalcogenides (TMDCs) with controllable Schottky barriers. These arrays are highly uniform, demonstrate low device-to-device variation, and deliver high performance on image recognition tasks.
RRAM-based analog computing system rapidly solves matrix equations with high precision
Analog computers perform computations by manipulating physical quantities, such as electrical currents, that map directly onto mathematical variables, rather than representing information abstractly with discrete binary values (i.e., 0 or 1) as digital computers do.
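The core idea admits a short sketch: in an RRAM crossbar, matrix entries are stored as conductances, and Ohm's law plus Kirchhoff's current law physically compute I = GV, so a matrix-vector product happens in a single step. The snippet below only emulates that behavior digitally and wraps it in a simple feedback loop; the article's system solves matrix equations directly in the analog domain, and the matrix, noise level, and iteration scheme here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target linear system Ax = b; A plays the role of the crossbar's
# conductance matrix and x the applied voltages.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

def crossbar_matvec(G, v, noise=0.01):
    # Each output current physically sums G[i, j] * v[j]
    # (Ohm's law + Kirchhoff's current law); add read noise.
    return G @ v + noise * rng.standard_normal(G.shape[0])

# Refine x using only noisy analog matvecs (Richardson iteration,
# a simple stand-in for the feedback circuitry such systems use).
x = np.zeros(2)
for _ in range(200):
    x += 0.2 * (b - crossbar_matvec(A, x))

print(x)  # hovers near the exact solution [2., 3.]
```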
Neuromorphic computer prototype learns patterns with fewer computations than traditional AI
Could computers ever learn more like humans do, without relying on artificial intelligence (AI) systems that must undergo extremely expensive training?
AI efficiency advances with spintronic memory chip that combines storage and processing
To make accurate predictions and reliably complete desired tasks, most artificial intelligence (AI) systems need to rapidly analyze large amounts of data. This currently entails the transfer of data between processing and memory units, which are separate in existing electronic devices.
Beyond electronics: Optical system performs feature extraction with unprecedented low latency
Many modern artificial intelligence (AI) applications, such as surgical robotics and real-time financial trading, depend on the ability to quickly extract key features from streams of raw data. This process is currently bottlenecked by traditional digital processors. The physical limits of conventional electronics prevent the reduction in latency and the...
Unified memristor-ferroelectric memory developed for energy-efficient training of AI systems
Over the past decades, electronics engineers have developed a wide range of memory devices that can safely and efficiently store increasing amounts of data. However, the different types of devices developed to date come with their own trade-offs, which pose limits on their overall performance and restrict their possible applications.
Reinventing computer technology for the benefit of data centers
In the largest computer systems project ever undertaken at EPFL, an international team of researchers has come up with a new way of building computers to help tackle the increasing challenges faced by data centers.
Team develops high-speed, ultra-low-power superconductive neuron device
A research team has developed a neuron device that holds potential for application in large-scale, high-speed superconductive neural network circuits. The device operates at high speeds with ultra-low-power consumption and is tolerant to parameter fluctuations during circuit fabrication.
Group including Nvidia, BlackRock buying Aligned Data Centers in deal worth about $40 billion
A group including BlackRock, Nvidia and Microsoft is buying Aligned Data Centers in an approximately $40 billion deal in an effort to expand next-generation cloud and artificial intelligence infrastructure.
AI-powered method helps protect global chip supply chains from cyber threats
University of Missouri researchers have used artificial intelligence to detect hidden hardware trojans through a method that's 97% accurate.
Vulnerability in confidential cloud environments uncovered
Some data is so sensitive that it is processed only in specially protected cloud areas. These are designed to ensure that not even a cloud provider can access the data. ETH Zurich researchers have now found a vulnerability that could allow hackers to breach these confidential environments.
Next-generation memory: Tungsten-based SOT-MRAM achieves nanosecond switching and low-power data storage
The ability to reliably switch the direction of magnetic alignment in materials, a process known as magnetization switching, is central to the functioning of most memory devices. One established strategy to achieve this entails creating a rotational force (i.e., torque) on electron spins via an electric...
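For context, switching of this kind is commonly modeled by adding a damping-like spin-orbit-torque term to the Landau-Lifshitz-Gilbert equation; a standard textbook form (not taken from the paper itself) reads:

```latex
\frac{d\mathbf{m}}{dt}
  = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
  + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
  + \tau_{\mathrm{DL}}\,\mathbf{m}\times(\boldsymbol{\sigma}\times\mathbf{m})
```

Here m is the unit magnetization, H_eff the effective field, γ the gyromagnetic ratio, α the Gilbert damping, σ the spin polarization set by the charge current flowing through the heavy-metal (here tungsten) layer, and τ_DL a damping-like torque strength that grows with current density, which is what permits purely electrical, nanosecond-scale switching.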
AI tools can help hackers plant hidden flaws in computer chips, study finds
Widely available artificial intelligence systems can be used to deliberately insert hard-to-detect security vulnerabilities into the code that defines computer chips, according to new research from the NYU Tandon School of Engineering that warns of the potential weaponization of AI in hardware design.
New haptic system lets soft objects respond to taps, squeezes and twists
New technology that invites expressive, two-way communication between a person and the soft, flexible object they are holding or wearing has been developed at the University of Bath.
Hardware vulnerability allows attackers to hack AI training data
Researchers from NC State University have identified the first hardware vulnerability that allows attackers to compromise the data privacy of artificial intelligence (AI) users by exploiting the physical hardware on which AI is run.
Back to the future: Is light-speed analog computing on the horizon?
Scientists have achieved a breakthrough in analog computing, developing a programmable electronic circuit that harnesses the properties of high-frequency electromagnetic waves to perform complex parallel processing at light-speed.
The lord of the ring mouse: a lightweight, ring-based computer mouse that lasts over a month on a single charge
As the use of wearable devices such as augmented reality (AR) glasses has slowly but steadily increased, so too has the desire to control these devices in an easy and convenient way. Ring controllers worn on the finger already exist, but usually have some drawbacks in their size, weight or...
Cracking a long-standing weakness in a classic algorithm for programming reconfigurable chips
Researchers from EPFL, AMD, and the University of Novi Sad have uncovered a long-standing inefficiency in the algorithm that programs millions of reconfigurable chips used worldwide, a discovery that could reshape how future generations of these chips are designed and programmed.
Semiconductor neuron mimics brain's memory and adaptive response abilities
The human brain does more than simply regulate synapses that exchange signals; individual neurons also process information through intrinsic plasticity, the adaptive ability to become more sensitive or less sensitive depending on context. Existing artificial intelligence semiconductors, however, have struggled to mimic this flexibility of the brain.
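To make "intrinsic plasticity" concrete: where synaptic plasticity tunes the weights between neurons, intrinsic plasticity changes a neuron's own excitability. A toy leaky integrate-and-fire neuron whose firing threshold adapts to recent activity captures the behavior; this is a conceptual sketch, not the reported device's model, and every constant is made up.

```python
import numpy as np

rng = np.random.default_rng(2)

v, threshold = 0.0, 1.0   # membrane potential and adaptive threshold
spikes = 0
for step in range(1000):
    v = 0.95 * v + 0.1 * rng.random()   # leaky integration of input
    if v >= threshold:                  # fire when threshold is crossed
        spikes += 1
        v = 0.0
        threshold += 0.05               # intrinsic plasticity: become less
    else:                               # sensitive after firing, and more
        threshold = max(0.5, threshold - 0.001)  # sensitive when quiet

print(f"{spikes} spikes; final threshold = {threshold:.2f}")
```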
Miniaturized ion traps show promise of 3D printing for quantum-computing hardware
Researchers at Lawrence Livermore National Laboratory (LLNL), the University of California (UC) Berkeley, UC Riverside and UC Santa Barbara have miniaturized quadrupole ion traps for the first time with 3D printing—a breakthrough in one of the most promising approaches to building a large-scale quantum computer.
New design tackles integer factorization problems through digital probabilistic computing
Probabilistic Ising machines (PIMs) are advanced and specialized computing systems that could tackle computationally hard problems, such as optimization or integer factorization tasks, more efficiently than classical systems. To solve problems, PIMs rely on networks of interacting probabilistic bits (p-bits), units of digital information whose values randomly fluctuate...
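A p-bit is easy to state precisely: its state m_i ∈ {−1, +1} is resampled stochastically with a probability set by the weighted input from its neighbors, so a network of p-bits performs Gibbs-style sampling over an Ising energy landscape. A minimal software emulation follows; the coupling matrix encodes a toy ferromagnet, not the integer-factorization mapping the article's design targets.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy Ising problem: ferromagnetic couplings favor all-equal spins.
J = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
h = np.zeros(3)
beta = 2.0                # inverse temperature

m = rng.choice([-1.0, 1.0], size=3)
for _ in range(1000):
    i = rng.integers(3)
    I_i = beta * (J[i] @ m + h[i])   # local input to p-bit i
    # p-bit update rule: random sign with a tanh-shaped bias.
    m[i] = 1.0 if rng.random() < 0.5 * (1.0 + np.tanh(I_i)) else -1.0

print(m)  # with high probability a ground state: all +1 or all -1
```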
Material that listens: Chip-based approach enables speech recognition and more
Speech recognition without heavy software or energy-hungry processors: researchers at the University of Twente, together with IBM Research Europe and Toyota Motor Europe, present a completely new approach. Their chips allow the material itself to "listen." The publication by Prof. Wilfred van der Wiel and colleagues appears today in Nature.
New chip design cuts AI energy use by enabling smarter FPGA processing
A new innovation from Cornell researchers lowers the energy use needed to power artificial intelligence—a step toward shrinking the carbon footprints of data centers and AI infrastructure.
Silent deep bass: Wearable audio you can feel
Researchers at the University of Tsukuba have developed a portable, silent subwoofer that combines electrical muscle stimulation with low-frequency vibrations. This device enables users to physically feel deep bass in virtual reality (VR) and everyday music. While minimizing noise, it provides an immersive experience and rhythm perception comparable to conventional speakers...
DNA cassette tapes could solve global data storage problems
Our increasingly digitized world has a data storage problem. Hard drives and other storage media are reaching their limits, and we are creating data faster than we can store it. Fortunately, we don't have to look too far for a solution, because nature already has a powerful storage medium with...
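The basic trick behind DNA storage is compact to show: with four bases, each nucleotide carries two bits, so any byte stream maps to a base sequence a quarter as long in symbols. A bare-bones codec is sketched below; real systems add error correction and avoid troublesome sequences such as long single-base runs, which this sketch ignores.

```python
BASES = "ACGT"  # two bits per nucleotide: 00->A, 01->C, 10->G, 11->T

def encode(data: bytes) -> str:
    return "".join(BASES[(byte >> shift) & 0b11]
                   for byte in data
                   for shift in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    out = bytearray()
    for k in range(0, len(strand), 4):
        byte = 0
        for base in strand[k:k + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

strand = encode(b"tape")
print(strand)                    # CTCACGACCTAACGCC
assert decode(strand) == b"tape"
```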
Can Microsoft's analog optical computer be the answer to more energy-efficient AI and optimization tasks?
The constant scaling of AI applications and other digital technologies across industries is beginning to tax the energy grid through intensive energy consumption. Digital computing's energy and latency demands will likely continue to rise, challenging its sustainability.
New non-volatile memory platform built with covalent organic frameworks
Researchers at the Institute of Science Tokyo have created a new material platform for non-volatile memories using covalent organic frameworks (COFs), which are crystalline solids with high thermal stability. The researchers successfully installed electric-field-responsive dipolar rotors into COFs.
Hybrid 3D printing method boosts strength of eco-friendly parts with less plastic
3D printing has come a long way since its invention in 1983 by Chuck Hull, who pioneered stereolithography, a technique that solidifies liquid resin into solid objects using ultraviolet lasers.
Europe bets on supercomputer to catch up in AI race
Europe's fastest supercomputer Jupiter was inaugurated Friday in Germany with Chancellor Friedrich Merz saying it could help the continent catch up in the global artificial intelligence race.
Scientist discusses development of artificial synapses that mimic human brain function for next-gen AI chips
The Emerging Investigator Series by the journal Materials Horizons features outstanding work by young researchers in the field of materials science. In the latest Editorial, Dr. Eunho Lee, an Assistant Professor of Chemical and Biomolecular Engineering at Seoul National University of Science and Technology, Republic of Korea, where he leads...
Oxygen defects help unlock the secret of next-generation memory
Resistive random access memory (ReRAM), which is based on oxide materials, is gaining attention as a next-generation memory and neuromorphic computing device. Its fast speeds, data retention ability, and simple structure make it a promising candidate to replace existing memory technologies.
Scientists develop the world's first 6G chip, capable of 100 Gbps speeds
Sixth generation, or 6G, wireless technology is one step closer to reality with news that Chinese researchers have unveiled the world's first "all-frequency" 6G chip. The chip is capable of delivering mobile internet speeds exceeding 100 gigabits per second (Gbps) and was developed by a team led by scientists from...
Artificial neuron merges DRAM with MoS₂ circuits to better emulate brain-like adaptability
The rapid advancement of artificial intelligence (AI) and machine learning systems has increased the demand for new hardware components that could speed up data analysis while consuming less power. As machine learning algorithms draw inspiration from biological neural networks, some engineers have been working on hardware that also mimics the...
How terahertz beams and a quantum-inspired receiver could free multi-core processors from the wiring bottleneck
For decades, computing followed a simple rule: Smaller transistors made chips faster, cheaper, and more capable. As Moore's law slows, a different limit has come into focus. The challenge is no longer only computation; modern processors and accelerators are throttled by their interconnects.
Low-power 'microwave brain' on a chip computes on both ultrafast data and wireless signals
Cornell University researchers have developed a low-power microchip they call a "microwave brain," the first processor to compute on both ultrafast data signals and wireless communication signals by harnessing the physics of microwaves.
Flexible transmitter chip could make wireless devices more energy efficient
Researchers from MIT and elsewhere have designed a novel transmitter chip that significantly improves the energy efficiency of wireless communications, which could boost the range and battery life of a connected device.
Chicago's $1 billion quantum computer set to go live in 2028
The startup behind Chicago's more than $1 billion quantum computing deal said operations are expected to start in three years, a win for Illinois Governor JB Pritzker, who backed the investment and is widely seen as a potential presidential candidate.
Scalable transformer accelerator enables on-device execution of large language models
Large language models (LLMs) like BERT and GPT are driving major advances in artificial intelligence, but their size and complexity typically require powerful servers and cloud infrastructure. Running these models directly on devices—without relying on external computation—has remained a difficult technical challenge.
Pancaked water droplets help launch Europe's fastest supercomputer
JUPITER became the world's fourth fastest supercomputer when it debuted last month. Though housed in Germany at the Jülich Supercomputing Center (JSC), Georgia Tech played a supporting role in helping the system land on the latest TOP500 list.
AI cloud infrastructure gets faster and greener: NPU core improves inference performance by over 60%
The latest generative AI models such as OpenAI's ChatGPT-4 and Google's Gemini 2.5 require not only high memory bandwidth but also large memory capacity. This is why companies operating generative AI cloud services, such as Microsoft and Google, purchase hundreds of thousands of NVIDIA GPUs.
Hardware security tech can hide and reveal encryption keys on demand using 3D flash memory
Seoul National University College of Engineering announced that a research team has developed a new hardware security technology based on commercially available 3D NAND flash memory (V-NAND flash memory).
Engineers create first AI model specialized for chip design language
Researchers at NYU Tandon School of Engineering have created VeriGen, the first specialized artificial intelligence model successfully trained to generate Verilog code, the programming language that describes how a chip's circuitry functions.
AI models shrink to fit tiny devices, enabling smarter IoT sensors
Artificial intelligence is considered to be computationally and energy-intensive—a challenge for the Internet of Things (IoT), where small, embedded sensors have to make do with limited computing power, little memory and small batteries.
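One of the standard tricks for such shrinking is post-training quantization: replace 32-bit float weights with 8-bit integers plus a per-tensor scale, cutting memory roughly fourfold at a small accuracy cost. The sketch below is illustrative only; production toolchains apply this per layer, often with calibration data.

```python
import numpy as np

rng = np.random.default_rng(4)
weights = rng.normal(0.0, 0.2, size=1000).astype(np.float32)

# Symmetric int8 quantization: one float scale for the whole tensor.
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

# Dequantize to measure the accuracy cost of the 4x size reduction.
restored = q.astype(np.float32) * scale
print(f"max |error| = {np.abs(weights - restored).max():.5f}")
print(f"bytes: {weights.nbytes} -> {q.nbytes} (+4 for the scale)")
```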