-
Dual-domain architecture shows almost 40 times higher energy efficiency for running neural networks
Many conventional computer architectures are ill-equipped to meet the computational demands of machine learning-based models. In recent years, some engineers have thus been trying to design alternative architectures that could be better suited for running these models.... Read more
-
To keep hardware safe, new 'Oreo' method cuts out the code's clues
Imagine you're a chef with a highly sought-after recipe. You write your top-secret instructions in a journal to ensure you remember them, but its location within the book is evident from the folds and tears on the edges of that often-referenced page.... Read more
-
Sustainable SOT-MRAM memory technology could replace cache memory in computer architecture in the future
How much energy is consumed each time we upload an image to social media, which relies on data centers and cloud storage? Data centers currently account for about 1% of global energy consumption, amounting to 200 terawatt-hours of electricity annually. This immense energy demand has driven researchers to explore innovative... Read more
-
Smarter memory paves the way for EU independence in computer manufacturing
New technology from Chalmers University of Technology and the University of Gothenburg, Sweden, is helping the EU establish its own competitive computer manufacturing industry. Researchers have developed components critical for optimizing on-chip memory, a key factor in enhancing the performance of next-generation computers.... Read more
-
Security vulnerabilities discovered in Apple processors
The US tech giant Apple has always advertised security assurances alongside ever faster processor performance for its products.... Read more
-
Big Tech wants to plug data centers right into power plants. Utilities say it's not fair
Looking for a quick fix for their fast-growing electricity diets, tech giants are increasingly looking to strike deals with power plant owners to plug in directly, avoiding a potentially longer and more expensive process of hooking into a fraying electric grid that serves everyone else.... Read more
-
Scaling up neuromorphic computing for more efficient and effective AI everywhere and anytime
Neuromorphic computing—a field that applies principles of neuroscience to computing systems to mimic the brain's function and structure—needs to scale up if it is to effectively compete with current computing methods.... Read more
-
Tiny chip could offer spectral sensing for everyday devices
Imagine smartphones that can diagnose diseases, detect counterfeit drugs or warn of spoiled food. Spectral sensing is a powerful technique that identifies materials by analyzing how they interact with light, revealing details far beyond what the human eye can see.... Read more
-
Analog computing platform based on one-memristor array efficiently processes real-time videos
As artificial intelligence models become increasingly advanced, electronics engineers have been trying to develop new hardware that is better suited for running these models, while also limiting power consumption and boosting the speed at which they process data. Some of the most promising solutions designed to meet the needs of machine... Read more
-
Next-gen AI device utilizes ion-controlled spin wave interference in magnetic materials
A research team from NIMS and the Japan Fine Ceramics Center (JFCC) has developed a next-generation AI device—a hardware component for AI systems—that incorporates an iono-magnonic reservoir. This reservoir controls spin waves (collective excitations of electron spins in magnetic materials), ion dynamics and their interactions.... Read more
-
Next generation computers: New wiring material could transform chip technology
The rapid technological advancements of our world have been enabled by our capacity to design and fabricate ever smaller electronic chips. These underpin computers, mobile phones and every smart device deployed to date.... Read more
-
Ultra-small neuromorphic chip learns and corrects errors autonomously
Existing computer systems have separate data processing and storage devices, making them inefficient for processing complex data like AI. A KAIST research team has developed a memristor-based integrated system similar to the way our brain processes information. It is now ready for application in various devices, including smart security cameras,... Read more
-
How a neuromorphic chip could benefit industry
Neuromorphic chips that process information like the human brain—this is the goal of physicist Heidemarie Krüger and her Dresden-based startup Techifab. The researcher from the Leibniz Institute of Photonic Technology and the Friedrich Schiller University of Jena is developing a technology that processes and stores data directly at the point... Read more
-
AI comes down from the cloud as chips get smarter
Artificial intelligence is moving from data centers to "the edge" as computer makers build the technology into laptops, robots, cars and more devices closer to home.... Read more
-
Open-source holonomic mobile manipulator could advance household robotics
Researchers at Stanford University, Princeton University, and Dexterity recently developed TidyBot++, a holonomic mobile robot that can perform various household chores and could help to train or test new algorithms for robotics applications.... Read more
-
Specialized hardware solves high-order optimization problems with in-memory computing
The rise of AI, graphic processing, combinatorial optimization and other data-intensive applications has resulted in data-processing bottlenecks, as ever greater amounts of data must be shuttled back and forth between the memory and compute elements in a computer. The physical distance is small, but the process can occur billions of... Read more
-
Nvidia's new GPU series led an avalanche of entertainment-related announcements at CES
In a packed Las Vegas arena, Nvidia founder Jensen Huang stood on stage and marveled over the crisp real-time computer graphics displayed on the screen behind him. He watched as a dark-haired woman walked through ornate gilded double doors and took in the rays of light that poured in through... Read more
-
Biggest Nvidia takeaways from Jensen Huang's CES 2025 keynote
Nvidia CEO Jensen Huang unveiled a suite of new products, services and partnerships at CES 2025.... Read more
-
Nvidia founder Jensen Huang unveils next generation of AI and gaming chips at CES 2025
In a packed Las Vegas arena, Nvidia founder Jensen Huang stood on stage and marveled over the crisp real-time computer graphics displayed on the screen behind him. He watched as a dark-haired woman walked through ornate gilded double doors and took in the rays of light that poured in through... Read more
-
EU universal charger rules come into force
EU rules requiring all new smartphones, tablets and cameras to use the same charger came into force on Saturday, in a change Brussels said will cut costs and waste.... Read more
-
Engineers grow 'high-rise' 3D chips, enabling more efficient AI hardware
The electronics industry is approaching a limit to the number of transistors that can be packed onto the surface of a computer chip. So, chip manufacturers are looking to build up rather than out.... Read more
-
Low-cost polymer boosts high-density data storage performance and sustainability
A new material for high density data storage can be erased and recycled in a more efficient and sustainable way, providing a potential alternative to hard disk drives, solid-state drives and flash memory in the future.... Read more
-
Co-packaged optics enhance AI computing with high-speed connectivity
Optical fibers carry voice and data at high speeds across long distances, and IBM Research scientists are bringing this speed and capacity somewhere they haven't previously gone: inside data centers and onto circuit boards, where they will help accelerate generative AI computing.... Read more
-
Huawei's new Mate 70 phone shows its chip advances are stalling
Huawei Technologies Co.'s latest flagship smartphone is powered by a chip little different from the one that set off alarm bells in Washington a year ago, signaling a slowdown in the Chinese company's tech advances.... Read more
-
Autonomous vehicle safety: PreFixer makes sensor hardware swaps less dangerous
Devices called encoders, located inside the sensors on autonomous cars, convert motion and other information acquired on the road into electrical signals that can then be used as feedback for the software running the vehicle to make informed decisions.... Read more
-
Compact LCOS microdisplay uses fast CMOS backplane for high-speed light modulation
Researchers from the Fraunhofer Institute for Photonic Microsystems IPMS, in collaboration with HOLOEYE Photonics AG, have developed a compact LCOS microdisplay with high refresh rates that enables improved optical modulation. This innovative microdisplay will be presented for the first time at the 31st International Display Workshops (IDW 2024) held Dec.... Read more
-
Optoelectronic device mimics human vision for diversified in-sensor computing
To make sense of the world, most humans rely in great part on their vision. Recent research suggests that the human visual system is hierarchical, meaning that it processes information on different levels, ranging from the low-level processing of sensory stimuli to the high-level processing associated with more advanced cognitive... Read more
-
Nvidia rivals focus on building a different kind of chip to power AI products
Building the current crop of artificial intelligence chatbots has relied on specialized computer chips pioneered by Nvidia, which dominates the market and made itself the poster child of the AI boom.... Read more
-
Frontier supercomputer hits new highs in third year of exascale
Two-and-a-half years after breaking the exascale barrier, the Frontier supercomputer at the Department of Energy's Oak Ridge National Laboratory continues to set new standards for its computing speed and performance.... Read more
-
Lawrence Livermore supercomputer is crowned world's speediest
The Bay Area has just won a coveted crown in computing, with a massive new machine at Lawrence Livermore National Laboratory deemed the most powerful system in the world.... Read more
-
AI-based tool creates simple interfaces for virtual and augmented reality
A paper published in Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, by researchers in Carnegie Mellon University's Human-Computer Interaction Institute, introduces EgoTouch, a tool that uses artificial intelligence to control AR/VR interfaces by touching the skin with a finger.... Read more
-
Solving complex problems faster: Innovations in Ising machine technology
Computers are essential for solving complex problems in fields such as scheduling, logistics, and route planning, but traditional computers struggle with large-scale combinatorial optimization, as they can't efficiently process vast numbers of possibilities. To address this, researchers have explored specialized systems.... Read more
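Ising machines attack combinatorial optimization by encoding a problem as interacting spins and searching for a low-energy spin configuration. As a rough software illustration of that idea (not the specialized hardware the item describes), here is a minimal sketch that minimizes an Ising energy with single-spin-flip simulated annealing; the coupling matrix `J` and the annealing schedule are illustrative choices, and `J` is assumed symmetric:

```python
import math
import random

def ising_energy(spins, J):
    """Energy E = -sum over pairs i<j of J[i][j] * s_i * s_j (J symmetric)."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def anneal(J, steps=5000, t_start=5.0, t_end=0.01, seed=0):
    """Search for a low-energy spin configuration by simulated annealing."""
    rng = random.Random(seed)
    n = len(J)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J)
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.randrange(n)
        # Energy change if spin i is flipped (J assumed symmetric).
        delta = 2 * spins[i] * sum(J[i][j] * spins[j] for j in range(n) if j != i)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            spins[i] = -spins[i]
            energy += delta
    return spins, energy
```

For a small ferromagnetic instance (all couplings +1), the annealer settles into an aligned configuration, the known ground state; dedicated Ising hardware performs an analogous search physically and in parallel.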
-
Video: Using haptic technology to deliver real-time performance feedback
Can we ever see too much data? Yes, actually. In certain situations, visual overload can paralyze decision-making. Adding one more screen, one more monitor, one more chart, table, ticker or graph becomes counterproductive.... Read more
-
Haptic hardware offers waterfall of immersive experience, could someday aid blind users
Increasingly sophisticated computer graphics and spatial 3D sound are combining to make the virtual world of games bigger, badder and more beautiful than ever. And beyond sight and sound, haptic technology can create a sense of touch—including vibrations in your gaming chair from an explosion, or difficulty turning the wheel... Read more
-
Software package can bypass CPU for more efficient computing
Technion researchers have developed a software package that enables computers to perform processing operations directly in memory, bypassing the CPU. This is a significant step toward developing computers that perform calculations in memory, avoiding time-consuming and energy-intensive data transfers between hardware components.... Read more
-
Innovative transistor for reconfigurable fuzzy logic hardware shows promise for enhanced edge computing
Edge computing devices, located close to the source of data rather than in large data centers, can perform computations locally. This could reduce latency, particularly in real-time applications, as it would minimize the need to transfer data to and from the cloud.... Read more
-
Portable light system uses color and texture change to digitize everyday objects
When Nikola Tesla predicted we'd have handheld phones that could display videos, photographs, and more, his musings seemed like a distant dream. Nearly 100 years later, smartphones are like an extra appendage for many of us.... Read more
-
Ultra-low power neuromorphic hardware shows promise for energy-efficient AI computation
A team including researchers from Seoul National University College of Engineering has developed neuromorphic hardware capable of performing artificial intelligence (AI) computations with ultra-low power consumption. The research, published in the journal Nature Nanotechnology, addresses fundamental issues in existing intelligent semiconductor materials and devices while demonstrating potential for array-level technology.... Read more
-
Generative AI could generate millions of tons of e-waste by decade's end, study finds
A team of urban environmentalists at the Chinese Academy of Sciences' Institute of Urban Environment, working with a colleague from Reichman University in Israel, has attempted to estimate the amount of e-waste that will be generated over the next several years due to the implementation of generative AI applications.... Read more
-
Magnetic RAM-based architecture could pave way for implementing neural networks on edge IoT devices
There are, without a doubt, two broad technological fields that have been developing at an increasingly fast pace over the past decade: artificial intelligence (AI) and the Internet of Things (IoT).... Read more
-
AI mimics neocortex computations with 'winner-take-all' approach
Over the past decade or so, computer scientists have developed increasingly advanced computational techniques that can tackle real-world tasks with human-comparable accuracy. While many of these artificial intelligence (AI) models have achieved remarkable results, they often do not precisely replicate the computations performed by the human brain.... Read more
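A core operation in such neocortex-inspired models is "winner-take-all" competition, in which only the strongest units in a layer stay active and the rest are silenced. As a hedged sketch of that general mechanism (not the specific model in the item), here is a hard k-winners-take-all function; the tie-breaking rule (first occurrence wins) is an illustrative choice:

```python
def winner_take_all(activations, k=1):
    """Keep the k largest activations and zero out all others (hard k-WTA)."""
    if k <= 0:
        return [0.0] * len(activations)
    # The k-th largest value is the survival threshold.
    threshold = sorted(activations, reverse=True)[k - 1]
    out, kept = [], 0
    for a in activations:
        if a >= threshold and kept < k:  # ties broken by position
            out.append(a)
            kept += 1
        else:
            out.append(0.0)
    return out
```

Applied layer by layer, this kind of sparse competition concentrates activity in a few "winning" neurons, loosely mirroring lateral inhibition in cortical circuits.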
-
Semiconductor-free logic gates pave the way for fully 3D-printed active electronics
Active electronics—components that can control electrical signals—usually contain semiconductor devices that receive, store, and process information. These components, which must be made in a clean room, require advanced fabrication technology that is not widely available outside a few specialized manufacturing centers.... Read more
-
New infrared camera aims to enhance safety in autonomous driving
Fall is here, bringing rain, fog, and early darkness. For road users, this means heightened caution, as visibility conditions increasingly deteriorate. Thermal imaging cameras that can reliably detect people even in poor or limited visibility conditions can ensure greater safety here. This is particularly true for autonomous vehicles where there... Read more
-
New methodology enables design of cloud servers for lower carbon
Reducing carbon emissions is crucial to curbing the effects of climate change, and gas-powered vehicles and manufacturers are usually the most conspicuous culprits. However, Information and Communication Technology (ICT) is currently responsible for between 2 and 4% of the global carbon footprint, which is on par with aviation emissions.... Read more
-
Research team develops hardware architecture for post-quantum cryptography
Integrating post-quantum security algorithms into hardware has long been considered a challenge. But a research team at TU Graz has now developed hardware for NIST post-quantum cryptography standards with additional security measures for this purpose.... Read more
-
Germany inaugurates IBM's first European quantum data center
Chancellor Olaf Scholz on Tuesday inaugurated US firm IBM's first quantum data center in Europe, saying Germany aims to be at the forefront of the revolutionary technology.... Read more
-
A pruning approach for neural network design optimized for specific hardware configurations
Neural network pruning is a key technique for deploying artificial intelligence (AI) models based on deep neural networks (DNNs) on resource-constrained platforms, such as mobile devices. However, hardware conditions and resource availability vary greatly across different platforms, making it essential to design pruned models optimally suited to specific hardware configurations.... Read more
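To make the idea of pruning concrete: a common baseline (not necessarily the method in this work) is one-shot magnitude pruning, which zeroes out the smallest-magnitude weights until a target sparsity is reached; hardware-aware approaches then tune where and how much to prune for a given platform. A minimal sketch, with the weight matrix and sparsity level as illustrative inputs:

```python
def magnitude_prune(weights, sparsity):
    """Zero out roughly the smallest-magnitude `sparsity` fraction of weights.

    Ties at the threshold are all pruned, so the result may be slightly
    sparser than requested.
    """
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)          # number of weights to remove
    threshold = flat[k - 1] if k > 0 else -1.0
    return [[0.0 if abs(w) <= threshold else w for w in row]
            for row in weights]
```

For example, pruning `[[0.1, -0.5], [0.3, -0.05]]` at 50% sparsity removes the two smallest-magnitude entries and keeps `-0.5` and `0.3`; a hardware-aware pruner would additionally weigh each layer's cost on the target device when choosing per-layer sparsity.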
-
How a national lab retires—and shreds—large computing resources
Ever wonder what happens to massive supercomputing systems when they're retired? Surprisingly, when it comes to the data, it's not too different from disposing of old documents—the drives go straight into a shredder and are sent to recycling.... Read more
-
New load balancing method enhances multiplayer game performance
Online gaming is increasingly popular, making server efficiency an urgent priority. With millions of players interacting in real time, game servers are under enormous pressure to process huge amounts of data without latency (delays) or crashes. Research in the International Journal of Information and Communication Technology... Read more
-
Neuromorphic platform presents significant leap forward in computing efficiency
Researchers at the Indian Institute of Science (IISc) have developed a brain-inspired analog computing platform capable of storing and processing data in an astonishing 16,500 conductance states within a molecular film. Published today in the journal Nature, this breakthrough represents a huge step forward over traditional digital computers in which... Read more