-
Compact LCOS microdisplay uses fast CMOS backplane for high-speed light modulation
Researchers from the Fraunhofer Institute for Photonic Microsystems IPMS, in collaboration with HOLOEYE Photonics AG, have developed a compact LCOS microdisplay whose high refresh rates enable improved optical modulation. This innovative microdisplay will be presented for the first time at the 31st International Display Workshops (IDW 2024) held Dec.... Read more
-
Optoelectronic device mimics human vision for diversified in-sensor computing
To make sense of the world, most humans rely in great part on their vision. Recent research suggests that the human visual system is hierarchical, meaning that it processes information on different levels, ranging from the low-level processing of sensory stimuli to the high-level processing associated with more advanced cognitive... Read more
-
Nvidia rivals focus on building a different kind of chip to power AI products
Building the current crop of artificial intelligence chatbots has relied on specialized computer chips pioneered by Nvidia, which dominates the market and made itself the poster child of the AI boom.... Read more
-
Frontier supercomputer hits new highs in third year of exascale
Two-and-a-half years after breaking the exascale barrier, the Frontier supercomputer at the Department of Energy's Oak Ridge National Laboratory continues to set new standards for its computing speed and performance.... Read more
-
Lawrence Livermore supercomputer is crowned world's speediest
The Bay Area has just won a coveted crown in computing, with a massive new machine at Lawrence Livermore National Laboratory deemed the most powerful system in the world.... Read more
-
AI-based tool creates simple interfaces for virtual and augmented reality
A paper published in Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, by researchers in Carnegie Mellon University's Human-Computer Interaction Institute, introduces EgoTouch, a tool that uses artificial intelligence to control AR/VR interfaces by touching the skin with a finger.... Read more
-
Solving complex problems faster: Innovations in Ising machine technology
Computers are essential for solving complex problems in fields such as scheduling, logistics, and route planning, but traditional computers struggle with large-scale combinatorial optimization, as they can't efficiently process vast numbers of possibilities. To address this, researchers have explored specialized systems.... Read more
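For readers unfamiliar with the formulation these machines target, the sketch below casts a toy combinatorial problem as an Ising energy minimization and solves it with plain simulated annealing in software; the coupling matrix and annealing schedule are illustrative assumptions, not the researchers' hardware.

```python
# Minimal sketch: a combinatorial problem cast as Ising energy minimization,
# solved with simulated annealing as a software stand-in for an Ising machine.
import numpy as np

rng = np.random.default_rng(0)
n = 20                                   # number of spins
J = rng.normal(size=(n, n))
J = (J + J.T) / 2                        # symmetric coupling matrix (illustrative)
np.fill_diagonal(J, 0)
s = rng.choice([-1, 1], size=n)          # random initial spin configuration

def energy(spins):
    return -0.5 * spins @ J @ spins      # Ising energy E = -1/2 * sum_ij J_ij s_i s_j

for T in np.geomspace(5.0, 0.01, 2000):  # slowly lower the "temperature"
    i = rng.integers(n)
    dE = 2 * s[i] * (J[i] @ s)           # energy change if spin i is flipped
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]                     # accept the flip

print("final energy:", energy(s))
```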
-
Video: Using haptic technology to deliver real-time performance feedback
Can we ever see too much data? Yes, actually. In certain situations, visual overload can paralyze decision-making. Adding one more screen, one more monitor, one more chart, table, ticker or graph becomes counterproductive.... Read more
-
Haptic hardware offers waterfall of immersive experience, could someday aid blind users
Increasingly sophisticated computer graphics and spatial 3D sound are combining to make the virtual world of games bigger, badder and more beautiful than ever. And beyond sight and sound, haptic technology can create a sense of touch—including vibrations in your gaming chair from an explosion, or difficulty turning the wheel... Read more
-
Software package can bypass CPU for more efficient computing
Technion researchers have developed a software package that enables computers to perform processing operations directly in memory, bypassing the CPU. This is a significant step toward developing computers that perform calculations in memory, avoiding time-consuming and energy-intensive data transfers between hardware components.... Read more
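As a rough illustration of why bypassing the CPU matters, the back-of-the-envelope model below (an illustrative assumption, not the Technion package's API) counts the bytes that cross the memory bus for an element-wise operation on two 1-GiB operands in the conventional path versus an in-memory path.

```python
# Illustrative data-movement model, not the actual software package's interface.
GiB = 2**30
operand_bytes = 1 * GiB                      # each operand is 1 GiB

# Conventional path: both operands are read into the CPU and the result is
# written back, so three operand-sized transfers cross the memory bus.
cpu_path = 2 * operand_bytes + operand_bytes

# In-memory path: the operation runs inside the memory arrays, so the operands
# and result never cross the bus (the short command itself is ignored here).
pim_path = 0

print(f"CPU path moves {cpu_path / GiB:.0f} GiB across the memory bus")
print(f"in-memory path moves {pim_path / GiB:.0f} GiB")
```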
-
Innovative transistor for reconfigurable fuzzy logic hardware shows promise for enhanced edge computing
Edge computing devices, which sit close to the source of data rather than in large data centers, can perform computations locally. This could reduce latency, particularly in real-time applications, as it minimizes the need to transfer data to and from the cloud.... Read more
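For context, the sketch below shows the standard Zadeh fuzzy-logic operators (min, max, complement) that such reconfigurable hardware would evaluate; the membership values are illustrative, and the device's actual encoding may differ.

```python
# Minimal sketch of fuzzy-logic operations using the standard Zadeh operators.
def fuzzy_and(a, b):   # conjunction: take the smaller membership degree
    return min(a, b)

def fuzzy_or(a, b):    # disjunction: take the larger membership degree
    return max(a, b)

def fuzzy_not(a):      # complement
    return 1.0 - a

# Example: combining "warm" and "humid" membership degrees (illustrative values)
warm, humid = 0.7, 0.4
print(fuzzy_and(warm, humid))             # 0.4
print(fuzzy_or(warm, fuzzy_not(humid)))   # 0.7
```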
-
Portable light system uses color and texture change to digitize everyday objects
When Nikola Tesla predicted we'd have handheld phones that could display videos, photographs, and more, his musings seemed like a distant dream. Nearly 100 years later, smartphones are like an extra appendage for many of us.... Read more
-
Ultra-low power neuromorphic hardware shows promise for energy-efficient AI computation
A team including researchers from Seoul National University College of Engineering has developed neuromorphic hardware capable of performing artificial intelligence (AI) computations with ultra-low power consumption. The research, published in the journal Nature Nanotechnology, addresses fundamental issues in existing intelligent semiconductor materials and devices while demonstrating potential for array-level technology.... Read more
-
Generative AI could generate millions of tons of e-waste by decade's end, study finds
A team of urban environmentalists at the Chinese Academy of Sciences' Institute of Urban Environment, working with a colleague from Reichman University in Israel, has attempted to estimate the amount of e-waste that will be generated over the next several years due to the implementation of generative AI applications.... Read more
-
Magnetic RAM-based architecture could pave way for implementing neural networks on edge IoT devices
There are, without a doubt, two broad technological fields that have been developing at an increasingly fast pace over the past decade: artificial intelligence (AI) and the Internet of Things (IoT).... Read more
-
AI mimics neocortex computations with 'winner-take-all' approach
Over the past decade or so, computer scientists have developed increasingly advanced computational techniques that can tackle real-world tasks with human-comparable accuracy. While many of these artificial intelligence (AI) models have achieved remarkable results, they often do not precisely replicate the computations performed by the human brain.... Read more
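The sketch below illustrates the general "winner-take-all" idea in software: within a group of units, only the strongest activation (or the top k) survives and the rest are zeroed. The group size and k are illustrative assumptions, not the settings of the model described in the article.

```python
# Minimal sketch of a winner-take-all layer: keep the k largest activations,
# zero out the rest.
import numpy as np

def winner_take_all(activations, k=1):
    """Return a copy where only the k strongest activations are kept."""
    out = np.zeros_like(activations)
    winners = np.argsort(activations)[-k:]   # indices of the k strongest units
    out[winners] = activations[winners]
    return out

x = np.array([0.1, 0.9, 0.3, 0.7])
print(winner_take_all(x))         # [0.  0.9 0.  0. ]
print(winner_take_all(x, k=2))    # [0.  0.9 0.  0.7]
```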
-
Semiconductor-free logic gates pave the way for fully 3D-printed active electronics
Active electronics—components that can control electrical signals—usually contain semiconductor devices that receive, store, and process information. These components, which must be made in a clean room, require advanced fabrication technology that is not widely available outside a few specialized manufacturing centers.... Read more
-
New infrared camera aims to enhance safety in autonomous driving
Fall is here, bringing rain, fog, and early darkness. For road users, this means heightened caution, as visibility conditions increasingly deteriorate. Thermal imaging cameras that can reliably detect people even in poor or limited visibility conditions can ensure greater safety here. This is particularly true for autonomous vehicles where there... Read more
-
New methodology enables design of cloud servers for lower carbon
Reducing carbon emissions is crucial to curbing the effects of climate change, and gas-powered vehicles and manufacturers are usually the most conspicuous culprits. However, Information and Communication Technology (ICT) is currently responsible for between 2% and 4% of the global carbon footprint, which is on par with aviation emissions.... Read more
-
Research team develops hardware architecture for post-quantum cryptography
Integrating post-quantum security algorithms into hardware has long been considered a challenge. But a research team at TU Graz has now developed hardware for NIST post-quantum cryptography standards with additional security measures for this purpose.... Read more
-
Germany inaugurates IBM's first European quantum data center
Chancellor Olaf Scholz on Tuesday inaugurated US firm IBM's first quantum data center in Europe, saying Germany aims to be at the forefront of the revolutionary technology.... Read more
-
A pruning approach for neural network design optimized for specific hardware configurations
Neural network pruning is a key technique for deploying artificial intelligence (AI) models based on deep neural networks (DNNs) on resource-constrained platforms, such as mobile devices. However, hardware conditions and resource availability vary greatly across different platforms, making it essential to design pruned models optimally suited to specific hardware configurations.... Read more
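As a point of reference, the sketch below shows plain magnitude-based pruning to a target sparsity; hardware-aware methods like the one described refine this by tailoring the criterion and budget to the target platform, which is not reproduced here.

```python
# Minimal sketch of magnitude-based pruning: zero out the smallest-magnitude
# weights until a target sparsity is reached. Shapes and sparsity are illustrative.
import numpy as np

def magnitude_prune(weights, sparsity):
    """Return a copy of `weights` with the smallest-magnitude entries zeroed."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)                 # number of weights to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold
    return weights * mask

W = np.random.default_rng(1).normal(size=(4, 4))
W_pruned = magnitude_prune(W, sparsity=0.75)      # keep only the largest 25%
print(np.count_nonzero(W_pruned), "of", W.size, "weights remain")
```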
-
How a national lab retires—and shreds—large computing resources
Ever wonder what happens to massive supercomputing systems when they're retired? Surprisingly, when it comes to the data, it's not too different from disposing of old documents—they go straight into a shredder and are then sent for recycling.... Read more
-
New load balancing method enhances multiplayer game performance
Online gaming is increasingly popular, making server efficiency an ever more urgent priority. With millions of players interacting in real time, game servers are under enormous pressure to process huge amounts of data without latency (delays) or crashes. Research in the International Journal of Information and Communication Technology... Read more
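As a baseline illustration of the problem, the sketch below implements a simple least-connections balancer that places each new session on the least-loaded server; the paper's actual method is not reproduced here.

```python
# Illustrative least-connections load balancer: assign each new game session
# to the server currently holding the fewest active sessions.
import heapq

class LeastConnectionsBalancer:
    def __init__(self, server_names):
        # heap of (active_session_count, server_name)
        self.heap = [(0, name) for name in server_names]
        heapq.heapify(self.heap)

    def assign(self):
        """Place a new player session on the least-loaded server and return its name."""
        load, name = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + 1, name))
        return name

balancer = LeastConnectionsBalancer(["eu-1", "eu-2", "us-1"])
for _ in range(5):
    print(balancer.assign())   # eu-1, eu-2, us-1, eu-1, eu-2
```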
-
Neuromorphic platform presents significant leap forward in computing efficiency
Researchers at the Indian Institute of Science (IISc) have developed a brain-inspired analog computing platform capable of storing and processing data in an astonishing 16,500 conductance states within a molecular film. Published today in the journal Nature, this breakthrough represents a huge step forward over traditional digital computers in which... Read more
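A quick back-of-the-envelope calculation (not from the paper) puts those 16,500 conductance states at roughly 14 bits of information per device, compared with 1 bit for a conventional binary memory cell.

```python
# Bits of information represented by 16,500 distinguishable conductance states.
import math
states = 16_500
print(f"{math.log2(states):.1f} bits per device")   # ~14.0 bits, vs. 1 bit for a binary cell
```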
-
Chip that steers terahertz beams sets stage for ultrafast internet of the future
Imagine a future where internet connections are not only lightning-fast but also remarkably reliable, even in crowded spaces. This vision is rapidly approaching reality, thanks to new research on terahertz communications technologies. These innovations are set to transform wireless communication, particularly as communications technology advances toward the next generation of... Read more
-
Silicon chip propels 6G communications forward
A team of scientists has unlocked the potential of 6G communications with a new polarization multiplexer. Terahertz communications represent the next frontier in wireless technology, promising data transmission rates far exceeding current systems.... Read more
-
Miniaturized brain-machine interface processes neural signals in real time
Researchers from EPFL have developed a next-generation miniaturized brain-machine interface capable of direct brain-to-text communication on tiny silicon chips.... Read more
-
How hardware contributes to the fairness of artificial neural networks
Over the past couple of decades, computer scientists have developed a wide range of deep neural networks (DNNs) designed to tackle various real-world tasks. While some of these models have proved to be highly effective, some studies found that they can be unfair, meaning that their performance may vary based... Read more
-
Air pollution in South Africa: Affordable new devices use AI to monitor hotspots in real time
Air quality has become one of the most important public health issues in Africa. Poor air quality kills more people globally every year than HIV, TB and malaria combined. And that's just the tip of the iceberg. Air pollution makes people less productive because they get headaches and feel tired.... Read more
-
Real-time crime centers are transforming policing—criminologist explains how advanced surveillance systems work
In 2021, a driver in Albuquerque, New Mexico, ran a red light, striking and killing a 7-year-old and injuring his father. The suspect fled the scene and eventually escaped to Mexico. Using camera footage and cellphone data, the Albuquerque Police Department's real-time crime center played a crucial role in identifying,... Read more
-
Thinking about the rise of brain-inspired computing
The recent widespread and long-lasting chaos caused by Microsoft outages across the globe exemplifies just how integral computing has become to our lives. Yet, as computer hardware and software improve, arguably the most sophisticated and powerful computer we know of is still the human brain.... Read more
-
Engineers develop magnetic tunnel junction–based device to make AI more energy efficient
Engineering researchers at the University of Minnesota Twin Cities have demonstrated a state-of-the-art hardware device that could reduce energy consumption for artificial intelligence (AI) computing applications by a factor of at least 1,000.... Read more
-
Neural network training made easy with smart hardware
Large-scale neural network models form the basis of many AI-based technologies such as neuromorphic chips, which are inspired by the human brain. Training these networks can be tedious, time-consuming, and energy-inefficient given that the model is often first trained on a computer and then transferred to the chip. This limits... Read more
-
Compact and scalable multiple-input multiple-output systems for future 5G networks
A 28-GHz time-division multiple-input multiple-output (MIMO) receiver with eight radio frequency elements, each occupying just 0.1 mm², has been developed by researchers at Tokyo Tech using 65-nm CMOS technology. This innovative design reduces chip size for beamforming. Achieving -23.5 dB error vector magnitude in 64-quadrature amplitude modulation and data rates... Read more
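For readers unfamiliar with the -23.5 dB figure, the sketch below shows one common way error vector magnitude (EVM) is computed: the ratio of mean error power to mean reference power, expressed in dB. The constellation and noise level are illustrative, not measurements from the chip.

```python
# Illustrative EVM computation for a 64-QAM constellation with synthetic noise.
import numpy as np

rng = np.random.default_rng(0)
levels = np.array([-7, -5, -3, -1, 1, 3, 5, 7])           # 64-QAM amplitude levels per axis
ideal = rng.choice(levels, 1000) + 1j * rng.choice(levels, 1000)
noise = 0.2 * (rng.normal(size=1000) + 1j * rng.normal(size=1000))
received = ideal + noise

error_power = np.mean(np.abs(received - ideal) ** 2)      # mean-square error vector
reference_power = np.mean(np.abs(ideal) ** 2)             # mean-square ideal symbol
evm_db = 10 * np.log10(error_power / reference_power)
print(f"EVM: {evm_db:.1f} dB")
```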
-
A novel 640 Gbps chipset paves the way for next generation wireless systems
A new D-band CMOS transceiver chipset with 56 GHz signal-chain bandwidth achieves the highest transmission speed of 640 Gbps for a wireless device realized with integrated circuits, as reported by researchers from Tokyo Tech and National Institute of Information and Communications Technology. The proposed chipset is highly promising for the... Read more
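Putting the headline numbers together (a back-of-the-envelope calculation, not one from the researchers), 640 Gbps carried in a 56 GHz signal-chain bandwidth implies a spectral efficiency of roughly 11 bit/s/Hz.

```python
# Implied spectral efficiency of the reported figures (illustrative arithmetic only).
data_rate_gbps = 640
bandwidth_ghz = 56
print(f"{data_rate_gbps / bandwidth_ghz:.1f} bit/s/Hz")   # ~11.4 bit/s/Hz
```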
-
Photonic chip integrates sensing and computing for ultrafast machine vision
Researchers have demonstrated a new intelligent photonic sensing-computing chip that can process, transmit and reconstruct images of a scene within nanoseconds. This advance opens the door to extremely high-speed image processing that could benefit edge intelligence for machine vision applications such as autonomous driving, industrial inspection and robotic vision.... Read more
-
Researchers develop Superman-inspired imager chip for mobile devices
Researchers from The University of Texas at Dallas and Seoul National University have developed an imager chip inspired by Superman's X-ray vision that could be used in mobile devices to make it possible to detect objects inside packages or behind walls.... Read more
-
Cutting-edge vision chip brings human eye-like perception to machines
With the rapid advancement of artificial intelligence, unmanned systems such as autonomous vehicles and embodied intelligence are increasingly being deployed in real-world scenarios, driving a new wave of technological revolution and industrial transformation. Visual perception, a core means of information acquisition, plays a crucial role in these... Read more
-
What is the European sovereign cloud?
Billions of euros are flooding into the cloud industry in Europe, with US tech giants constructing vast data centers all around the continent.... Read more
-
AMD unveils new AI chips to challenge Nvidia
AMD on Monday announced its new artificial intelligence chips for everything from cutting-edge data centers to advanced laptops, ramping up its challenge to the runaway market leader Nvidia.... Read more
-
Foxconn eyes 40 percent global AI server market share
Taiwanese tech giant Foxconn said Friday its global market for artificial intelligence servers is expected to increase to 40 percent this year, with AI products being the main driver for growth.... Read more
-
Researchers develop large-scale neuromorphic chip with novel instruction set architecture and on-chip learning
The Spiking Neural Network (SNN) offers a unique approach to simulating the brain's functions, making it a key focus in modern neuromorphic computing research. Unlike traditional neural networks, SNNs operate on discrete, event-driven signals, aligning more closely with biological processes.... Read more
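The sketch below illustrates the event-driven behavior SNNs model with a leaky integrate-and-fire neuron: input current is accumulated until the membrane potential crosses a threshold, at which point a discrete spike is emitted and the potential resets. The parameters are illustrative, not the chip's.

```python
# Minimal leaky integrate-and-fire (LIF) neuron producing a binary spike train.
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Simulate one LIF neuron over a sequence of input currents."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i            # leaky integration of the input
        if v >= threshold:
            spikes.append(1)        # fire a spike ...
            v = 0.0                 # ... and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(0)
print(lif_neuron(rng.uniform(0, 0.5, size=20)))
```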
-
Researchers and industry partners demonstrate cutting-edge chip technology for ultra-low power AI connected devices
Researchers from NUS, together with industry partners Soitec and NXP Semiconductors, have demonstrated a new class of silicon systems that promises to enhance the energy efficiency of AI connected devices by leaps and bounds. These technological breakthroughs will significantly advance the capabilities of the semiconductor industry in Singapore and beyond.... Read more
-
A new lease on life for old laptops
Researchers in India have developed a tool that can estimate the remaining useful life of an otherwise obsolete laptop computer based on quality grading of two of its main components—hard drive and rechargeable lithium-ion battery.... Read more
-
Controlling chaos using edge computing hardware: Digital twin models promise advances in computing
Systems controlled by next-generation computing algorithms could give rise to better and more efficient machine learning products, a new study suggests.... Read more
-
A new, low-cost, high-efficiency photonic integrated circuit
The rapid advancement in photonic integrated circuits (PICs), which combine multiple optical devices and functionalities on a single chip, has revolutionized optical communications and computing systems.... Read more
-
Computer scientists discover vulnerability in cloud server hardware used by AMD and Intel chips
Public cloud services employ special security technologies. Computer scientists at ETH Zurich have now discovered a gap in the latest security mechanisms used by AMD and Intel chips. This affects major cloud providers.... Read more
-
Researchers propose design methodology for hardware Gaussian random number generators
A research team from the University of Science and Technology of China (USTC) of the Chinese Academy of Sciences (CAS) has proposed a novel design methodology for Gaussian random number (GRN) generators tailored for SerDes simulation systems.... Read more
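As background, the sketch below shows the Box-Muller transform, one standard way to turn uniform random numbers into Gaussian ones; hardware generators often build on this or on central-limit-style summation, and the USTC team's specific design is not reproduced here.

```python
# Box-Muller transform: two uniform samples in, one standard-normal sample out.
import math, random

def box_muller():
    """Return one standard-normal sample derived from two uniform samples."""
    u1 = 1.0 - random.random()            # in (0, 1], keeps log() well defined
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))    # radius drawn from a Rayleigh distribution
    theta = 2.0 * math.pi * u2            # uniformly distributed phase
    return r * math.cos(theta)

samples = [box_muller() for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(f"mean = {mean:.3f}, variance = {var:.3f}")   # should be close to 0 and 1
```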
-
Turning up the heat on data storage: New memory device paves the way for AI computing in extreme environments
A smartphone shutting down on a sweltering day is an all-too-common annoyance that may accompany a trip to the beach on a sunny afternoon. Electronic memory within these devices isn't built to handle extreme heat.... Read more