-
A pruning approach for neural network design optimized for specific hardware configurations
Neural network pruning is a key technique for deploying artificial intelligence (AI) models based on deep neural networks (DNNs) on resource-constrained platforms, such as mobile devices. However, hardware conditions and resource availability vary greatly across different platforms, making it essential to design pruned models optimally suited to specific hardware configurations.... Read more
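The basic idea behind pruning can be sketched in a few lines: remove the weights that contribute least, then fine-tune or deploy the smaller model. The snippet below shows generic magnitude pruning on a toy weight vector; it is an illustration of the concept only, not the hardware-aware method the article describes.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude weights until the target
    fraction of zeros (sparsity) is reached."""
    k = int(sparsity * len(weights))                    # number of weights to drop
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]  # k-th smallest magnitude
    return [w if abs(w) > threshold else 0.0 for w in weights]

# Toy weight vector standing in for one layer of a DNN.
w = [0.8, -0.05, 0.3, -0.9, 0.02, 0.6, -0.1, 0.4]
pruned = magnitude_prune(w, 0.5)
print(pruned)  # → [0.8, 0.0, 0.0, -0.9, 0.0, 0.6, 0.0, 0.4]
```

Hardware-aware approaches go further by choosing which weights (or whole channels) to remove based on what actually speeds up a given chip, rather than magnitude alone.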
-
How a national lab retires—and shreds—large computing resources
Ever wonder what happens to massive supercomputing systems when they're retired? Surprisingly, when it comes to the data, it's not too different from disposing of old documents—the drives go straight into a shredder and are sent for recycling.... Read more

-
New load balancing method enhances multiplayer game performance
As online gaming grows in popularity, server efficiency is becoming an ever more urgent priority. With millions of players interacting in real time, game servers are under enormous pressure to process huge amounts of data without latency (delays) or crashes. Research in the International Journal of Information and Communication Technology... Read more
-
Neuromorphic platform presents significant leap forward in computing efficiency
Researchers at the Indian Institute of Science (IISc) have developed a brain-inspired analog computing platform capable of storing and processing data in an astonishing 16,500 conductance states within a molecular film. Published today in the journal Nature, this breakthrough represents a huge step forward over traditional digital computers in which... Read more
-
Chip that steers terahertz beams sets stage for ultrafast internet of the future
Imagine a future where internet connections are not only lightning-fast but also remarkably reliable, even in crowded spaces. This vision is rapidly approaching reality, thanks to new research on terahertz communications technologies. These innovations are set to transform wireless communication, particularly as communications technology advances toward the next generation of... Read more
-
Silicon chip propels 6G communications forward
A team of scientists has unlocked the potential of 6G communications with a new polarization multiplexer. Terahertz communications represent the next frontier in wireless technology, promising data transmission rates far exceeding current systems.... Read more
-
Miniaturized brain-machine interface processes neural signals in real time
Researchers from EPFL have developed a next-generation miniaturized brain-machine interface capable of direct brain-to-text communication on tiny silicon chips.... Read more
-
How hardware contributes to the fairness of artificial neural networks
Over the past couple of decades, computer scientists have developed a wide range of deep neural networks (DNNs) designed to tackle various real-world tasks. While some of these models have proved to be highly effective, some studies found that they can be unfair, meaning that their performance may vary based... Read more
-
Air pollution in South Africa: Affordable new devices use AI to monitor hotspots in real time
Air quality has become one of the most important public health issues in Africa. Poor air quality kills more people globally every year than HIV, TB and malaria combined. And that's just the tip of the iceberg. Air pollution makes people less productive because they get headaches and feel tired.... Read more
-
Real-time crime centers are transforming policing—criminologist explains how advanced surveillance systems work
In 2021, a driver in Albuquerque, New Mexico, ran a red light, striking and killing a 7-year-old and injuring his father. The suspect fled the scene and eventually escaped to Mexico. Using camera footage and cellphone data, the Albuquerque Police Department's real-time crime center played a crucial role in identifying,... Read more
-
Thinking about the rise of brain-inspired computing
The recent widespread and long-lasting chaos caused by Microsoft outages across the globe exemplifies just how integral computing has become to our lives. Yet, as computer hardware and software improve, arguably the most sophisticated and powerful computer we know of is still the human brain.... Read more
-
Engineers develop magnetic tunnel junction–based device to make AI more energy efficient
Engineering researchers at the University of Minnesota Twin Cities have demonstrated a state-of-the-art hardware device that could reduce energy consumption for artificial intelligence (AI) computing applications by a factor of at least 1,000.... Read more
-
Neural network training made easy with smart hardware
Large-scale neural network models form the basis of many AI-based technologies such as neuromorphic chips, which are inspired by the human brain. Training these networks can be tedious, time-consuming, and energy-inefficient given that the model is often first trained on a computer and then transferred to the chip. This limits... Read more
-
Compact and scalable multiple-input multiple-output systems for future 5G networks
A 28-GHz time-division multiple-input multiple-output (MIMO) receiver with eight radio-frequency elements, each occupying just 0.1 mm², has been developed by researchers at Tokyo Tech using 65-nm CMOS technology. This innovative design reduces chip size for beamforming. Achieving -23.5 dB error vector magnitude in 64-quadrature amplitude modulation and data rates... Read more
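Error vector magnitude (EVM) measures how far received symbols land from their ideal constellation points, expressed in dB relative to the reference power; more negative is cleaner. The sketch below computes EVM for hypothetical 64-QAM symbols with assumed Gaussian noise; the symbol set and noise level are illustrative, not the Tokyo Tech measurement setup.

```python
import math
import random

def evm_db(received, reference):
    """Error vector magnitude: error power relative to reference
    power, in dB (more negative means a cleaner signal)."""
    err = sum(abs(r - s) ** 2 for r, s in zip(received, reference))
    ref = sum(abs(s) ** 2 for s in reference)
    return 10 * math.log10(err / ref)

# Hypothetical 64-QAM constellation: I and Q each drawn from {-7, -5, ..., 7}.
random.seed(1)
levels = [-7, -5, -3, -1, 1, 3, 5, 7]
tx = [complex(random.choice(levels), random.choice(levels)) for _ in range(1000)]
# Small Gaussian noise stands in for transceiver impairments.
rx = [s + complex(random.gauss(0, 0.2), random.gauss(0, 0.2)) for s in tx]
print(evm_db(rx, tx))  # approximately -27 dB for this assumed noise level
```

An EVM of -23.5 dB, as reported, means the error power is roughly 0.45% of the signal power, which is low enough to demodulate dense 64-QAM constellations reliably.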
-
A novel 640 Gbps chipset paves the way for next generation wireless systems
A new D-band CMOS transceiver chipset with 56 GHz signal-chain bandwidth achieves the highest transmission speed of 640 Gbps for a wireless device realized with integrated circuits, as reported by researchers from Tokyo Tech and National Institute of Information and Communications Technology. The proposed chipset is highly promising for the... Read more
-
Photonic chip integrates sensing and computing for ultrafast machine vision
Researchers have demonstrated a new intelligent photonic sensing-computing chip that can process, transmit and reconstruct images of a scene within nanoseconds. This advance opens the door to extremely high-speed image processing that could benefit edge intelligence for machine vision applications such as autonomous driving, industrial inspection and robotic vision.... Read more
-
Researchers develop Superman-inspired imager chip for mobile devices
Researchers from The University of Texas at Dallas and Seoul National University have developed an imager chip inspired by Superman's X-ray vision that could be used in mobile devices to make it possible to detect objects inside packages or behind walls.... Read more
-
Cutting-edge vision chip brings human eye-like perception to machines
With the rapid advancement of artificial intelligence, unmanned systems such as autonomous vehicles and embodied agents are increasingly being deployed in real-world scenarios, driving a new wave of technological and industrial transformation. Visual perception, a core means of information acquisition, plays a crucial role in these... Read more
-
What is the European sovereign cloud?
Billions of euros are flooding into the cloud industry in Europe, with US tech giants constructing vast data centers all around the continent.... Read more
-
AMD unveils new AI chips to challenge Nvidia
AMD on Monday announced its new artificial intelligence chips for everything from cutting-edge data centers to advanced laptops, ramping up its challenge to the runaway market leader Nvidia.... Read more
-
Foxconn eyes 40 percent global AI server market share
Taiwanese tech giant Foxconn said Friday its share of the global market for artificial intelligence servers is expected to increase to 40 percent this year, with AI products being the main driver for growth.... Read more
-
Researchers develop large-scale neuromorphic chip with novel instruction set architecture and on-chip learning
Spiking neural networks (SNNs) offer a unique approach to simulating the brain's functions, making them a key focus in modern neuromorphic computing research. Unlike traditional neural networks, SNNs operate on discrete, event-driven signals, aligning more closely with biological processes.... Read more
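The event-driven behavior described above is often modeled with a leaky integrate-and-fire (LIF) neuron: the membrane potential leaks toward zero, integrates incoming current, and emits a spike when it crosses a threshold. This is a generic textbook model, not the chip's instruction set architecture.

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: potential decays each step,
    integrates input, and emits a spike (1) on crossing the
    threshold, then resets to zero."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leak, then integrate
        if v >= threshold:
            spikes.append(1)      # fire
            v = 0.0               # reset
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 0.6, 0.6]))
# → [0, 0, 0, 1, 0, 0, 1]
```

Because the neuron only produces output at spike events, downstream hardware can stay idle between spikes, which is where neuromorphic designs gain their efficiency.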
-
Researchers and industry partners demonstrate cutting-edge chip technology for ultra-low power AI connected devices
Researchers from NUS, together with industry partners Soitec and NXP Semiconductors, have demonstrated a new class of silicon systems that promises to enhance the energy efficiency of AI connected devices by leaps and bounds. These technological breakthroughs will significantly advance the capabilities of the semiconductor industry in Singapore and beyond.... Read more
-
A new lease on life for old laptops
Researchers in India have developed a tool that can estimate the remaining useful life of an otherwise obsolete laptop computer based on quality grading of two of its main components—hard drive and rechargeable lithium-ion battery.... Read more
-
Controlling chaos using edge computing hardware: Digital twin models promise advances in computing
Systems controlled by next-generation computing algorithms could give rise to better and more efficient machine learning products, a new study suggests.... Read more
-
A new, low-cost, high-efficiency photonic integrated circuit
The rapid advancement in photonic integrated circuits (PICs), which combine multiple optical devices and functionalities on a single chip, has revolutionized optical communications and computing systems.... Read more
-
Computer scientists discover vulnerability in cloud server hardware used by AMD and Intel chips
Public cloud services employ special security technologies. Computer scientists at ETH Zurich have now discovered a gap in the latest security mechanisms used by AMD and Intel chips. This affects major cloud providers.... Read more
-
Researchers propose design methodology for hardware Gaussian random number generators
A research team from the University of Science and Technology of China (USTC) of the Chinese Academy of Sciences (CAS) has proposed a novel design methodology for Gaussian random number (GRN) generators tailored for SerDes simulation systems.... Read more
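One classic hardware-friendly way to produce Gaussian random numbers, offered here as general background rather than the USTC team's specific design, is to sum several cheap uniform sources and lean on the central limit theorem:

```python
import random
import statistics

def clt_gaussian(n_uniform=12):
    """Approximate one standard-normal sample by summing n uniform
    variables (central limit theorem). Twelve uniforms give the sum
    a variance of exactly 1, so subtracting the mean (6) yields an
    approximately N(0, 1) value. Hardware generators favor tricks
    like this because uniform sources such as LFSRs are cheap."""
    return sum(random.random() for _ in range(n_uniform)) - n_uniform / 2

random.seed(42)
samples = [clt_gaussian() for _ in range(10_000)]
print(statistics.mean(samples), statistics.stdev(samples))  # close to 0 and 1
```

The drawback of the CLT approach is poor accuracy in the distribution's tails, which is exactly why purpose-built GRN generator architectures are an active design topic.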
-
Turning up the heat on data storage: New memory device paves the way for AI computing in extreme environments
A smartphone shutting down on a sweltering day is an all-too-common annoyance that may accompany a trip to the beach on a sunny afternoon. Electronic memory within these devices isn't built to handle extreme heat.... Read more
-
Google plans to invest $2 billion to build data center in northeast Indiana, officials say
Google plans to invest $2 billion to build a data center in northeastern Indiana that will help power its artificial intelligence technology and cloud business, company and state officials said Friday.... Read more
-
New research demonstrates potential of thin-film electronics for flexible chip design
The mass production of conventional silicon chips relies on a successful business model with large "semiconductor fabrication plants" or "foundries." New research by KU Leuven and imec shows that this "foundry" model can also be applied to the field of flexible, thin-film electronics. Adopting this approach would give innovation in... Read more
-
Researchers develop tiny chip that can safeguard user data while enabling efficient computing on a smartphone
Health-monitoring apps can help people manage chronic diseases or stay on track with fitness goals, using nothing more than a smartphone. However, these apps can be slow and energy-inefficient because the vast machine-learning models that power them must be shuttled between a smartphone and a central memory server.... Read more
-
Researchers develop energy-efficient probabilistic computer by combining CMOS with stochastic nanomagnet
Researchers at Tohoku University and the University of California, Santa Barbara, have unveiled a probabilistic computer prototype. Manufacturable with a near-future technology, the prototype combines a complementary metal-oxide semiconductor (CMOS) circuit with a limited number of stochastic nanomagnets, creating a heterogeneous probabilistic computer.... Read more
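The building block of such a machine is the probabilistic bit, or p-bit, which fluctuates randomly between 0 and 1 with a bias set by its input; that is the behavior a stochastic nanomagnet supplies in hardware. The sigmoid activation below is a common modeling choice for p-bits, not a detail taken from the article:

```python
import math
import random

def p_bit(input_signal):
    """A probabilistic bit: outputs 1 with probability
    sigmoid(input), else 0. A strongly positive input pins it near
    1, a strongly negative input near 0; zero input is a fair coin."""
    return 1 if random.random() < 1 / (1 + math.exp(-input_signal)) else 0

random.seed(7)
rates = {bias: sum(p_bit(bias) for _ in range(1000)) / 1000
         for bias in (-4.0, 0.0, 4.0)}
print(rates)  # roughly 0.02, 0.5, and 0.98 for the three biases
```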
-
Taichi: A large-scale diffractive hybrid photonic AI chiplet
A combined team of engineers from Tsinghua University and the Beijing National Research Center for Information Science and Technology, both in China, has developed a large-scale diffractive hybrid photonic AI chiplet for use in high-efficiency artificial general intelligence applications. Their paper is published in the journal Science.... Read more
-
RVAM16: A low-cost multiple-ISA processor based on RISC-V and ARM Thumb
The increasing demand in the embedded field has led to the emergence of several impressive Instruction Set Architectures (ISAs). However, when processors migrate from one ISA to another, software compatibility issues are unavoidable.... Read more
-
Zap! California startup touts its new battery technology as a fast-charging 'universal adapter'
Officials at a startup based in Carlsbad, California, expect a battery technology they have engineered will transform the way e-bikes and electric-powered hand-held tools are charged. And once it's scaled up, they believe the technology will reshape even more sectors of the economy.... Read more
-
Computer scientists discover gap in the latest security mechanisms used by some chips
Over the past few years, hardware manufacturers have developed technologies that ought to make it possible for companies and governmental organizations to process sensitive data securely using shared cloud computing resources. Known as confidential computing, this approach protects sensitive data while it is being processed by isolating it in an... Read more
-
Proof-of-principle demonstration of 3D magnetic recording could lead to enhanced hard disk drives
Research groups from NIMS, Seagate Technology, and Tohoku University have made a breakthrough in the field of hard disk drives (HDD) by demonstrating the feasibility of multi-level recording using a three-dimensional magnetic recording medium to store digital information.... Read more
-
Researchers develop a novel ultra-low-power memory for neuromorphic computing
A team of Korean researchers has developed a new memory device that, thanks to its low processing cost and ultra-low power consumption, can replace existing memory or be used to implement neuromorphic computing in next-generation artificial intelligence hardware.... Read more
-
Researchers 3D print key components for a point-of-care mass spectrometer
Mass spectrometry, a technique that can precisely identify the chemical components of a sample, could be used to monitor the health of people who suffer from chronic illnesses. For instance, a mass spectrometer can measure hormone levels in the blood of someone with hypothyroidism.... Read more
-
Flexible microspectrometer for mobile applications
Researchers at the Fraunhofer Institute for Applied Optics and Precision Engineering IOF have developed a very compact spectrometer module. It maps spectra from 39 optical fibers onto one camera sensor in a small space. This is made possible by a special micro-optical system. The technology, which has potential for applications... Read more
-
A helmet with a vibration sensor for excavator drivers
Fraunhofer researchers have developed a helmet with an integrated acceleration sensor for drivers of construction vehicles. The helmet sensor measures harmful vibrations affecting the body, and the accompanying software analyzes the sensor signals and displays the resulting strain on the wearer, so that appropriate relief measures can be taken. A flexible... Read more
-
A new strategy for fabricating high-density vertical organic electrochemical transistor arrays
Organic electrochemical transistors (OECTs) are an emerging class of transistors based on organic semiconducting materials known for their ability to modulate electrical current in response to small changes in the voltage applied to their gate electrode. Like other electronics based on organic semiconductors, these transistors could be promising for the... Read more
-
Research team develops next-generation semiconductor memory that operates in extreme environments
Researchers have developed a new manufacturing technology that enables the production of high-quality oxide films and effective patterning at low temperatures, and have used it to fabricate non-volatile resistive random access memory. By overcoming the shortcomings of existing manufacturing technologies, the approach is expected to enable next-generation computing systems and memories with... Read more
-
Amazon bets $150 billion on data centers required for AI boom
Amazon.com Inc. plans to spend almost $150 billion in the coming 15 years on data centers, giving the cloud-computing giant the firepower to handle an expected explosion in demand for artificial intelligence applications and other digital services.... Read more
-
Climate change puts global semiconductor manufacturing at risk. Can the industry cope?
Semiconductors are the basic building blocks of microchips. These technological marvels are in everything from lightbulbs and toothbrushes to cars, trains and planes, not to mention the vast array of electronics that have become integral to many people's daily lives.... Read more
-
Semiconductors at scale: New processor achieves remarkable speedup in problem solving
Annealing processors are designed specifically for addressing combinatorial optimization problems, where the task is to find the best solution from a finite set of possibilities. This holds implications for practical applications in logistics, resource allocation, and the discovery of drugs and materials.... Read more
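The kind of search an annealing processor accelerates can be sketched in software with classic simulated annealing; the toy number-partitioning example below is an illustration of the general technique, not the processor's actual algorithm.

```python
import math
import random

def anneal_partition(numbers, steps=20_000, t0=10.0):
    """Split `numbers` into two sets with nearly equal sums. Random
    sign flips are always accepted when they shrink the imbalance,
    and accepted with shrinking probability otherwise as the
    temperature cools (plain simulated annealing)."""
    signs = [random.choice((-1, 1)) for _ in numbers]
    cost = abs(sum(s * n for s, n in zip(signs, numbers)))
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9           # linear cooling schedule
        i = random.randrange(len(numbers))
        signs[i] = -signs[i]                         # propose a single flip
        new_cost = abs(sum(s * n for s, n in zip(signs, numbers)))
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost                          # accept the move
        else:
            signs[i] = -signs[i]                     # reject: undo the flip
    return cost

random.seed(0)
nums = [random.randrange(1, 100) for _ in range(30)]
best = anneal_partition(nums)
print(best)  # imbalance of the best partition found, typically near 0
```

Dedicated annealing hardware runs many such flips in parallel in the physics of the device itself, which is where the reported speedups come from.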
-
Thin, bacteria-coated fibers could lead to self-healing concrete that fills in its own cracks
Some say there are two types of concrete—cracked and on the brink of cracking. But what if when concrete cracked, it could heal itself?... Read more
-
Nvidia unveils higher performing 'superchips'
Nvidia on Monday unveiled its latest family of chips for powering artificial intelligence, as it seeks to consolidate its position as the major supplier to the AI frenzy.... Read more
-
What we know so far about the rumored Apple smart ring
Samsung officially announced the launch of a new smart ring-shaped wearable device, Galaxy Ring, as part of its Galaxy Unpacked event earlier this year. The ring, expected to be on sale in late summer 2024, will be able to monitor the user's health parameters and provide insights based on the... Read more