New York: A team of US researchers, including one of Indian origin, is developing smartphone technologies that will use current 5G and emerging 6G wireless systems to “view” and map the surrounding environment.
The mapping of a user’s surroundings will improve communication by identifying obstacles that may interfere with radio frequency signals.
“Mobile wireless devices have gradually transformed from mere communications devices into powerful computing platforms with a multitude of sensors, such as cameras and radars,” said Harpreet Dhillon of Virginia Tech’s College of Engineering.
“In fact, when we shop for a new phone, the main considerations are its camera quality, processing speed, memory, and sensors, whereas hardly anyone checks its frequency bands.”
“Since this is one of the first efforts to do a comprehensive analysis of vision-guided wireless systems, this is expected to have a significant impact on future generations of wireless,” Dhillon said in a statement.
Professors Walid Saad and Harpreet Dhillon of the College of Engineering have been awarded a $1 million grant to develop vision-guided wireless systems.
Researchers at the Bradley Department of Electrical and Computer Engineering (ECE) want to take the multi-use function of cellular devices even further with the idea of a “vision-guided wireless system.”
“In 6G, we talk about high-frequency bands like terahertz,” said Saad. “These high frequencies can deliver high rates and high bandwidth, but the problem is that the signals are susceptible to blockages, much more so than low frequencies. Those frequencies can be blocked by things like your arms moving, or someone standing in a room with you.”
Although these blockages might seem irritating, they could actually be the key to helping researchers improve communication.
“If a communication system operating at sub-THz bands fails because the signal is blocked, we can still use that information to sense the environment and know that there was an obstacle in the first place,” Saad noted.
“Then, with both situational awareness and other side information – like a picture of the room – we can use that multimodal data to communicate better.”
These vision-guided systems have several potential applications, such as enhancing the performance of tomorrow’s 6G wireless systems, creating more advanced and interactive gaming environments, and pushing the boundaries of extended reality and the metaverse.
–IANS