Tag: Research Collaboration

  • Is This Drone the Future of Wilderness Rescue?

    Roughly 11 million people visit Hong Kong’s country parks each year. Most do so safely, returning home with stories and photos of the stunning beauty of nature. But what about the occasional hiker who wanders off the beaten path and into danger?

    Although rare, such instances are more common than you might think. Hong Kong may be one of the densest urban centres in the world, but some 40% of its land is occupied by rugged, mountainous country parks.

    The size of these parks, combined with their dense vegetation and tree cover, poses significant challenges to search and rescue efforts: In 2023, a high schooler was found alive after a weeklong search of Ma On Shan Country Park – an effort that saw rescuers try everything from dogs to aerial photography.

    But what if there was another option? Intrigued by the challenges involved in using drones for search and rescue, Weiying Hou, a PhD student in Professor Chenshu Wu’s lab at the University of Hong Kong, has come up with a novel solution: outfitting an off-the-shelf drone with a device known as a Luneburg lens, allowing it to home in on and connect to the victim’s Wi-Fi signal from hundreds of metres away, regardless of forest cover. 

    If that sounds simple, it’s not. The breakthrough took years of hard work and tinkering to achieve, as Hou, Wu, and another PhD student, Luca Jiang-Tao Yu, reengineered network cards, mastered the intricacies of 3D printing, and learned more than they ever thought they’d need to know about how to pilot a drone. 

    Now that work is paying off: The team’s device has been accepted by the prestigious MobiCom 2026 conference. Perhaps more importantly to Hou and the team, it’s also attracted interest from wilderness search and rescue groups.

    “If more people can be saved using our technology, our hard work will have been worth it,” Hou says.

    Hear from Professor Wu and the team as they talk about their hopes for the drone.

    Humble beginnings

    While the Wi-Fi drone may seem like a natural fit for Professor Wu’s HKU AIOT Lab, the original outline of the idea came not from an engineer but from an undergraduate at the University’s School of Nursing.

    The student, who was active in wilderness search and rescue groups, had tried cobbling together their own version using store-bought parts, but lacked the know-how and technical vision to make it work. After the idea was referred to HKU’s Inno Wing, which aims to give engineering students the resources and support needed to explore ideas with potential real-world applications, it eventually made its way to Professor Wu’s lab.

    At first glance, the concept was straightforward – and a good match for Professor Wu’s expertise in the Internet of Things. The drone would be programmed to mimic a missing person’s home Wi-Fi network – a unique signal that simplifies identification and does not require line of sight. Once a connection is established, the drone could quickly triangulate the phone’s position and move overhead.
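    The localisation step can be pictured in a few lines of code. The sketch below is purely illustrative and is not the team’s published method: it assumes a standard log-distance path-loss model to turn received signal strength (RSSI) into distance, then trilaterates the phone’s position from RSSI readings taken at several drone positions using least squares.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40, path_loss_exp=2.5):
    # Log-distance path-loss model: rssi = tx_power - 10 * n * log10(d).
    # tx_power_dbm and path_loss_exp are illustrative values, not measured ones.
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(positions, distances):
    # |x - p_i|^2 = d_i^2 for each measurement; subtracting the first
    # equation linearises the system, which we then solve by least squares.
    p0, d0 = positions[0], distances[0]
    A, b = [], []
    for p, d in zip(positions[1:], distances[1:]):
        A.append(2 * (p - p0))
        b.append(d0**2 - d**2 + np.dot(p, p) - np.dot(p0, p0))
    est, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return est
```

    In practice, RSSI is noisy and forest cover distorts the path-loss model, which is part of why a directional lens that measures bearing, rather than distance alone, makes such a difference.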

    The devil was in the details, however. In order for the drone to be effective, it needed to be able to search for a signal in all directions. Hou started by designing a cone – similar to a satellite dish – but it couldn’t provide the 360° coverage he needed. He would need to restart his search.

    “This was before ChatGPT,” he recalls with a laugh.

    • Professor Chenshu Wu of CDS explains how the wilderness rescue drone works
    • The team poses for a photo.
    • The team poses for a photo with the wilderness rescue drone.

    Play ball

    After six months of research, Hou found the solution on Wikipedia: a Luneburg lens, a spherical device that would allow the drone to pinpoint signals regardless of what direction it was facing. 

    That was just the beginning, according to Professor Wu. His lab had no experience working with Luneburg lenses, and Hou had to learn everything about them from scratch. “We were like primary school students,” Professor Wu recalls. 

    First came a crash-course in 3D printing, so Hou could prototype his lens. Then there were engineering problems to tackle: The lens had to be light, to avoid draining the drone’s battery. It also needed to be able to pinpoint a signal instantly, in case the victim’s phone was dying, and then feed the directional data back into the drone controller to form a closed control loop. And how do you attach a large sphere to a drone without impacting its flight-worthiness? 
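    The closed control loop can be illustrated with a toy proportional controller, a hypothetical sketch rather than the team’s actual firmware: the lens reports a bearing to the strongest signal, and each control step turns the drone a fraction of the way toward it, wrapping angles so it always takes the shorter turn.

```python
def steer_toward_signal(heading_deg, bearing_deg, gain=0.5):
    # Wrap the angular error into [-180, 180) so the drone turns the short
    # way round, then close the loop by correcting a fraction (gain) of the
    # error on each control step.
    error = (bearing_deg - heading_deg + 180) % 360 - 180
    return (heading_deg + gain * error) % 360
```

    Repeated every control cycle, the heading converges on the measured bearing; a real controller would also have to cope with noisy bearings, wind, and the drone’s own dynamics.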

    Other challenges followed. Network cards had to be reprogrammed to work in concert – a job Professor Wu called “not intellectually challenging, but extremely tedious” – and a motherboard programmed to operate them.

    Finally, the team had a working prototype. But when they sent it up into the sky, they received another nasty shock: the drone crashed on its very first test flight.

    Working the problem

    The failure, which Professor Wu estimates set them back HKD60,000, is just part of the engineering process, he says. 

    The team quickly rebuilt their device atop a bigger, more reliable drone. This time, the flight went smoothly, validating Hou’s years of work. In the months since, their design has been accepted by MobiCom and tested by mountain search and rescue teams in Hong Kong. Hou also won a presentation award at the Second Low-Altitude Summit.

    The next step is upgrading the drone to support more wireless signals and dedicated tags. What won’t change is the team’s ethos: Everything is open-source, and Hou’s long-term goal is for mountain rescue teams around the world to be able to hack together their own versions using readily available components and free software.

    “Everything is designed to be easy to implement,” he notes. “The lens can be replicated by anyone with a commercial 3D printer.”

    For Wu, the drone’s success could open new pathways in the low-altitude economy, a key initiative for Hong Kong in the coming years. The underlying tech also has exciting potential applications in ocean monitoring, drone tracking, and other fields.

    It’s also a validation of his lab’s approach to engineering. “We work on real problems,” he says. “In engineering, there’s an emphasis on novelty, but you always need to be thinking about how your work translates to the real world. What is the problem you’re solving? That’s what resonates.”

    Hou agrees. “A good research problem is a real problem,” he says. While he acknowledges the drone needs some tweaks, he remains committed to his original mission.

    “Hiking is extremely popular in Hong Kong, and sometimes people get lost,” he says. “I just wanted to make hiking in Hong Kong safer.”

  • How AI Is Rewriting the Rules of Art Conservation

    “It opens up all these new horizons for art history, for connoisseurship, and for how the discipline is going to continue to form.” 
    — Professor Marc Walton 

    You’ve heard of using AI to make art, but an interdisciplinary team of researchers at the University of Hong Kong is now tackling a far more complex problem: applying AI to the field of art conservation. Their work could have outsize ramifications for the world’s art institutions, expanding access to cutting-edge art conservation tools, cutting the time needed for materials analysis and allowing even small museums to protect and preserve their collections. 

    A Hands-On Approach 

    Tucked away in a corner of the Hong Kong University Museum and Art Gallery is the only on-site university museum research lab in Asia. There, a team of chemists, conservation scientists, and students under Professor Marc Walton of HKU’s Museum Studies programme and the Department of Chemistry’s Dr Kenneth Ng are developing, building, and experimenting with new instrumentation that could radically lower the barriers to characterising the materials comprising objects of art and archaeology.

    Looking around the lab, it’s hard to imagine that, a little over a year ago, almost none of this infrastructure existed. Before Walton, who was previously Head of Conservation and Research at M+, joined HKU in 2024, he had never met Dr Ng. They were brought together by another new arrival to the university, Chemistry Professor Jay Siegel, who recognised the pair’s shared interest in both chemistry and mechanical tinkering. 

    For Professor Siegel, the collaboration offered a solution to two longstanding issues in university education: how to break down barriers between disciplines and give students hands-on experience with real-world applications. 

    “(The students) are very well trained, they know their theories, but they’ve never touched an artefact before.” 
    — Dr Kenneth Ng 

    Soon, what began as a series of informal conversations morphed into something very real, with Walton joining HKU, then teaming up with Ng and the Chemistry department to bring their vision to life. “Jay recognised that Kenneth and I were thinking along the same lines,” says Walton. “He couldn’t have been more correct. This is the type of cross-fertilisation you normally wouldn’t think about: bringing a chemist together with someone from the humanities.” 

    Bridging Art and Science 

    Viewed from the front, the UMAG lab’s “Franken-camera” doesn’t look particularly unusual. It’s only when you circle around back that the moniker’s logic starts to come into focus, revealing uncovered wires and chips that have been grafted on by the team to improve performance.

    The Franken-camera is far from the only curious-looking tool in the lab. The team uses a wide variety of analysis instruments, from off-the-shelf items like the handheld XRF spectrometer – designed to mimic the look of a Star Trek phaser – to modified microscopes. Their commitment to home-brewed tech isn’t just about performance or customisability; by opting to custom-build tools, they give students more opportunities to practise a variety of skills that could be useful for their futures, such as the integration of code with hardware.

    “The one big hurdle that we face in teaching is that ‘fancy’ microscopes are usually very, very expensive and ‘thou shalt not touch it.’” 

    — Dr Kenneth Ng 

    One of the biggest new tools in their toolkit is artificial intelligence. While the underlying machine-learning tech has been around for decades, the proliferation of LLMs and AI applications is drastically shortening the time spent on materials analysis – a key step in the process of understanding and conserving a piece of art. 

    As an example, Walton points to the traditionally time-consuming task of point analysis of artefacts. AI is allowing the team to take a handful of data points – some with detailed spectroscopic information, others that cover a larger portion of the artefact and show spatial details – and merge the sets to produce an image cheaply and quickly. 
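    One way to picture that kind of fusion, as a hypothetical sketch rather than the team’s actual pipeline: interpolate the handful of measured spectra across the wide-field image, weighting each measured point by how close, and how visually similar, each pixel is to it. Everything here, including the similarity weighting, is an assumption for illustration.

```python
import numpy as np

def fuse(points_xy, spectra, image):
    # Inverse-distance interpolation in a joint (position, intensity) feature
    # space: pixels that look like a measured spot in the wide-field image
    # borrow more heavily from that spot's detailed spectrum.
    h, w = image.shape
    out = np.zeros((h, w, spectra.shape[1]))
    point_vals = image[points_xy[:, 1], points_xy[:, 0]]
    for y in range(h):
        for x in range(w):
            d2 = ((points_xy[:, 0] - x) ** 2 + (points_xy[:, 1] - y) ** 2
                  + 50.0 * (point_vals - image[y, x]) ** 2)
            wgt = 1.0 / (d2 + 1e-9)
            out[y, x] = (wgt[:, None] * spectra).sum(0) / wgt.sum()
    return out
```

    The result is a full spectral image built from only a handful of expensive point measurements, which is the cost-saving logic Walton describes.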

    Dr Kenneth Ng inspects one of the lab's instruments

    Expanding Access Through AI

    “AI allowed us to do that. Before, it was really difficult to be able to fuse these different things together, to be able to create something that combines the best of both worlds.” 
    — Professor Marc Walton 

    Both Walton and Ng spotlight AI’s impact in expanding access to art conservation. Traditional conservation characterisation methods are expensive and difficult to use, leaving institutions across the globe struggling to balance their desire to augment the value of their collections through study and treatment with the associated costs of bringing science into the museum.  

    If the cost of conservation tools and processes could be brought down, the thinking goes, then many of these problems could be solved with some basic technical knowledge and a little ingenuity.  

    “These are things that any museum around the world, any person that’s interested in duplicating our work, could conceivably be able to do it without spending a whole lot of money,” says Walton. 

    A student uses an AI-powered tool to perform analysis on a sample work of art.

    The Future as Blank Canvas  

    The team cautions that AI isn’t a cure-all, and that many of the classical methods of conservation continue to work well. Rather than completely overturning the field’s received knowledge, they’re focused on teaching students how to develop and use new tools while maintaining a critical mindset. 

    “It’s very important for students to know the nuts and bolts of AI rather than just using it as a black box,” says Ng, using a common metaphor for the opacity of AI algorithms. “Especially as a scientist, you really need to put on that scientist hat and differentiate whether it’s hallucinating, or whether it’s giving you the right answers.” 

    Still, they remain excited about the tech’s potential for conservation, with Walton pointing to possibilities, not just for the museum world, but also for lowering the barriers to connoisseurship and changing the field of art history. 

    That’s not all: Asked whether AI can bridge the gap between objective and subjective analysis, Walton pauses for a moment before turning philosophical. “I always think the objective and subjective come together, because the agency of the artist is in the materials, which we can be objective about,” he says. “But really, what we want to understand is the subjective part of it. So, we’re using science as a tool to assess the subjective.”

    To learn more about how Professor Walton and Dr Ng are using AI to rewrite the rules of art conservation, watch the video below:

  • Can Satellites Point the Way to More Liveable, Sustainable Cities?

    In just 40 years, China went from a predominantly rural society to a highly urbanised one. What happened? And what lessons can other urbanising regions around the globe draw from that experience? 

    University of Hong Kong Professor of Landscape Architecture Bin Chen believes we can find the answers in satellite data. An expert in remote sensing and Director of HKU’s Future Urbanity and Sustainable Environment (FUSE) lab, he’s using satellite imagery to track everything from the historical growth of cities to emerging problem areas like urban heat, air pollution, and the loss of green and blue space.

    It’s a cutting-edge field – and one with implications far beyond China. Done correctly, it could provide one of the first-ever windows into how cities grow and evolve in the real world. This emerging “urban intelligence” will in turn have major ramifications for rapidly urbanising countries across the Global South, allowing them to avoid mistakes made by other cities while empowering them to create more equitable built environments for all residents.  

    We asked Professor Chen to share his thoughts on the nature of his work, the importance of remote sensing and AI to urban planning, and how the world can build more equitable cities. 

    Professor Bin Chen points to some of his research on satellites and urban intelligence

    The “intangibles” of sustainability 

    Professor Chen characterises his work as looking at the “tangibles and intangibles” of the built environment.  

    The tangibles are relatively straightforward. But the intangibles – air, heat, sunlight, light pollution, shade, and noise – play just as important a role in our lives and health, while being much harder to quantify. 

    That’s where satellites can play a role, Professor Chen says, as new data and methods allow researchers to look at cities on a granular level, identifying problematic heat islands, light pollution, and air quality. 

    “It’s not just about climate. It’s also about liveability.” 

    – Professor Bin Chen

    Importantly, these are also problems of equity and ESG, with poorer residents often having to go without access to parks, shade, and clean air.  

    “Take shade, for example,” Professor Chen says. “In a city of high-rises, where you have people living in subdivided units with no access to sunlight year-round, shade is a social issue.”  

    Even if every building in a given district meets regulations, it can be hard to tell what the overall outcome will be – a problem satellite imagery and AI modelling can help solve.

    While Professor Chen’s team can identify the issue, he hopes work on the solutions will be an interdisciplinary affair. “We need to bring experts together,” he says. “We need knowledge from social economics, environmental studies, data science, and urban planning.” 

    Professor Bin Chen poses with a sign for the Department of Landscape Architecture and HKU

    Urban intelligence 

    The long-term goal is to develop what Professor Chen calls “urban intelligence” – a deeper understanding of how people, cars, buildings, and the environment interact – then apply that knowledge to developing regions around the world.

    Perhaps surprisingly, given their importance in modern life, the actual mechanisms by which cities grow are not always well understood. It wasn’t until 2008, when satellite data became more widely accessible, and 2015, when large-scale cloud computing and machine learning caught up, that researchers could closely examine the growth of modern cities.  

    “AI makes everything more efficient.” 

    – Professor Bin Chen

    Professor Chen points to Shanghai’s Pudong New Area as a classic example, noting that while policy documents offer a window into its growth, remote sensing technology allows researchers a seamless, transparent view of how the district grew into a global financial capital. 

    Leading the way 

    But taking advantage of these advances will require more – and more open – data.  

    Still, researchers can make significant progress via virtual collaborations. A few years ago, Professor Chen worked with HKU Professor Peng Gong and partners from 23 universities and institutes across China on a database of Chinese urban land use.  

    By mixing satellite data with on-the-ground verification, they were able to create the country’s first nationwide parcel-level essential urban land use categories map, allowing researchers to easily compare cities around China, from Beijing to Shenzhen and Wuhan. 

    “Remote sensing is becoming more and more powerful. We used to have to focus on individual cities, but now we can look at entire countries, even the whole globe.” 

    – Professor Bin Chen

    That empowered numerous follow-up studies, and the team hopes to expand their map to the rest of the world in the coming years.  

    “We’re entering a new stage, going beyond remote sensing with multimodal data,” says Professor Chen. “But pushing the field forward will require more data sharing.” 

    • Professor Bin Chen, an expert on satellites, remote sensing, and urban intelligence, speaks to a student
    • Professor Bin Chen, an expert on satellites, remote sensing, and urban intelligence, poses in front of a sign for his Future Urbanity and Sustainable Environment Lab
    • Professor Bin Chen, an expert on satellites and urban intelligence, poses with some of his awards
  • Teaching Machines to Think Quantumly: Qi Zhao on the Frontier of AI-Driven Computing

    “Quantum computers don’t just calculate. They learn from the rules of nature itself.” — Prof. Qi Zhao

    Artificial intelligence is everywhere — in our phones, our cities, and the tools we use to think.

    But for Professor Qi Zhao at The University of Hong Kong’s School of Computing and Data Science (CDS), the next leap in AI may come from a place far smaller than any silicon chip.

    His research explores how quantum physics and machine learning can work together to create a new kind of intelligence — one that learns the way the universe learns.


    From Theory to Computation

    Zhao trained as a quantum information theorist, studying how data behaves when stored in particles rather than bits.
    At HKU CDS, he leads a group that builds hybrid computing models combining classical algorithms with quantum processors.

    His goal is simple to state but hard to achieve: use quantum systems to make AI faster, smarter, and more energy-efficient.

    “Classical computers follow fixed paths,” he explains. “Quantum computers can explore many paths at once. That difference changes how learning works.”


    Reimagining Computation

    Traditional AI trains neural networks through repetition — adjusting parameters until patterns emerge.
    Quantum computers take a different approach.

    They rely on variational quantum algorithms, where a small quantum circuit learns by tuning itself with help from a classical controller.

    Think of it as teamwork: the quantum part handles exploration; the classical part handles evaluation. Together, they solve problems that would take ordinary machines far longer to compute. Zhao’s team studies how this cooperation could transform optimization tasks, from image recognition to material design.
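    A toy version of this division of labour can be simulated on an ordinary laptop. The sketch below is purely illustrative and is not Zhao’s research code: a one-parameter rotation stands in for the quantum circuit, its ⟨Z⟩ readout is the cost to minimise, and a classical gradient-descent controller tunes the parameter using the parameter-shift rule.

```python
import numpy as np

def expectation_z(theta):
    # The "quantum" part: Ry(theta)|0> = [cos(t/2), sin(t/2)], whose
    # Z expectation value works out to cos(theta).
    return np.cos(theta)

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    # Parameter-shift rule: an exact gradient for Pauli-rotation gates,
    # computed from two extra circuit evaluations rather than backprop.
    return 0.5 * (f(theta + shift) - f(theta - shift))

def train(theta=0.1, lr=0.2, steps=100):
    # The "classical" part: ordinary gradient descent on the readout.
    for _ in range(steps):
        theta -= lr * parameter_shift_grad(expectation_z, theta)
    return theta
```

    The optimum sits at theta = π, where ⟨Z⟩ reaches −1; the classical controller finds it using nothing but circuit evaluations, which is exactly the exploration-versus-evaluation split described above.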


    Quantum Machine Learning in Action

    Inside his lab, AI helps control fragile quantum hardware.

    Algorithms adjust pulse shapes, timing, and temperature to keep qubits stable. The system learns which conditions produce reliable results and adapts automatically when the environment changes. “It’s feedback learning in the truest sense,” Zhao says. “The machine is teaching itself how to stay coherent.”
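    A cartoon of that self-tuning behaviour, offered as a hypothetical illustration rather than the lab’s actual control stack: a greedy hill-climber nudges one pulse parameter at a time and keeps only the changes that the measured fidelity rewards.

```python
import random

def calibrate(measure_fidelity, amp=1.0, step=0.1, iters=200):
    # Feedback learning in miniature: propose a small change to the pulse
    # amplitude, measure the resulting fidelity, and keep only improvements.
    best = measure_fidelity(amp)
    for _ in range(iters):
        trial = amp + random.choice([-step, step])
        fid = measure_fidelity(trial)
        if fid > best:
            amp, best = trial, fid
    return amp
```

    Because the loop reacts only to measurements, it adapts automatically when the environment shifts the fidelity landscape, the same property Zhao describes as the machine teaching itself to stay coherent.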

    These experiments do more than improve performance. They show how AI and quantum physics can enhance each other.
    AI stabilizes quantum devices; quantum mechanics gives AI new mathematical tools for creativity and pattern discovery.


    Learning from Quantum Data

    Zhao believes that the next revolution will come when AI no longer just analyzes quantum data — it learns inside quantum data.

    His group explores models where quantum systems perform the learning directly, finding relationships hidden from classical logic.
    Such systems might recognize molecular structures or financial correlations beyond human intuition.

    “This is where AI stops imitating intelligence,” he explains. “It begins to share it.”


    Mentorship and Collaboration at CDS

    As a mentor, Zhao encourages students to cross boundaries between physics and computer science.
    He collaborates closely with Prof. Giulio Chiribella, Prof. Yuxiang Yang, and Prof. Ravi Ramanathan, creating a bridge between theory, experiment, and data science.

    In class, he simplifies complex formulas into visual intuition. His students learn not only to code algorithms but also to think about why an algorithm works. “The most exciting discoveries,” he says, “often happen when we try to explain them simply.”


    Looking Ahead: The Shape of Quantum Intelligence

    Zhao imagines a future where AI systems powered by quantum hardware design drugs, manage energy grids, or simulate ecosystems in real time.

    These machines will not replace human reasoning; they will extend it.

    “Intelligence isn’t just logic,” he says. “It’s the ability to learn from limited information. That’s what quantum mechanics has been doing for billions of years.”

    In his view, teaching machines to think quantumly is not just about computation — it’s about understanding learning itself.
    And at HKU CDS, that journey has already begun.