
CUHK AI technology navigates microrobot swarm in complex environments inside human body


The research team led by Professor Li Zhang (middle) in collaboration with Professor Qi Dou (second on the right). Other team members include: Mr. Jialin Jiang (first on the left), Dr. Lidong Yang (second on the left) and Dr. Xiaojie Gao (first on the right).


Bees are extremely intelligent insects, able to communicate with the rest of their swarm to orchestrate collective movement through complex environments. An engineering team at The Chinese University of Hong Kong (CUHK) has recently built an artificial intelligence (AI) navigation system that can make millions of microrobots behave like a bee swarm, autonomously reconfiguring their motion and distribution in response to environmental changes, such as routing around obstacles inside the human body. The findings, reported in Nature Machine Intelligence, bring clinical applications of microrobots a step closer.



Microrobots have been proposed as a medium for targeted drug delivery inside the body, particularly to narrow, confined spaces and hard-to-reach tissues. Because each individual microrobot has limited capacity and functionality, thousands or even millions must be aggregated to perform such tasks, and the swarm is usually controlled by an external magnet or electromagnet. However, the human body presents complicated and changing environments, such as fluids with contrasting characteristics and winding, branching vessels, which make manual manipulation of microrobots extremely difficult and task failure correspondingly likely.


Professor Li Zhang from CUHK’s Department of Mechanical and Automation Engineering said, “Collective movement in schools of fish and flocks of birds frequently involves switching their shapes and structures to adapt to different situations and environments, such as avoiding terrain or obstacles, or attacking an enemy. We have yet to find a way to make microrobots themselves intelligent, but we can use an AI control system to externally manipulate their collective motion, making sure they do not get lost or stuck inside our bodies.”


By integrating deep learning algorithms with years of research data on microrobot navigation, Professor Zhang’s team, in collaboration with Professor Qi Dou’s group from CUHK’s Department of Computer Science and Engineering, has developed an AI system that automates control of the swarms. The system obtains vision from imaging tools such as ultrasound and X-ray fluoroscopy to identify obstacles inside the body and plan, in real time, the best possible route for microrobot delivery (Video A shows the AI system adjusting the path of the microrobot swarm). It can also control magnets or electromagnets to navigate the swarm and change its formation, increasing the success rate of reaching its destination.
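The article does not specify the team’s planning algorithm, but the perception-then-planning step it describes can be illustrated with a minimal sketch: treat the imaging output as an occupancy grid (free space vs. obstacles) and search it for a route. The grid, coordinates, and the breadth-first search below are illustrative assumptions, not the paper’s method.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None if the
    goal is unreachable. A real system would replan as the imaging updates.
    """
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}          # visited set doubling as a back-pointer map
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the back-pointers to reconstruct the route.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# Toy "vessel" map: an obstacle wall with a single gap on the right.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 1],
]
path = plan_path(grid, (0, 0), (2, 0))
```

In practice the planned waypoints would then be translated into commands for the magnetic actuation system that drives the swarm.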



The effectiveness and reliability of the microrobotic AI navigation system were examined in a virtual placenta, simulating the complex blood vessel structure that microrobots may face. (Video B shows a microrobot swarm navigating in a web of tunnels towards a destination.)



The research team has further proposed an autonomy framework for microrobot swarms as a basis for future studies. The framework consists of five levels, from 0 to 4, each indicating a higher degree of autonomy. The autonomy is realised externally, and each level requires different key external system components. Level 1 is defined as automated swarm control in static environments, requiring only magnetic actuation and an imaging system, while a microrobot swarm with level 4 autonomy can navigate entirely without human intervention once field-of-view (FOV) control and an AI system are added.
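The framework above could be encoded as a simple lookup, recording only what this article states. The component lists for levels 1 and 4 come from the text; levels 2 and 3 are left blank because the article does not describe them, and the level 0 description is an assumption by analogy with other autonomy scales.

```python
# Sketch of the proposed five-level autonomy framework as described here.
AUTONOMY_LEVELS = {
    0: "no autonomy (fully manual control)",  # assumption, not stated in the article
    1: "automated swarm control in static environments",
    2: None,  # not specified in this article
    3: None,  # not specified in this article
    4: "autonomous navigation without human intervention",
}

# Required external components per level, where the article names them.
# Assumes level 4 builds on the level 1 components plus FOV control and AI.
REQUIRED_COMPONENTS = {
    1: {"magnetic actuation", "imaging system"},
    4: {"magnetic actuation", "imaging system", "FOV control", "AI system"},
}
```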

Professor Zhang envisages that the AI navigation system will one day enable surgeons to deploy microrobots for therapeutic applications such as targeted drug delivery in the human body without specialised training.

