New AI model Dev-ResNet accurately monitors embryonic development in real-time


Research led by the University of Plymouth has shown that a new deep learning AI model can identify, from video, what happens and when during embryonic development.

Published today (Wednesday 29 May) in the Journal of Experimental Biology, the study highlights how the model, known as Dev-ResNet, can identify the occurrence of key functional developmental events in pond snails, including heart function, crawling, hatching and even death.

A key innovation in this study is the use of a 3D model that captures changes between video frames, enabling the AI to learn from these motion features rather than from still images alone, as in more traditional approaches.

Because it works on video, Dev-ResNet reliably detects features ranging from the first heartbeat and crawling behaviour through to shell formation and hatching, and it has revealed previously unknown sensitivities of these features to temperature.
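Dev-ResNet's actual architecture is the 3D ResNet described in the paper. As a loose illustration of why spatiotemporal (3D) convolution can detect events such as a first heartbeat that no single still image can reveal, here is a minimal NumPy sketch; the kernel values, array shapes and function names are hypothetical, not taken from the paper:

```python
import numpy as np

def conv3d_valid(video, kernel):
    """Naive 'valid'-mode 3D convolution over a (T, H, W) video volume."""
    t, h, w = video.shape
    kt, kh, kw = kernel.shape
    out = np.zeros((t - kt + 1, h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(video[i:i+kt, j:j+kh, k:k+kw] * kernel)
    return out

# A temporal-gradient kernel: it responds to change between frames,
# exactly the kind of feature a 2D (still-image) kernel cannot see.
temporal_edge = np.zeros((2, 3, 3))
temporal_edge[0] = -1.0 / 9
temporal_edge[1] = 1.0 / 9

# Static "video": the same frame repeated -> zero temporal response.
static = np.tile(np.random.rand(8, 8), (4, 1, 1))

# "Beating" video: intensity pulses over time, like a heartbeat.
beating = np.stack([np.full((8, 8), v) for v in [0.2, 0.8, 0.2, 0.8]])

print(np.abs(conv3d_valid(static, temporal_edge)).max())   # ~0
print(np.abs(conv3d_valid(beating, temporal_edge)).max())  # > 0
```

A real 3D CNN learns many such kernels from labelled clips, but the principle is the same: the filter spans time as well as space, so periodic or transient events become directly detectable.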

While used in pond snail embryos for this study, the authors say the model has broad applicability across all species, and they provide comprehensive scripts and documentation for applying Dev-ResNet in different biological systems.

In future, the technique could be used to help accelerate understanding of how climate change, and other external factors, affect humans and animals.

The work was led by PhD candidate Ziad Ibbini, who studied BSc Conservation Biology at the University before taking a year out to upskill himself in software development and then beginning his PhD. He designed, trained and tested Dev-ResNet himself.

He said: "Delineating developmental events – or working out what happens when in an animal's early development – is so challenging, but incredibly important as it helps us to understand changes in event timing between species and environments.

"Dev-ResNet is a small and efficient 3D convolutional neural network capable of detecting developmental events using videos, and can be trained relatively easily on consumer hardware.

"The only real limitations are in creating the data to train the deep learning model – we know it works, you just need to give it the right training data.

"We want to equip the wider scientific community with the tools that will enable them to better understand how a species' development is affected by different factors, and thus identify how we can protect them. We think that Dev-ResNet is a significant step in that direction."
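As the quote above notes, the main practical hurdle is creating labelled training data. In practice that typically means slicing long timelapse recordings into short labelled clips. The sketch below is a hypothetical illustration of that step, not code from the Dev-ResNet repository; the function name, clip length and stride are assumptions:

```python
import numpy as np

def make_training_clips(video, labels, clip_len=16, stride=8):
    """Slice a long (T, H, W) recording into fixed-length clips.

    `labels` gives one event label per frame; each clip takes the
    label of its centre frame.
    """
    clips, clip_labels = [], []
    for start in range(0, video.shape[0] - clip_len + 1, stride):
        clips.append(video[start:start + clip_len])
        clip_labels.append(labels[start + clip_len // 2])
    return np.stack(clips), np.array(clip_labels)

# 100-frame dummy recording, 32x32 pixels; frames 40 onwards carry
# a hypothetical "heartbeat present" label (1).
video = np.random.rand(100, 32, 32)
labels = np.array([0] * 40 + [1] * 60)
clips, y = make_training_clips(video, labels)
print(clips.shape, y.shape)  # (11, 16, 32, 32) (11,)
```

Each `(clip, label)` pair can then be fed to a 3D CNN; the fixed clip length is what lets the network learn a consistent spatiotemporal receptive field.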

"This research is important on a technological level, but it is also significant for advancing how we perceive organismal development – something that the University of Plymouth, within the Ecophysiology and Development Research Group, has more than 20 years' history of researching.

This milestone would not have been possible without deep learning, and it is exciting to think of where this new capability will lead us in the study of animals during their most dynamic period of life."

Dr. Oli Tills, the paper's senior author and a UKRI Future Leaders Research Fellow

Journal reference:

Ibbini, Z., et al. (2024) Dev-ResNet: automated developmental event detection using deep learning. Journal of Experimental Biology. doi.org/10.1242/jeb.247046.


