Pittsburgh research team studies big brain data for complex brain disorders


The National Science Foundation BIGDATA program awarded $1,200,000 to a research team led by the University of Pittsburgh Swanson School of Engineering to study big brain data related to complex brain disorders and to design new algorithms that address computational challenges in multi-site collaborative data mining.

Heng Huang, the Swanson School's John A. Jurenko Professor of Computer Engineering, is principal investigator of the study, "Asynchronous Distributed Machine Learning Framework for Multi-Site Collaborative Brain Big Data Mining." Huang currently leads seven NSF projects and an NIH R01 project on machine learning, big data mining, computational neuroscience, health informatics, and precision medicine.

"Research in emerging fields, such as brain imaging genomics and human connectomics, holds great promise for a systems biology study of the brain," said Huang. "This research can help us better understand complex neurobiological systems, from genetic determinants to the interplay between brain structure, connectivity, function, and cognition."

While researchers currently have access to brain data collected through a series of funded projects, they have been unable to obtain additional data held by other local institutes because data privacy and security concerns prevent cross-institutional distribution. In this project, Huang will create a framework that addresses these issues and facilitates the sharing of data and computing resources.

"In collaborative data analysis, the participating institutes keep their own data, which are analyzed and computed locally, and only share the computed results via communicating with the server," explained Huang. "The server communicates with all institutes and updates the computational model such that the trained machine learning models indirectly use all data and are shared to all institutes."
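The architecture Huang describes can be illustrated with a minimal sketch of federated model averaging: each institute runs a few training steps on data that never leaves the site, and the server averages the returned models. This is an illustrative example only, not the project's actual algorithms; the function names and the toy least-squares task are hypothetical.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One institute's local computation: a few gradient steps of
    least-squares regression on private data (X, y) kept on site."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w  # only this computed result is shared, never X or y

def server_round(global_w, institutes):
    """Server step: collect each site's locally trained model and average
    them, so the shared model indirectly reflects every site's data."""
    updates = [local_update(global_w, X, y) for X, y in institutes]
    return np.mean(updates, axis=0)

# Toy setup: three "institutes" each hold private samples of one linear task.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
institutes = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    institutes.append((X, X @ true_w + 0.01 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(100):  # repeated communication rounds with the server
    w = server_round(w, institutes)
```

In a real deployment the `local_update` calls would run on separate machines and only the weight vectors would cross institutional boundaries, which is what allows the trained model to use all of the data indirectly while each site retains custody of its own records.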

According to Huang, most machine learning algorithms were not designed for such a distributed architecture because of the difficulty of designing efficient algorithms and establishing their theoretical foundations. This is the first project to create algorithms of this type for the study of brain imaging genomics and human connectomics. The goal is to alleviate these computational challenges and enable investigators in neuroimaging, genomics, neuroscience, and other brain-related disciplines to further their research securely and more efficiently.

"The result of our project will be new distributed machine learning algorithms with theoretical foundations that can be used for multi-site collaborative big brain data mining, creating large-scale computational strategies and effective software tools," said Huang. "We hope that this work will help researchers harness the full potential of big brain data, potentially leading to the next major brain science discoveries."
