Alexandria Digital Research Library

Expeditions in Neurocartography: Mappings between Structural and Functional Pathways in Artificial and Cognitive Neural Systems

Author:
Hermundstad, Ann M.
Degree Grantor:
University of California, Santa Barbara. Physics
Degree Supervisor:
Jean Carlson
Place of Publication:
[Santa Barbara, Calif.]
Publisher:
University of California, Santa Barbara
Creation Date:
2012
Issued Date:
2012
Topics:
Physics, Theory, Biology, Neuroscience, and Biophysics, General
Keywords:
Neuroimaging
Neural Network Models
Structure-Function Relationships
Genres:
Online resources and Dissertations, Academic
Dissertation:
Ph.D.--University of California, Santa Barbara, 2012
Description:

Neural systems are inherently complex and dynamic in nature, exhibiting a vast array of functions that range from low-level cellular interactions to high-level cognitive processes. These functions are supported by anatomical interactions that span a similarly wide range of scales, from the synapses between individual neurons to the extended fiber pathways that traverse the brain. Both this structural architecture and the function that it supports are continually interacting with and adapting to the external environment. An understanding of both the capabilities and limitations of neural systems therefore requires integrative approaches for assessing interactions between structural architecture, dynamic functional activity, and environmental variability.

In this dissertation, we apply theoretical, computational, and data-driven techniques to the study of both artificial and cognitive neural systems with the goal of mapping between patterns of structural and functional connectivity across multiple scales of resolution. Theoretical and computational analyses of small artificial networks provide insight into the limitations of different architectural motifs in facilitating the performance of low-level functions, while data-driven analyses of large-scale human brain networks provide insight into the overlap of high-level functions supported by a common structural architecture.

Computational neural networks provide a powerful framework in which to systematically probe the dependence of system function on underlying architecture. We use neural networks to assess the dependence of competitive learning and memory processes on structural variations. By comparing the performance of parallel and layered network architectures during sequential tasks that require both acquisition and retention of information, we identify tradeoffs between learning and memory processes that arise from variations in underlying structure. During the task of supervised, sequential function approximation, networks produce and adapt representations of external information. Performance is evaluated by statistically analyzing the error in these representations while varying the initial network state, the structure of the external information, and the time given to learn the information.
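
As an illustrative sketch (not the dissertation's code), the following toy version of the sequential function-approximation task trains a small network first on one target function and then on a second, comparing retention of the first task against acquisition of the second. The architectures used here, a wide shallow network standing in for a more parallel design and a narrow deep network standing in for a more layered design, along with all sizes, learning rates, and target functions, are assumptions chosen for illustration rather than the dissertation's exact constructions.

    import numpy as np

    rng = np.random.default_rng(0)

    def init_net(sizes):
        # One (weights, bias) pair per layer; tanh hidden units, linear output.
        return [(rng.normal(0, 0.5, (m, n)), np.zeros(n))
                for m, n in zip(sizes[:-1], sizes[1:])]

    def forward(net, x):
        acts, h = [x], x
        for i, (W, b) in enumerate(net):
            z = h @ W + b
            h = z if i == len(net) - 1 else np.tanh(z)
            acts.append(h)
        return acts

    def train(net, x, y, lr=0.05, steps=2000):
        for _ in range(steps):
            acts = forward(net, x)
            delta = (acts[-1] - y) / len(x)        # gradient of half the MSE
            for i in range(len(net) - 1, -1, -1):
                W, b = net[i]
                gW, gb = acts[i].T @ delta, delta.sum(axis=0)
                if i > 0:                          # back through tanh layer below
                    delta = (delta @ W.T) * (1 - acts[i] ** 2)
                net[i] = (W - lr * gW, b - lr * gb)
        return net

    def mse(net, x, y):
        return float(np.mean((forward(net, x)[-1] - y) ** 2))

    x = rng.uniform(-1, 1, (200, 1))
    yA, yB = np.sin(3 * x), x ** 2                 # two target functions

    for label, sizes in [("wide, shallow ('parallel-like')", [1, 32, 1]),
                         ("narrow, deep ('layered-like')", [1, 6, 6, 6, 1])]:
        net = train(init_net(sizes), x, yA)        # acquire task A
        net = train(net, x, yB)                    # then acquire task B
        print(label, "retention(A) =", round(mse(net, x, yA), 3),
              "acquisition(B) =", round(mse(net, x, yB), 3))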

The structure of the underlying error landscape connects functional network performance to complexity in network architecture. We employ sloppy model analysis (Brown, 2003) of parallel and layered network landscape minima to isolate variations in the number, curvature, and eigenvector localization properties of local minima within different network landscapes. We find that these variations in landscape structure give rise to tradeoffs in performance; these include the ability of the network to maximize accuracy versus minimize inaccuracy and to produce specific versus generalizable representations of information. Parallel networks generate smooth error landscapes with deep, narrow minima, enabling them to find highly specific representations given sufficient time. While accurate, however, these representations are difficult to generalize. In contrast, layered networks generate rough error landscapes with a variety of local minima, allowing them to quickly find coarse representations. Although less accurate, these representations are easily adaptable.
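
A minimal sketch of the flavor of this analysis, under the assumption of a toy quadratic error landscape: the Hessian at a minimum is estimated by finite differences, and its eigenvalue spectrum (the curvatures of the minimum) and an inverse-participation-ratio measure of eigenvector localization are reported. The test function, finite-difference scheme, and localization measure are illustrative choices, not the dissertation's.

    import numpy as np

    def hessian(f, w, eps=1e-4):
        # Central finite differences of f around w.
        n = len(w)
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                wpp = w.copy(); wpp[i] += eps; wpp[j] += eps
                wpm = w.copy(); wpm[i] += eps; wpm[j] -= eps
                wmp = w.copy(); wmp[i] -= eps; wmp[j] += eps
                wmm = w.copy(); wmm[i] -= eps; wmm[j] -= eps
                H[i, j] = (f(wpp) - f(wpm) - f(wmp) + f(wmm)) / (4 * eps ** 2)
        return H

    # Toy error landscape: a rotated quadratic whose curvatures span six
    # decades, the signature spectrum of a "sloppy" model.
    rng = np.random.default_rng(1)
    scales = np.logspace(0, -6, 8)
    Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))
    error = lambda w: 0.5 * float((Q @ w) @ (scales * (Q @ w)))

    H = hessian(error, np.zeros(8))
    eigvals, eigvecs = np.linalg.eigh(H)
    print("curvatures:", eigvals)             # eigenvalues span many decades

    # Inverse participation ratio (IPR) of each eigenvector: values near 1
    # mean the mode is concentrated on a single parameter.
    print("localization (IPR):", (eigvecs ** 4).sum(axis=0))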

Importantly, the systematic analysis of performance in small neural network models provides insight into the performance of larger composite systems for which such statistical analyses would be intractable. Furthermore, the finding that variations in parallel versus layered network architectures give rise to measurable performance tradeoffs has implications for understanding the behavior of a wide variety of natural and artificial learning systems that share these structural features.

Given that the statistical analysis of computational network performance can inform large-scale models of neural systems, we similarly ask to what extent the data-driven study of the brain can in turn inform computational models of network function. Magnetic resonance imaging enables the noninvasive mapping of both anatomical white matter connectivity and dynamic patterns of neural activity in the human brain. We examine the relationship between the structural properties of white matter tracts (structural connectivity) and the functional properties of correlations in neural activity (functional connectivity) within 84 healthy human subjects both at rest and during the performance of attention- and memory-demanding tasks. We show that structural properties, including the length, number, and spatial location of white matter tracts, are predictive of and can be inferred from the strength of resting-state and task-based functional interaction between brain regions. Importantly, we show that these relationships are both representative of the entire set of subjects and consistently observed within individual subjects. The observed links between structural and functional pathways in the human brain provide insight into the development of large-scale neural architecture and the functional implications of disruptions to this architecture.
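
To make the structure-function mapping concrete, here is a hedged sketch on synthetic data (not the study's MRI measurements): the functional correlation of each region pair is predicted from two structural features of the connecting tracts, tract count and mean tract length, via an ordinary least-squares fit. The feature definitions, effect sizes, and noise levels are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    n_pairs = 30 * 29 // 2                    # region pairs in a 30-node network

    # Synthetic structural features per pair: tract count and mean tract length.
    count = rng.poisson(5, n_pairs).astype(float)
    length = rng.uniform(10, 120, n_pairs)

    # Synthetic functional connectivity: stronger for many short tracts, plus noise.
    fc = 0.08 * count - 0.004 * length + rng.normal(0, 0.1, n_pairs)

    # Ordinary least-squares fit of FC from structural features (with intercept).
    X = np.column_stack([np.ones(n_pairs), count, length])
    beta, *_ = np.linalg.lstsq(X, fc, rcond=None)
    fc_hat = X @ beta

    print("coefficients:", beta)
    print("prediction correlation r =",
          round(float(np.corrcoef(fc, fc_hat)[0, 1]), 2))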

This direct analysis of connectivity uncovers relationships between structural and functional interactions without invoking knowledge of the specific brain regions involved in these interactions. By incorporating this additional knowledge into our analysis, we identify structurally mediated interactions between putative task-related functional networks that both support and distinguish between cognitive states. We build upon previous studies that identified two resting-state functional networks, denoted the task-positive (TP) and task-negative (TN) networks, which are strongly anticorrelated at rest and comprise regions of the brain that routinely increase and decrease in activity, respectively, during attention processes; this overlap suggests that task-based function is encoded in resting-state activity. By identifying regions within our brain networks that have been implicated in the TP and TN networks, we investigate the structural mechanisms that support a functional overlap between resting-state and task-driven activity. We show that strong interactions within and between the TP and TN networks, as quantified by an increase in the relative number of anatomical connections that support strong functional correlations, distinguish resting-state, attention-state, and memory-state brain activity. We map differences in these interactions to a phase-like space in which brain states are characterized by the relative contributions of different network interactions. We probe the features of this phase space across subjects and find sets of ordered relationships between cognitive states. This order enables us to group individuals based on their similarity in phase space, and we link groupings with abnormal phase relationships to significant deviations in behavioral performance during attention and memory tasks. This suggests that further characterization of this phase space may help identify structural and functional signatures of altered cognitive states.
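
The within- and between-network interaction measure can be sketched as follows, again on synthetic data: for each class of region pairs (TP-TP, TN-TN, TP-TN), compute the fraction of anatomically connected pairs whose functional correlation magnitude exceeds a threshold, yielding coordinates for one brain state in a low-dimensional phase-like space. The region labels, connectivity matrices, and threshold are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 40
    tp = np.arange(n) < 20                    # first half: task-positive regions
    tn = ~tp                                  # second half: task-negative regions

    # Synthetic symmetric matrices: binary structural links, functional correlations.
    sc = np.triu(rng.random((n, n)) < 0.3, 1); sc = sc | sc.T
    fc = np.triu(rng.uniform(-1, 1, (n, n)), 1); fc = fc + fc.T

    def strong_fraction(mask_a, mask_b, thresh=0.5):
        # Fraction of anatomically connected (a, b) pairs with strong |FC|.
        pairs = np.outer(mask_a, mask_b) | np.outer(mask_b, mask_a)
        connected = sc & pairs & ~np.eye(n, dtype=bool)
        return float((np.abs(fc[connected]) > thresh).mean())

    coords = [strong_fraction(tp, tp),        # within TP
              strong_fraction(tn, tn),        # within TN
              strong_fraction(tp, tn)]        # between TP and TN
    print("phase-space coordinates (TP-TP, TN-TN, TP-TN):", coords)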

Together, these findings uncover robust links between structural architecture and functional activity in small-scale artificial network models and in large-scale human brain networks, links that jointly inform and constrain intermediate-level descriptions of structural and functional interactions. The development of integrative, multiscale descriptions of neural system architecture is crucial for understanding both the capabilities afforded and the constraints imposed by this architecture.

Physical Description:
1 online resource (261 pages)
Format:
Text
Collection(s):
UCSB electronic theses and dissertations
ARK:
ark:/48907/f39884zk
ISBN:
9781267767516
Catalog System Number:
990039147500203776
Rights:
In Copyright
Copyright Holder:
Ann Hermundstad
Access:
This item is restricted to on-campus access only. Please check our FAQs or contact UCSB Library staff if you need additional assistance.