Detection and skeletonization of single neurons and tracer injections using topological methods
Dingkang Wang, Lucas Magee, Bing‐Xing Huo, Samik Banerjee, Xu Li, Jaikishan Jayakumar, Meng Kuan Lin, Keerthi Ram, Suyi Wang, Yusu Wang, Partha P. Mitra
Abstract:
Neuroscientific data analysis has traditionally relied on linear algebra and stochastic process theory. However, the tree-like shapes of neurons cannot be described easily as points in a vector space (the subtraction of two neuronal shapes is not a meaningful operation), and methods from computational topology are better suited to their analysis. Here we introduce methods from Discrete Morse (DM) Theory to extract the tree-skeletons of individual neurons from volumetric brain image data, and to summarize collections of neurons labelled by tracer injections. Since individual neurons are topologically trees, it is sensible to summarize the collection of neurons using a consensus tree-shape that provides a richer information summary than the traditional regional ‘connectivity matrix’ approach. The conceptually elegant DM approach lacks hand-tuned parameters and captures global properties of the data, as opposed to previous approaches, which are inherently local. For individual skeletonization of sparsely labelled neurons we obtain substantial performance gains over state-of-the-art non-topological methods (over 10% improvements in precision and faster proofreading). The consensus-tree summary of tracer injections incorporates the regional connectivity matrix information, but in addition captures the collective collateral branching patterns of the set of neurons connected to the injection site, and provides a bridge between single-neuron morphology and tracer-injection data.
Keywords: Skeletonization; Tree (set theory); Social Connectedness
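The DM pipeline itself is involved, but the topological machinery it builds on — persistence, which separates globally significant features from noise — can be illustrated in one dimension. The sketch below is not the authors' implementation (the function name and union-find bookkeeping are ours); it computes 0-dimensional sublevel-set persistence of a 1-D intensity profile, pairing each local minimum (component birth) with the local maximum at which its component merges into an older one:

```python
def persistence_pairs(f):
    """0-dimensional sublevel-set persistence of a 1-D function.

    A component is born at a local minimum and dies when it merges
    with an older component at a local maximum; the pair
    (birth, death) measures the prominence of the feature.
    """
    n = len(f)
    order = sorted(range(n), key=lambda i: f[i])
    parent = {}          # union-find over already-processed indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    birth = {}           # component root -> birth value
    pairs = []
    for i in order:
        left = i - 1 if i - 1 in parent else None
        right = i + 1 if i + 1 in parent else None
        parent[i] = i
        if left is None and right is None:
            birth[i] = f[i]                 # new component (local minimum)
        elif left is not None and right is not None:
            rl, rr = find(left), find(right)
            # merging at a local maximum: the younger component dies
            old, young = (rl, rr) if birth[rl] <= birth[rr] else (rr, rl)
            pairs.append((birth[young], f[i]))
            parent[young] = old
            parent[i] = old
        else:
            parent[i] = find(left if left is not None else right)
    # the oldest component never dies
    for r in {find(i) for i in parent}:
        pairs.append((birth[r], float('inf')))
    return sorted(pairs)
```

Thresholding the resulting pairs by persistence (death minus birth) is the generic mechanism by which low-contrast noise is discarded while globally significant structure is kept.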
In the clinical practice of diagnosing and treating liver disease, how to effectively represent and analyze the vascular structure has long been a widely studied topic. In this paper, we propose a method for generating the three-dimensional skeletal graph of liver vessels using a 3D thinning algorithm and graph theory. First, the principal skeletonization methods are introduced and compared. Second, the 3D thinning-based skeletonization method, together with a hole-filling pre-processing step on the liver vessel image, is employed to form the liver skeleton. A graph-based technique is then applied to the skeleton image to efficiently construct the liver vascular graph. The thinning-based liver vessel skeletonization method was evaluated on liver vessel images against two other kinds of skeletonization approaches to demonstrate its effectiveness and efficiency.
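The graph-construction step described above can be sketched for a voxel skeleton. This is a hedged illustration, not the paper's algorithm: it assumes the skeleton is already given as a set of integer voxel coordinates, uses 26-adjacency to define edges, and classifies voxels by degree into endpoints and branch points:

```python
from itertools import product

def skeleton_to_graph(voxels):
    """Build a vascular graph from a set of skeleton voxels.

    Nodes are voxels; edges connect 26-adjacent voxels. Voxels of
    degree 1 are endpoints, voxels of degree >= 3 are branch points.
    """
    vox = set(voxels)
    offsets = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]
    adj = {v: [] for v in vox}
    for (x, y, z) in vox:
        for dx, dy, dz in offsets:
            w = (x + dx, y + dy, z + dz)
            if w in vox:
                adj[(x, y, z)].append(w)
    endpoints = [v for v, nb in adj.items() if len(nb) == 1]
    branches = [v for v, nb in adj.items() if len(nb) >= 3]
    return adj, endpoints, branches
```

On a Y-shaped skeleton this yields three endpoints and one branch point, which is the node/edge structure a vascular graph analysis would work with.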
Skeletonization as a tool for quantitative analysis of three-dimensional (3D) images is becoming more important as such images become more common in a number of application fields, especially in biomedical tomographic imaging at different scales. Here we propose a method that computes both surface and curve skeletons of 3D binary images. The distance transform algorithm is applied to reduce a 3D object to a 2D surface skeleton, and then to a 1D curve skeleton, in two phases. In surface skeletonization, 6-connectivity is used in the distance transform, while in curve skeletonization, 18-connectivity is used. Some examples are discussed to illustrate the algorithm.
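The distance-transform step underlying this method can be illustrated in 2D with the classic two-pass chamfer scheme. This is a simplification of the abstract's setting (which is 3D with 6- and 18-connectivity): the city-block metric and the function below are ours, but the two-sweep propagation is the standard mechanism:

```python
def distance_transform(img):
    """Two-pass city-block (4-connectivity) distance transform.

    img is a 2-D list of 0/1; returns, for each object pixel (1), its
    distance to the nearest background pixel (0). Ridges of this map
    are the candidate skeleton (medial axis) points.
    """
    h, w = len(img), len(img[0])
    INF = h + w                       # larger than any possible distance
    d = [[0 if img[y][x] == 0 else INF for x in range(w)] for y in range(h)]
    # forward pass: propagate distances from the top-left
    for y in range(h):
        for x in range(w):
            if d[y][x]:
                if y > 0:
                    d[y][x] = min(d[y][x], d[y - 1][x] + 1)
                if x > 0:
                    d[y][x] = min(d[y][x], d[y][x - 1] + 1)
    # backward pass: propagate distances from the bottom-right
    for y in reversed(range(h)):
        for x in reversed(range(w)):
            if y < h - 1:
                d[y][x] = min(d[y][x], d[y + 1][x] + 1)
            if x < w - 1:
                d[y][x] = min(d[y][x], d[y][x + 1] + 1)
    return d
```

Changing the neighbour sets visited in the two sweeps is exactly how different connectivities (e.g. the 6- and 18-connectivity of the abstract) are realized in 3D.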
Skeletonization is a crucial step in many digital image processing applications such as medical imaging, pattern recognition, and fingerprint classification. The skeleton expresses the structural connectivity of the main component of an object and is one pixel in width. The present paper covers the pixel deletion criteria needed in skeletonization algorithms to preserve the connectivity, topology, and sensitivity of binary images. The performance of different skeletonization algorithms can be measured in terms of parameters such as thinning rate, number of connected components, and execution time. The present paper focuses on thinning rate, number of connected components, and execution time for the Zhang and Suen algorithm and the Guo and Hall algorithm.
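The Zhang and Suen algorithm compared here is standard enough to sketch directly. Below is a minimal pure-Python version (variable names ours); the deletion conditions follow the published two-subiteration scheme, which removes border pixels only when the neighbour count and crossing number guarantee that connectivity is preserved:

```python
def zhang_suen(img):
    """Zhang-Suen thinning of a binary image (list of lists of 0/1).

    Two sub-iterations per pass delete border pixels whose removal
    preserves connectivity; repeat until no pixel changes.
    """
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]     # work on a copy

    def neighbours(y, x):
        # P2..P9, clockwise starting from the pixel above
        return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
                img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if img[y][x] == 0:
                        continue
                    P = neighbours(y, x)
                    B = sum(P)        # number of object neighbours
                    # A = number of 0 -> 1 transitions around the pixel
                    A = sum(P[i] == 0 and P[(i + 1) % 8] == 1 for i in range(8))
                    if step == 0:
                        cond = P[0]*P[2]*P[4] == 0 and P[2]*P[4]*P[6] == 0
                    else:
                        cond = P[0]*P[2]*P[6] == 0 and P[0]*P[4]*P[6] == 0
                    if 2 <= B <= 6 and A == 1 and cond:
                        to_delete.append((y, x))
            for y, x in to_delete:
                img[y][x] = 0
            changed = changed or bool(to_delete)
    return img
```

Counting the pixels deleted per pass and the wall-clock time of the loop gives exactly the thinning-rate and execution-time measures the paper compares.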
The aim of this paper is to investigate the main existing 3D skeletonization methods and their properties and particularities with respect to centerline extraction of segmented blood vessels in human brain MRI images. Skeletonization of 3D objects may be performed using different approaches; there are four main concepts of skeletonization algorithm that may be adopted in automatic analysis. In the article we use four algorithms, one representing each concept, and obtain results for both modeled and real vessels extracted from MRI volumes. We investigate the main properties of the obtained skeletons and formulate essential and preferable properties of 3D skeleton curves. Based on visual analysis, the most suitable method was selected and the choice motivated.
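Two of the commonly required properties of a 3D curve skeleton — forming a single 26-connected component, and being one voxel thin — can be checked mechanically. A minimal sketch follows; the thinness test via fully occupied 2×2×2 blocks is one common formalization and not necessarily the one used in the paper:

```python
from collections import deque
from itertools import product

def check_skeleton(voxels):
    """Check two required properties of a 3-D curve skeleton:
    (1) it is a single 26-connected component;
    (2) it is one voxel thin (no fully occupied 2x2x2 block).
    Returns (connected, thin) as booleans.
    """
    vox = set(voxels)
    # (1) connectivity: BFS over the 26-neighbourhood
    start = next(iter(vox))
    seen, queue = {start}, deque([start])
    offs = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]
    while queue:
        x, y, z = queue.popleft()
        for dx, dy, dz in offs:
            w = (x + dx, y + dy, z + dz)
            if w in vox and w not in seen:
                seen.add(w)
                queue.append(w)
    connected = len(seen) == len(vox)
    # (2) thinness: no 2x2x2 cube lies entirely inside the skeleton
    corners = list(product((0, 1), repeat=3))
    thin = not any(
        all((x + dx, y + dy, z + dz) in vox for dx, dy, dz in corners)
        for (x, y, z) in vox)
    return connected, thin
```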
The skeletonization method has been widely used in image analysis and other areas. Using a skeleton to represent an image preserves the topological structure while reducing redundant information. In the clinical practice of diagnosing and treating lung disease, how to effectively represent and analyze the vascular structure is very important for computer-aided diagnosis and surgery. In this paper, a vessel skeletonization method for lung CT images is proposed. First, vessels are segmented from the thoracic tissues by a region growing algorithm. Second, morphological operators are used to refine the vessel segmentation results. Finally, the skeleton of the blood vessels is obtained by a three-dimensional thinning algorithm. Experimental results show that the proposed method can accurately and efficiently extract vessel skeletons from lung CT images.
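The first step of this pipeline, region growing, can be sketched in 2D. This is an illustrative simplification (4-connectivity and a fixed intensity interval; the abstract does not specify the paper's exact growth criterion): starting from a seed pixel, the region absorbs every connected pixel whose intensity falls inside the accepted range.

```python
from collections import deque

def region_grow(img, seed, low, high):
    """Segment by region growing: starting from `seed`, absorb
    4-connected pixels whose intensity lies in [low, high].
    img is a 2-D list of intensities; returns the set of (y, x)
    coordinates in the grown region.
    """
    h, w = len(img), len(img[0])
    region = {seed}
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in region
                    and low <= img[ny][nx] <= high):
                region.add((ny, nx))
                queue.append((ny, nx))
    return region
```

In the CT setting the seed would be placed in a vessel and the interval chosen from the vessel intensity range; the grown region is then cleaned with morphological operators before thinning, as the abstract describes.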
Skeleton representation of an object is believed to be a powerful representation that captures both boundary and region information of the object. The skeleton of a shape is a representation composed of idealized thin lines that preserve the connectivity or topology of the original shape. Although the literature contains a large number of skeletonization algorithms, many open problems remain. A new skeletonization approach that relies on the electrostatic field theory (EFT) is proposed. Many problems associated with existing skeletonization algorithms are solved using the proposed approach. In particular, connectivity, thinness, and other desirable features of a skeleton are guaranteed. Furthermore, the electrostatic field-based approach captures notions of corner detection, multiple scale, thinning, and skeletonization all within one unified framework. Experimental results are very encouraging and are used to illustrate the potential of the proposed approach.
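The core idea of the EFT approach — treating boundary pixels as point charges and locating skeleton points where the resulting field is weak — can be sketched as follows. This assumes an inverse-square force law and is our illustration, not the paper's formulation; for a symmetric shape the contributions cancel on the medial axis:

```python
def field_magnitude(boundary, point):
    """Magnitude of the 2-D 'electrostatic' field at `point`, treating
    each boundary pixel as a unit charge with a 1/r^2 force law.
    `point` must not coincide with a boundary pixel (r would be 0).
    """
    ex = ey = 0.0
    px, py = point
    for bx, by in boundary:
        dx, dy = px - bx, py - by
        r2 = dx * dx + dy * dy
        r = r2 ** 0.5
        ex += dx / (r2 * r)   # (1/r^2) scaled unit vector, x component
        ey += dy / (r2 * r)   # (1/r^2) scaled unit vector, y component
    return (ex * ex + ey * ey) ** 0.5
```

Scanning interior pixels for local minima of this magnitude yields candidate skeleton points; at the centre of a symmetric boundary the field vanishes exactly, which is why the resulting skeleton sits midway between boundaries.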
Skeletonization is a widely used tool for running simplified simulations of water distribution systems without losing accuracy of model results. The level of skeletonization depends strongly on the model purpose and is rarely investigated at a general level. The aim of this paper is to highlight the influence of different network properties on the level of skeletonization and to investigate a general relationship between network properties and level of skeletonization, taking into account different model purposes. To this end, 300 virtual water distribution systems with varying network properties were generated and then skeletonized to different levels, allowing a generic analysis. Simulation results from the skeletonized models were compared to those of the original models using the Nash-Sutcliffe coefficient and a percentage criterion. Results indicate that, for example, network size influences the accuracy of results of skeletonized models in terms of water quality.
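The Nash-Sutcliffe coefficient used for this comparison is straightforward to compute. A minimal sketch (function name ours): it scores a skeletonized model's time series against the original model's, with 1 meaning a perfect match and 0 meaning no better than the mean of the reference series.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency of `simulated` against `observed`.

    NSE = 1 - sum((o - s)^2) / sum((o - mean(o))^2.
    1.0 is a perfect match; 0.0 is no better than predicting the mean.
    """
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den
```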
The problem of 2-D skeletonization is discussed in this paper. There are different categories of skeletonization methods: one is based on symmetric-axis analysis, and the others are based on thinning and shape decomposition. The basic ideas behind and developments of these skeletonization methods are investigated. The main goal of this article is to provide a systematic and clear reference for researchers in pattern recognition, visualization, medical image processing, and related fields.