Ssecrett and NeuroTrace: Interactive Visualization and Analysis Tools for Large-Scale Neuroscience Data Sets
May/June 2010 (vol. 30, no. 3)
pp. 58-70
Won-Ki Jeong, Harvard University
Johanna Beyer, King Abdullah University of Science and Technology
Markus Hadwiger, King Abdullah University of Science and Technology
Rusty Blue, Kitware
Charles Law, Kitware
Amelio Vázquez-Reina, Tufts University
R. Clay Reid, Harvard Medical School
Jeff Lichtman, Harvard University
Hanspeter Pfister, Harvard University
Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range from tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at interactive rates. To provide neuroscientists with flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system.

1. O. Sporns, G. Tononi, and R. Kötter, "The Human Connectome: A Structural Description of the Human Brain," PLoS Computational Biology, vol. 1, no. 4, 2005, doi:10.1371/journal.pcbi.0010042.
2. W.-K. Jeong et al., "Scalable and Interactive Segmentation and Visualization of Neural Processes in EM Datasets," IEEE Trans. Visualization and Computer Graphics, vol. 15, no. 6, 2009, pp. 1505–1514.
3. A. Vázquez-Reina, E. Miller, and H. Pfister, "Multiphase Geometric Couplings for the Segmentation of Neural Processes," Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR 09), IEEE CS Press, 2009, pp. 2020–2027.
4. J. Krüger and R. Westermann, "Acceleration Techniques for GPU-Based Volume Rendering," Proc. IEEE Visualization 03, IEEE CS Press, 2003, pp. 287–292.
5. D. Martin, C. Fowlkes, and J. Malik, "Learning to Detect Natural Image Boundaries Using Local Brightness, Color, and Texture Cues," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, no. 5, 2004, pp. 530–549.
1. J.C. Fiala, "Reconstruct: A Free Editor for Serial Section Microscopy," J. Microscopy, vol. 218, no. 1, 2005, pp. 52–61.
2. J. Lu, J.C. Fiala, and J.W. Lichtman, "Semi-Automated Reconstruction of Neural Processes from Large Numbers of Fluorescence Images," PLoS ONE, vol. 4, no. 5, 2009, article e5655.
3. E. Jurrus et al., "Axon Tracking in Serial Block-Face Scanning Electron Microscopy," Medical Image Analysis, vol. 13, no. 1, 2009, pp. 180–188.
4. J.H. Macke et al., "Contour-Propagation Algorithms for Semi-Automated Reconstruction of Neural Processes," J. Neuroscience Methods, vol. 167, no. 2, 2008, pp. 349–357.
5. E. LaMar, B. Hamann, and K. Joy, "Multiresolution Techniques for Interactive Texture-Based Volume Visualization," Proc. IEEE Visualization 99, ACM Press, 1999, pp. 355–362.
6. E. Gobbetti, F. Marton, and J.A. Iglesias Guitián, "A Single-Pass GPU Ray Casting Framework for Interactive Out-of-Core Rendering of Massive Volumetric Datasets," Visual Computer, vol. 24, nos. 7–9, 2008, pp. 797–806.
7. C. Müller, M. Strengert, and T. Ertl, "Optimized Volume Raycasting for Graphics-Hardware-Based Cluster Systems," Eurographics Symp. Parallel Graphics and Visualization (EGPGV 06), Eurographics Assoc., 2006, pp. 59–66.

Index Terms:
neuroscience, connectome, segmentation, volume rendering, implicit surface rendering, graphics hardware, computer graphics, graphics and multimedia
Citation:
Won-Ki Jeong, Johanna Beyer, Markus Hadwiger, Rusty Blue, Charles Law, Amelio Vázquez-Reina, R. Clay Reid, Jeff Lichtman, Hanspeter Pfister, "Ssecrett and NeuroTrace: Interactive Visualization and Analysis Tools for Large-Scale Neuroscience Data Sets," IEEE Computer Graphics and Applications, vol. 30, no. 3, pp. 58-70, May-June 2010, doi:10.1109/MCG.2010.56