
New Supercomputing Benchmark Planned

The benchmark used to test and rank supercomputer performance is outdated, prompting the creation of a new metric that will debut in November. Jack Dongarra, distinguished professor of computer science at the University of Tennessee and one of the compilers of the Top500 list of the world's fastest supercomputers, says the Linpack benchmark, used for the past 20 years and last revised in 2008, is no longer a useful measure of application performance. It measures the solution of dense systems of linear equations, while more complex calculations are now common. Because rankings depend on it, vendors build systems to score well on an outdated test rather than to run current applications well. The new test, the High Performance Conjugate Gradient (HPCG), uses the kinds of calculations found in contemporary applications, which demand high bandwidth and low latency and access data in irregular patterns. Dongarra developed the new test with Michael Heroux of Sandia National Laboratories at the request of the US Department of Energy, which was concerned about applying Linpack to exascale computer systems. HPCG will be adopted gradually and will initially run alongside Linpack. It will be introduced at the SC13 supercomputing conference in Denver this November, when the next Top500 list will also be released. Tianhe-2, developed by China's National University of Defense Technology, is currently the top-ranked system. (Computerworld)(ZDNet)(Inside HPC)("Toward a New Metric for Ranking High Performance Computing Systems," Sandia National Laboratories)
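
The heart of HPCG is the conjugate gradient iteration on a large sparse linear system, a kernel dominated by memory bandwidth and irregular data access rather than by dense floating-point arithmetic. The following is a minimal serial Python sketch of an unpreconditioned conjugate gradient solve; the actual benchmark runs a preconditioned, distributed version on a 3D problem, so this is only an illustration of the kernel's character, not the benchmark code.

    import numpy as np
    import scipy.sparse as sp

    def conjugate_gradient(A, b, tol=1e-8, max_iter=2000):
        """Solve A x = b for a symmetric positive-definite sparse matrix A."""
        x = np.zeros_like(b)
        r = b - A @ x            # initial residual
        p = r.copy()             # initial search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p                       # sparse mat-vec: the bandwidth-bound step
            alpha = rs_old / (p @ Ap)
            x = x + alpha * p
            r = r - alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p    # update search direction
            rs_old = rs_new
        return x

    # Toy example: 1D Poisson matrix (tridiagonal, sparse, SPD).
    n = 500
    A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)
    x = conjugate_gradient(A, b)
    print(np.linalg.norm(A @ x - b))  # residual norm, near the 1e-8 tolerance

Unlike Linpack's dense solve, nearly all the time here goes into the sparse matrix-vector product, whose scattered memory reads are what make bandwidth and latency, not peak flops, the bottleneck.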
 

Supercomputer Calculates Possible Isotope Combinations

Oak Ridge National Laboratory and University of Tennessee researchers used the Jaguar supercomputer to calculate how many isotopes the laws of physics allow. Using six nuclear-interaction models, they found about 7,000 possible combinations of protons and neutrons, of which scientists have observed or produced about 3,000. The rest, the researchers say, are created in massive stars or in violent stellar explosions. In their calculations, they quantified the so-called drip lines, the maximum numbers of neutrons and protons the laws of physics allow in a nucleus, which determine whether a nucleus can exist. The drip lines become uncertain among heavier elements. The calculations required about two hours of supercomputer processing time for each of the roughly 250,000 possible nuclear configurations examined. The researchers say they could not have done this work two or three years ago because they lacked access to such a powerful supercomputer. They expect the work to yield numerous scientific insights and, someday, benefits such as cancer treatments that irradiate malignant cells without damaging healthy ones. They published their research in the journal Nature. (EurekAlert)(Oak Ridge National Laboratory)
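
The drip-line idea can be illustrated with a deliberately simple model: a nucleus remains bound only as long as adding more neutrons still releases binding energy. The Python sketch below uses the textbook semi-empirical (liquid-drop) mass formula to estimate where the two-neutron separation energy turns negative for a given element; the study itself used six far more sophisticated nuclear-interaction models, so this is only a toy illustration of the concept, not the researchers' method.

    def binding_energy(Z, N):
        """Semi-empirical mass formula estimate of binding energy in MeV
        (standard textbook coefficients)."""
        A = Z + N
        B = (15.75 * A                                  # volume term
             - 17.8 * A ** (2.0 / 3.0)                  # surface term
             - 0.711 * Z * (Z - 1) / A ** (1.0 / 3.0)   # Coulomb term
             - 23.7 * (N - Z) ** 2 / A)                 # asymmetry term
        if Z % 2 == 0 and N % 2 == 0:                   # pairing term
            B += 11.18 / A ** 0.5
        elif Z % 2 == 1 and N % 2 == 1:
            B -= 11.18 / A ** 0.5
        return B

    def neutron_drip_line(Z, max_N=250):
        """Largest N for which the two-neutron separation energy
        S_2n = B(Z, N) - B(Z, N-2) is still positive (toy drip line)."""
        last_bound = Z
        for N in range(Z + 2, max_N):
            if binding_energy(Z, N) - binding_energy(Z, N - 2) > 0:
                last_bound = N
        return last_bound

    print(neutron_drip_line(50))  # crude neutron drip-line estimate for tin (Z = 50)

Using the two-neutron separation energy rather than the one-neutron value smooths out the even-odd pairing staggering, which is why drip lines are commonly quoted that way.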
 
