
Cray Awarded $13 Million Supercomputing Contract

The PDC Center for High Performance Computing at the KTH Royal Institute of Technology in Stockholm, Sweden, has awarded Cray a $13 million contract to provide it with a next-generation Cray XC supercomputer system and related services later this year. The machine will be one of the first petascale supercomputers in Scandinavia. KTH plans to use the new system for simulations in areas such as fluid dynamics, climate modeling, neuroscience, and plasma physics, and will also make the supercomputer available to external researchers through the Swedish National Infrastructure for Computing and the Partnership for Advanced Computing in Europe. This is the latest contract of its type for Cray, which will also provide next-generation Cray XC machines to the Korea Meteorological Administration and the US National Energy Research Scientific Computing Center. (GeekWire)(MarketWatch)

National Science Foundation Suspends Researcher for Misusing Supercomputing Resources for Bitcoin Mining

The US National Science Foundation (NSF) has suspended a researcher for misusing two universities’ supercomputers to mine bitcoins. The NSF Office of the Inspector General’s Semiannual Report to Congress states that the researcher mined roughly $8,000 to $10,000 worth of bitcoins via remote access, consuming about $150,000 worth of computing time on government-funded equipment. The universities involved stated that the actions constituted unauthorized use of their networks. “Both university reports noted that the researcher accessed the computer systems remotely and may have taken steps to conceal his activities, including accessing one supercomputer through a mirror site in Europe,” noted the Inspector General’s report. The researcher claimed to have been conducting tests on the computers. The researcher is no longer allowed to use any NSF-funded supercomputing resources and, according to the Inspector General’s report, “In response to our recommendation, NSF suspended the researcher government-wide.” No additional details were released, including the names of the universities involved or the researcher’s identity. “With the price of a bitcoin currently almost £400 ($679), there is certainly a strong temptation to misuse access to a powerful computer for mining,” Kadhim Shubber, a journalist with the bitcoin news service CoinDesk, told the BBC. (SlashDot)(BBC)(CIO)(US National Science Foundation)
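
The report’s figures invite a quick check of the economics. A minimal sketch in Python, using only the dollar amounts quoted above (taking the midpoint of the mined range is an approximation):

    # Ratio of computing resources consumed to bitcoin value mined,
    # using the figures quoted in the Inspector General's report.
    resources_used_usd = 150_000                 # computing time consumed
    bitcoins_mined_usd = (8_000 + 10_000) / 2    # midpoint of the quoted range

    ratio = resources_used_usd / bitcoins_mined_usd
    print(f"~${ratio:.0f} of computing consumed per $1 of bitcoin mined")  # ~$17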

Supercomputer Models Aid Astronomers in Understanding Phenomena

Georgia Institute of Technology scientists are using several National Science Foundation-funded supercomputers to run simulations that help them better understand various astronomical phenomena. One of these is the tidal disruption, which occurs when a star’s orbit carries it too close to a black hole and the star is pulled apart and consumed. The tidal disruption produces a bright flare that changes over time. Modeling the dynamics of the forces involved should help scientists better understand tidal disruptions as well as the interactions between stars and black holes. Computer simulations let them examine the sequence of events from various perspectives and repeat the process as needed. The researchers are using computing resources at the Texas Advanced Computing Center and the National Institute for Computational Sciences as well as at their home institution. Their work has already reached the point where improved models are needed: the research is reportedly outpacing the scientists’ current theoretical understanding of tidal disruptions, ensuring that such modeling will continue to inform their knowledge of these phenomena. (SlashDot)(National Science Foundation)

Expert: Exaflops Supercomputing Is Unlikely in the Near Future

The much-discussed idea that supercomputing performance could soon reach exaflops (10^18 floating point operations per second) levels will not become reality before the end of the decade, according to Horst Simon, deputy director of the US Lawrence Berkeley National Laboratory. A combination of technical challenges stands in the way, including the total power such a system would draw, the chip power efficiency required, and the cost of data movement and memory. “I also think calling the system exa-anything is a bad idea. It’s become a bad brand, associated with buying big machines for a few national labs,” he told HPCWire. “It also sets the community up for a perceived failure if we don’t get to exaflops.” Measuring such a system’s performance once it is built also poses a challenge, he added, estimating that an exascale system would need five to six days to run the LINPACK benchmark. A reasonable goal toward exascale computing, Simon said, would be constructing an exascale system that could rank first on the TOP500 supercomputing-performance list by 2020. He says there are projects working in that direction, including the US Department of Energy’s FastForward program. Simon says the US needs exascale computing resources to maintain a competitive advantage in manufacturing as well as for national security. (SlashDot)(HPCWire)(Scientific Computing)(“No Exaflops for You,” Horst Simon)
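
Simon’s five-to-six-day LINPACK figure can be sanity-checked with back-of-the-envelope arithmetic. The Python sketch below assumes an illustrative 50 PB of aggregate memory and 70 percent sustained efficiency; neither number is a published specification. LINPACK (HPL) factors a dense matrix sized to fill memory, and at exascale that works out to a multi-day run:

    # Rough estimate of LINPACK (HPL) runtime on a hypothetical exascale
    # system. Memory size and efficiency are illustrative assumptions.
    peak_flops = 1e18        # 1 exaflops = 10^18 operations per second
    memory_bytes = 50e15     # assumed aggregate memory: 50 PB
    efficiency = 0.7         # assumed sustained fraction of peak

    # HPL solves a dense n x n double-precision (8-byte) linear system;
    # choose n so the matrix fills memory.
    n = (memory_bytes / 8) ** 0.5

    # HPL performs roughly (2/3) * n^3 floating point operations.
    flop_count = (2 / 3) * n ** 3

    runtime_days = flop_count / (efficiency * peak_flops) / 86_400
    print(f"n ~ {n:.2e}, runtime ~ {runtime_days:.1f} days")  # ~5.4 days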

Indiana University Unveils Supercomputer

The fastest supercomputer owned by a US academic institution is now online at Indiana University. The Cray-based Big Red II system uses both CPUs and GPUs and operates at a maximum of 1 petaflops. Academics will use it for work in the sciences, medicine, the humanities, and the fine arts, and Indiana firms needing help with tasks such as advanced product modeling will also be able to work with the machine. The computer has more than 21,000 CPU and GPU processing cores and will use a new high-speed, high-bandwidth disk-storage system. IU says it is an asset that should help attract and retain faculty, particularly those whose work requires advanced data-processing power. Officials say the computing power will, for example, let researchers complete a human-genome analysis—a task that typically takes six months—in eight days. Big Red II replaces the original Big Red, a 28-teraflops computer with 4,100 processing cores that became operational in 2006. (SlashDot)(Network World)(Indiana University)
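
The quoted numbers are easy to cross-check. A short Python sketch, approximating six months as 180 days (an assumption; the article gives no exact figure):

    # Quick arithmetic on the Big Red II figures quoted above.
    peak_flops = 1e15      # 1 petaflops peak
    cores = 21_000         # "more than 21,000 CPU and GPU processing cores"
    print(f"~{peak_flops / cores / 1e9:.0f} gigaflops per core on average")  # ~48

    # Genome analysis: six months (~180 days) down to eight days.
    print(f"~{180 / 8:.0f}x speedup")  # ~22x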

New Supercomputing Record Set

Scientists at the Stanford University-based Center for Turbulence Research, which the school operates jointly with NASA, set a new supercomputing record by using a million processing cores to model supersonic jet noise. They used the US Lawrence Livermore National Laboratory’s IBM Sequoia Blue Gene/Q system to solve the complex fluid-dynamics problem. Their work could help in the development of quieter aircraft engines and also proves that million-core simulations are possible. (EurekAlert)(Stanford University)
