Scheduled Maintenance for the Week of May 28 - 31

May 29, 2013

No Scheduled Downtime: Yellowstone, Geyser, Caldera, HPSS, GLADE, Lynx

Yellowstone ready and waiting for user jobs

April 25, 2013

For the past two days, Yellowstone utilization has been hovering around 50%. The system is ready and waiting for user jobs.

If you are encountering any issues getting started on Yellowstone or getting your jobs to complete, please contact cislhelp@ucar.edu.

Yellowstone has room for more jobs at night and over the weekends

April 12, 2013

While Yellowstone has regularly been averaging 90% utilization on weekdays, it has room to handle more jobs. Overnight and on weekends in particular, the system has been emptying its queues, with utilization dropping in the early morning hours and from Sunday into early Monday.

Users are encouraged to factor in this usage pattern and submit jobs late in the day or late on Friday; such jobs are likely to run overnight or over the weekend. These times can also be good opportunities to take advantage of the economy queue.
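For users preparing work to run during these off-peak periods, a batch script targeting the economy queue might look like the sketch below. This is illustrative only: the project code, resource requests, job name, and executable are placeholders, and the queue policies that apply to your allocation are described in the Yellowstone documentation.

```shell
#!/bin/bash
# Sketch of an LSF batch script aimed at the economy queue.
# The project code (-P), core count, wall time, and executable
# are placeholders -- substitute your own values.
#BSUB -P PROJECT0001           # project allocation code (placeholder)
#BSUB -q economy               # target the economy queue
#BSUB -n 64                    # number of MPI tasks
#BSUB -W 4:00                  # wall-clock limit (hh:mm)
#BSUB -J overnight_run         # job name
#BSUB -o overnight_run.%J.out  # stdout file (%J expands to the job ID)

# Launch the executable under LSF's MPI wrapper.
mpirun.lsf ./my_model
```

Submitting such a script late on Friday (`bsub < overnight_run.lsf`) leaves the job eligible to start as the queues drain over the weekend.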

Mesa Lab GLADE and Mirage cluster retired, new scratch space renamed

April 11, 2013

The Mirage cluster and the GLADE resource at the Mesa Lab were taken down and retired on April 1, as announced previously. Files that remained in the Mesa Lab GLADE scratch and project spaces were removed and are no longer available.

Files that were in /glade/home were backed up and are being retained off-line for six (6) months. Users who need to access those files should contact cislhelp@ucar.edu.

During the transition period, the new scratch space was identified as /glade/nwsc_scratch; it is now just /glade/scratch. If you still have “/glade/nwsc_scratch” in a script, you will need to change it to “/glade/scratch” to use the new space.
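One way to catch any scripts that still use the old path is a recursive search-and-replace. This sketch assumes GNU sed and should be run from the directory containing your scripts; review the matches before rewriting files in bulk.

```shell
# List files under the current directory that still reference the
# transition-period path, then rewrite the path in place.
grep -rl '/glade/nwsc_scratch' . | while read -r f; do
    sed -i 's|/glade/nwsc_scratch|/glade/scratch|g' "$f"
done
```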

For more information regarding the new GLADE file spaces for Yellowstone users, see http://www2.cisl.ucar.edu/resources/glade.

Scheduled Maintenance for the Week of April 22 - 26

April 25, 2013

No Scheduled Downtime: Yellowstone, Geyser, Caldera, HPSS, GLADE, Lynx

Two-day maintenance downtime on Yellowstone, March 18-19

March 21, 2013

Yellowstone, Geyser, and Caldera, along with the GLADE storage system, will be taken down starting Monday, March 18, at 7 a.m. (Mountain Time). The plan is to return the cluster to production work by Wednesday, March 20, at 8 a.m. CISL will begin draining the queues over the coming weekend to ensure that no running jobs are killed.

This maintenance work includes a number of software and firmware updates, but the major effort will be focused on replacing about 233 optical InfiniBand cables. We anticipate being able to return a healthier Yellowstone system to users after completing this work.

While GLADE is undergoing maintenance work, users will not be able to copy files from the old GLADE system via Mirage to the new GLADE.

We apologize for the short notice of this long downtime. We will provide updates as soon as possible if the timetable changes.

Scheduled Maintenance for the Week of March 25 - 29

March 28, 2013

No Scheduled Downtime: Yellowstone, Geyser, Caldera, HPSS, NWSC GLADE, DAV, ML GLADE, Lynx

ASD Experiences with Yellowstone: Seminar March 14, 2 - 3:30 p.m.

March 13, 2013

Please join us March 14, 2 p.m. - 3:30 p.m. in the NCAR Mesa Lab Main Seminar Room for presentations from three of the Accelerated Scientific Discovery (ASD) projects that have been among the first to put CISL's new 1.5-petaflops Yellowstone system through its paces.

Each speaker will discuss the team's experiences using Yellowstone, where the system performed well (or did not), and, where available, some early results from their runs.

The presentation will be webcast (URL tbd) for those unable to attend in person.

The following speakers will present during the session:

Gabriele Pfister, ACD -- Prediction of North American air quality

High-resolution simulations with the nested regional climate model with chemistry (NRCM-Chem) are being performed to study possible changes in weather and air quality over North America between the present day and two future periods: 2020-2030 and 2045-2055. These simulations target insights into expected future changes in air quality and will also be used for dynamical downscaling (of meteorology and air quality) of global climate simulations performed at NCAR in support of the 2013 IPCC AR5. The project is running NRCM-Chem over the conterminous U.S. at 12 km × 12 km resolution, with atmospheric chemistry included, to examine impacts on air quality and other regional climate processes.

Justin Small, CGD -- Meso- to planetary-scale processes in a global ultra-high-resolution climate model

In this project, the main computational objective is to perform and assess high-resolution Community Earth System Model (CESM) simulations in order (1) to investigate the climate response to the coupling of ocean and atmosphere mesoscale features, (2) to assess the ability of a high-resolution and frequently coupled (two-hour) ocean and atmosphere simulation to represent near-inertial waves in the ocean, and (3) to investigate the role of small-scale ice features such as polynyas in the climate system.

David Richter, MMM -- Turbulence modification in the spray-laden atmospheric marine boundary layer

This project is focused on the effect that sea spray, suspended by turbulence in the high-wind marine atmospheric boundary layer, has on both the turbulence itself as well as the transfer of momentum and heat to the ocean surface. To study this problem, a fundamental approach is taken where direct numerical simulation (DNS) coupled with Lagrangian particle-tracking is employed to focus generally on how a dispersed phase (such as sea spray) modifies turbulence. Systematic runs of turbulent Couette flow at multiple Reynolds numbers, each with various particle sizes, time scales, and concentrations, are being performed to identify the critical mechanisms of turbulence modification, as well as to determine the extent to which suspended particles can affect transport of heat and momentum.

Bluefire, Mirage/Storm files become read-only soon, Lynx home directory changing

March 4, 2013

The old GLADE file spaces mounted on Mirage/Storm with read/write access will be set to read-only mode on Monday, March 4, to prevent new data from being generated.

The file spaces will remain available read-only via the Mirage nodes through March 31. Users are responsible for moving files to new locations in the meantime.

The Storm cluster will be retired March 4.

Note these details regarding the various file spaces:

* /glade/home :: CISL will retain an off-line copy of the data in home directories for six months after March 31, during which period users will need to contact cislhelp@ucar.edu to access their files.

* /glade/scratch :: The current 30-day retention policy remains in effect. Files will continue to be removed 30 days after last access.

* /glade/user, /glade/proj2, /glade/proj3 :: Files in these spaces will be available until March 31. Users are responsible for migrating files they wish to retain.

On April 1, CISL will remove all remaining data in /glade/scratch, /glade/user, /glade/proj2, and /glade/proj3. The bfft.ucar.edu service for transferring files will be turned off at the same time.

For information about how to migrate files to the Yellowstone GLADE file spaces, please see http://www2.cisl.ucar.edu/resources/yellowstone/transition.

Lynx users: Your home directory for Lynx will change on March 4. You are responsible for copying necessary files from your GLADE home directory to your new Lynx home directory. 

If you have questions about the transition, the schedule, or the new systems at NWSC, please contact cislhelp@ucar.edu.

Yellowstone usage charges now reported in SAM

February 22, 2013

A reminder to all Yellowstone users: you can monitor your project activity and usage via the new SAM system at http://sam.ucar.edu/. As of Monday, February 11, jobs associated with overspent projects are rejected by the LSF scheduler on Yellowstone.

All Yellowstone jobs associated with each project are visible via the SAM interface, and all jobs since December 1, 2012, have been charged.

The SAM system is under active development. To date, we have focused on SAM features that ensure users have proper access to the Yellowstone environment. Current efforts include expanding the reporting capabilities available to users and NCAR divisions.

With the end of Bluefire, we have also decommissioned the legacy ACC8 accounting system, so those email reports are no longer being sent.

Questions about using SAM or requests for SAM reports and features can be sent to cislhelp@ucar.edu.
