Daily Bulletin Archive

Feb. 28, 2012

CISL backs up the files in your personal /glade/home file space daily, but files in other GLADE directories you might be using are not backed up.
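Because only /glade/home is backed up, data in other GLADE spaces must be archived by the user. Below is a minimal sketch of bundling a work directory into a single archive suitable for copying to tape; every path and filename here is an illustrative placeholder, not an actual GLADE location.

```shell
# Create a sample directory standing in for an unbacked-up work space
# (paths are placeholders for illustration only).
mkdir -p /tmp/glade_demo/run_output
echo "model output" > /tmp/glade_demo/run_output/run1.log

# Bundle it into one compressed archive, suitable for copying to an
# archival system (e.g., HPSS via "hsi put run_output.tar.gz").
tar -czf /tmp/glade_demo/run_output.tar.gz -C /tmp/glade_demo run_output

# Verify the archive contents before relying on it.
tar -tzf /tmp/glade_demo/run_output.tar.gz
```

Listing the archive before deleting the originals is cheap insurance against a truncated or corrupt tar file.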

For details on our backup and purge policies, please review our newly revised GLADE File System documentation. Be sure to take a look at our new Transferring files documentation, too.

Let us know what you think. We look forward to hearing from you.

Feb. 24, 2012

HPSS archival jobs that were initially submitted to the archive and share queues, then diverted to the HOLD queue because of project overrun, have been failing due to incorrect queue redirection. The expected net effect of the change made on Monday, February 13, is that HPSS archive jobs will remain in the share or archive queue to which they were initially submitted, and users should see these jobs complete more quickly.
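For reference, here is a hypothetical sketch of an LSF batch script for an HPSS archive job submitted directly to the share queue. Only the queue names come from this bulletin; the resource limits, file paths, and job name are illustrative placeholders.

```shell
# Write a hypothetical LSF job script; "share" and "archive" are the
# queues named in the bulletin, everything else is a placeholder.
cat > archive_job.lsf <<'EOF'
#!/bin/bash
#BSUB -q share          # or "archive"; jobs now stay in the queue they were submitted to
#BSUB -n 1              # archive jobs are typically serial
#BSUB -W 2:00           # wall-clock limit (hh:mm)
#BSUB -J hpss_archive   # job name

# Copy a file into HPSS (paths are placeholders).
hsi put /glade/scratch/username/output.tar : output.tar
EOF

# The script would be submitted with: bsub < archive_job.lsf
cat archive_job.lsf
```

The `bsub` and `hsi` commands are only meaningful on a system with LSF and HPSS access; the sketch above simply writes the script so its directives can be inspected.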

Feb. 24, 2012


We are announcing the following opportunity for graduate students and postdoctoral researchers in the atmospheric sciences, applied mathematics, or related fields:

Dynamical Core Model Intercomparison Project (DCMIP):

DCMIP Summer School on Future-Generation Non-hydrostatic Weather and Climate Models

Dates: July 30 - August 10, 2012


The application period will run from mid-February to the end of March 2012.


This multidisciplinary two-week summer school and model intercomparison project will be held at the National Center for Atmospheric Research from July 30 to August 10, 2012. The summer school brings together graduate students, postdocs, atmospheric modelers, expert lecturers, and computer specialists to create a stimulating, unique, and hands-on learning environment. It will lead to an unprecedented student-run international model intercomparison project, thereby training the next generation of scientists engaged in global atmospheric model development. Special attention is paid to the role of emerging non-hydrostatic global atmospheric models.


In particular, we focus on atmospheric dynamical cores, which describe the fluid-flow component of general circulation models. The summer school and model intercomparison project will promote active learning, innovation, discovery, mentorship, and the integration of science and education. We anticipate testing about 10-12 dynamical cores that represent a broad spectrum of the modeling approaches in the international weather and climate modeling community.

Hosted by: Computational & Information Systems Laboratory, National Center for Atmospheric Research (NCAR), Boulder, CO

For more information please visit:

Feb. 24, 2012

Information on how to transfer files to and from our HPC, analysis, and storage systems is now gathered for easy reference in a new Transferring files documentation section.

In an ongoing effort to help you find the information you need more quickly, we also recently reorganized the documentation menu you will see on that page and many others.

Want to let us know what you think? Look for the "Feedback" link at the very bottom of that new menu.

We look forward to hearing from you.

Feb. 21, 2012

Consulting walk-in service will be closed next week, February 21 through February 24, so that consulting staff members can attend the Software Engineering workshop being held that week. Users may continue to file tickets through the usual helpdesk channels; those tickets will be answered as usual.

Feb. 20, 2012

The U.S. Global Change Research Program is soliciting proposals for the use of the Climate Simulation Laboratory (CSL) computing facilities, beginning August 1, 2012, and continuing through January 31, 2014. The DEADLINE for submitting proposals is February 20, 2012. If you are currently using the CSL and wish to continue using the facility after June 2012, please submit your renewal proposal and a progress report by the February 20, 2012, deadline.

The multi-agency CSL computing facility was established in 1995 and is dedicated to climate modeling in support of the U.S. Global Change Research Program. The CSL provides high-performance computing and data storage systems to support large-scale, long-running simulations of the Earth’s climate system.

This round of CSL allocations will have access to NCAR's forthcoming petascale Yellowstone system, a 1.55-PFLOPS IBM iDataPlex cluster, as well as its associated storage, data analysis, and visualization resources. For more information about Yellowstone, see the CISL web site (http://www2.cisl.ucar.edu/). Additional information about use of the CSL and updated submission instructions can be found at the CSL web site, http://www2.cisl.ucar.edu/csl/.

If you have questions about this opportunity, please contact David Hart, User Services Manager, at 303-497-1234, or email alloc@ucar.edu. Please share this information with any colleagues who might be interested. We look forward to hearing from you.

Feb. 10, 2012

Starting Thursday, February 2, Mirage nodes 3 and 4 will no longer be available for interactive use. The nodes will be placed under the control of the LSF scheduler so that CISL can test and configure data analysis and visualization workflows in preparation for the forthcoming Yellowstone environment at NWSC.

The remaining Mirage nodes (0, 1, 2, and 5) will continue to be available for interactive use. These four interactive nodes each have 128 GB of memory, maximizing the memory available to Mirage users. CISL is using the Mirage nodes with less memory (64 GB) for LSF testing. Please arrange your workflow accordingly.

Contact the Consulting Services Group via cislhelp@ucar.edu if you have comments or questions about this change, or if you are interested in access to the Mirage batch nodes.

Feb. 6, 2012

Consulting Services Group will offer its week-long HPC workshops February 6-10, 2012. Topics include UNIX, Fortran, MPI/OpenMP, GPU programming, NCL/IDL, and others.

Please go to the following link for more details and early registration.

Feb. 6, 2012

For researchers interested in or planning to apply for any of the upcoming allocation opportunities on NCAR's forthcoming Yellowstone environment, Dave Hart, NCAR's User Services manager, will host an online Q&A session on February 3, 2012, from 10 a.m. to 12 p.m. (Mountain). The session will include a brief description of the Yellowstone system, tips for writing successful allocation requests, and an opportunity to ask questions.

To call into the meeting, please use the following:

Audio Dial-In Information:
U.S. & Canada: 866.740.1260
Access Code: 4971234

To join the meeting online and see the slides:

The Yellowstone system, a 1.5-PFLOPS HPC system, is scheduled to enter production in late summer 2012, and the deadlines to apply for allocations are fast approaching. This session is an opportunity for potential requesters to ask questions about the system, eligibility requirements, and writing allocation requests.

Five opportunities are now accepting requests or will be soon. Details on all of the following opportunities and deadlines are available at https://www2.cisl.ucar.edu/docs/allocations.

* February 20 -- Climate Simulation Laboratory
* March 5 -- Accelerated Scientific Discovery (University and NCAR)
* March 5 -- NCAR Strategic Capability projects
* March 26 -- University (Large requests)
* March 26 -- Wyoming-NCAR Alliance

Jan. 26, 2012

The Texas Advanced Computing Center will present the training session,

Linux/Unix Basics

January 26, 2012 (Thursday)

9 a.m. - Noon (CT)

J.J. Pickle Research Campus

Research Office Complex (ROC 1.603)

10100 Burnet Rd.

Austin, TX 78758

This foundational class gives beginning and intermediate users basic experience with the Linux/Unix command-line environment. The lecture will emphasize common strategies for interacting with clusters and HPC resources. No prerequisites.

This class will be webcast.

If you would like to attend the course via webcast and you have an XSEDE User Portal login, please use the link below to register:


If you would like to participate via webcast, and do not have an XSEDE User Portal login, you will have an opportunity to create an account when you attempt to register at the above link.

Please submit any questions via the Consulting section of the XSEDE User Portal: https://portal.xsede.org/help-desk