Use SAM to monitor projects, set Yellowstone preferences

February 1, 2013

CISL has deployed the new System Accounting Manager (SAM) application for managing allocations, user preferences, and accounting for use of resources in the Yellowstone environment.

You can visit SAM at http://sam.ucar.edu/ and log in with your UCAR token or password.

Select User -> Preferences to view and set your primary group across CISL resources, your default project for HPSS, and your preferred login shell. These changes typically propagate to the systems the next morning.

Select Reports -> Account Statement to view project information, allocations, and resource activity, including job-level details. Yellowstone and Janus use is reported daily; weekly HPSS usage will begin appearing soon.

SAM is a work in progress. CISL is actively improving the user interface and expanding the reporting features, including 30- and 90-day reporting. If you encounter errors or have feedback or feature requests, please send them to cislhelp@ucar.edu.

REMINDER: Bluefire /ptmp read-only as of January 14

January 25, 2013

After a short outage on Monday morning, Bluefire will be returned to service with the /ptmp file system in read-only mode.

Users should use the rest of January to migrate any files they wish to keep from /ptmp to another location. When Bluefire is shut down at the end of the month, the /ptmp file system will no longer be available.

Please see the following item for more details of the Bluefire transition.

Scheduled Maintenance for the Week of February 25 - March 1

February 28, 2013

HPSS downtime Wednesday, February 27, from 7:00 a.m. to 9:00 a.m.

No Scheduled Downtime: Yellowstone, Geyser_Caldera, NWSC GLADE, DAV, ML GLADE, Lynx

Bluefire, Mirage/Storm, and Mesa Lab GLADE decommissioning schedule

January 25, 2013

With the Yellowstone system now in full production and available to all users, users of Bluefire, Mirage/Storm, and the GLADE disk system at the Mesa Lab should keep the following decommissioning schedule in mind and plan their work accordingly.

Please see http://www2.cisl.ucar.edu/resources/yellowstone/transition for information about making the switch from Bluefire and the various ways to migrate files to the Yellowstone GLADE file spaces.

Please contact cislhelp@ucar.edu if you have questions about the transition, the schedule, or the new systems at NWSC.

BLUEFIRE

Bluefire's last day of service will be January 31, 2013. Users should plan their work accordingly. To help with the switch, CISL has provided Yellowstone transition documentation at https://www2.cisl.ucar.edu/resources/yellowstone/transition.

IMPORTANT: Bluefire's /ptmp file system will only be available through January 31, 2013, when Bluefire is shut down. Users with data in /ptmp must move their files before the end of January. The /ptmp file system will be made read-only on January 14 to prevent new data from being stored there.
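For example, one way to preserve a /ptmp directory is to bundle it into an HPSS archive with htar before the shutdown. The sketch below assumes htar and hsi are available in your Bluefire environment and that you have an HPSS allocation; the directory name and HPSS destination path are illustrative only.

    # Bundle a /ptmp directory into a tar archive stored directly in HPSS
    # (project_dir and the HPSS path are placeholders -- adjust to your own)
    cd /ptmp/$USER
    htar -cvf /home/$USER/ptmp_project.tar project_dir

    # Confirm the archive arrived in HPSS
    hsi ls -l /home/$USER/ptmp_project.tar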

MIRAGE & STORM

The Mirage and Storm data analysis and visualization clusters will be retired on February 28, 2013. Users can wrap up their final analysis tasks on the clusters through February, and the nodes will then remain available through March for migrating data to the Yellowstone environment.

GLADE FILE SPACES at Mesa Lab

The GLADE file spaces mounted on Bluefire, Lynx, and Mirage/Storm will be accessible until March 31, 2013. Users are responsible for ensuring that files are migrated to new locations before the end of this period.

During February 2013, Bluefire's GLADE file systems will be mounted on the Mirage nodes with read/write access to permit users to complete some final analysis tasks. On March 1, 2013, the file systems will be set to read-only mode, to prevent new data from being generated. The Mirage nodes will be available to allow users to migrate data.

* /glade/home :: Home directories will be available until March 31, 2013. After that point, CISL will retain an off-line copy of the data for six months, during which period users will need to contact cislhelp@ucar.edu to access their files.

* /glade/scratch :: The current 30-day retention policy will remain in effect. Files will continue to be removed 30 days after last access.

* /glade/user, /glade/proj2, /glade/proj3 :: Files in these spaces will be available until March 31, 2013. Users are responsible for migrating files they wish to retain.

On April 1, CISL will remove all remaining data in /glade/scratch, /glade/user, /glade/proj2, and /glade/proj3. The bfft.ucar.edu service for transferring files will be turned off at the same time.
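To see which of your files are affected before these deadlines, a quick sweep with find can help. This is a minimal sketch, assuming your directories follow the /glade/<space>/$USER convention; adjust the paths to match your own holdings.

    # Files in scratch not accessed for 23+ days (i.e., within a week of
    # the 30-day purge) -- candidates to migrate or let expire
    find /glade/scratch/$USER -type f -atime +23

    # Build an inventory of /glade/user holdings to review for migration
    find /glade/user/$USER -type f > files_to_review.txt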

Research Data Archive migrating to new GLADE

December 18, 2012

The Research Data Archive (RDA) is being copied to the new GLADE system in a process that will take several days. Until that process is complete, please continue to use Bluefire or Mirage to get the data you need from /glade/data02/dsszone, as documented on the Research Data Archive page.

You can then copy (cp) the files you need to your new GLADE directory for use in the Yellowstone environment.
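For instance, a copy might look like the following sketch, where the dataset directory dsNNN.N and the file name are placeholders for an actual RDA dataset:

    # Copy an RDA file from the old GLADE to your own new GLADE space
    # (dsNNN.N and somefile.grb are placeholders)
    cp /glade/data02/dsszone/dsNNN.N/somefile.grb /glade/scratch/$USER/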

We will make another announcement when the RDA is publicly accessible directly from the new GLADE system.

Bluefire decommissioning now set for January 31, 2013

December 18, 2012

Based on Yellowstone's successes to date, Bluefire is now scheduled to be decommissioned on January 31, 2013. Users should plan their work accordingly.

CISL and IBM continue to make progress on Yellowstone, and the agreement with IBM is to provide maintenance for Bluefire for six weeks after Yellowstone is declared production-ready.

Most Bluefire users should now have access to Yellowstone and are strongly encouraged to begin ramping up on that system, while they focus on bringing their work on Bluefire to a meaningful stopping point.

Scheduled Maintenance for the Week of December 24 - December 28

December 27, 2012

No Scheduled Downtime: Yellowstone, Geyser_Caldera, NWSC GLADE, HPSS, Bluefire, DAV, ML GLADE, Firefly, Lynx

Scheduled Maintenance for the Week of October 22 - October 26

October 29, 2012

Yellowstone, NWSC GLADE, Geyser/Caldera: Downtime extended through Friday p.m. (estimated)

No Scheduled Downtime: Bluefire, HPSS, ML GLADE, Firefly, Lynx, DAV

Reminder - XSEDE/TACC Training - HPC Python Tutorial October 15

October 15, 2012

HPC Python Tutorial
 
October 15, 2012
9 a.m. to 4 p.m. (CT)
Texas Advanced Computing Center
J.J. Pickle Research Campus
ROC 1.900
10100 Burnet Rd.
Austin, TX 78758

This class will be webcast.

Registration will close on Friday, October 12, at 1 p.m. (CT).

As HPC widens its vision to include big data and non-traditional applications, it must also embrace languages that are easier for the novice, more robust for general computing, and more productive for the expert. One candidate language is Python. Python is a versatile language with tools for tasks as diverse as visualizing large amounts of data, creating innovative user interfaces, and running large distributed jobs. Unfortunately, Python has a reputation for poor performance. In this tutorial, we give users practical experience using Python for scientific computing tasks. Topics include array computing with NumPy, interactive development with IPython, low-level C linking with Cython, distributed computing with MPI, and performance issues.

The tutorial will feature guest speaker Dr. Travis Oliphant, the author of NumPy and SciPy.  Dr. Oliphant will discuss the use of array computing in Python and his latest creation, Numba, a just-in-time compiler for NumPy.

Recommended prerequisites:

Basic programming knowledge with Python. A good tutorial is available online here:

http://docs.python.org/tutorial/

Remote attendees can install the library versions used in the tutorial (all are included in Anaconda Pro [0] and in the Enthought Python Distribution [1], which does not include mpi4py):

Python 2.7
numpy 1.6
scipy 0.10
IPython 0.12
cython 0.15
mpi4py 1.2.2

[0] https://store.continuum.io/cshop/anaconda
[1] http://enthought.com/products/epd.php
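If you would rather install the packages individually instead of using one of the distributions above, pip can pin versions close to those listed. This is a sketch only: exact patch releases may differ, and mpi4py needs an MPI implementation and compilers present in order to build.

    # Pin versions to match the list above (patch releases may vary);
    # mpi4py requires an MPI library such as OpenMPI or MPICH to build
    pip install "numpy==1.6.*" "scipy==0.10.*" "ipython==0.12.*" "cython==0.15.*"
    pip install mpi4py==1.2.2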

Staff support for remote users will be limited; however, the lecturers will field questions.

Registration

Please submit any questions that you may have via the TACC Consulting System.
http://portal.tacc.utexas.edu/consulting/
