The Mitchell Institute for Fundamental Physics and Astronomy at Texas A&M University uses the HPRC clusters for its large-scale computing. These provide computing capabilities for the Astronomy, CDMS, CMS, Pheno, and other physics groups.
A. Basic Steps
B. Other things you might want to do using the cluster
Once you are able to run jobs and have finished this ‘Getting Started Guide’, you may want to check the Links and Information for Users site, which gives a compact view of the information you may need in the future.
Getting the accounts you need
Overview
CDMS users often need accounts on a number of sites. Below we list each of the potential sites and why you would need it.
Note: You may not need all these accounts.
| Account | Use |
|---|---|
| Confluence | Provides access to Confluence, which hosts CDMS’ primary documentation |
| SLUO and SLAC Computing | Provide access to SLAC and SLUO computing accounts |
| HPRC | TAMU computing cluster, where you’ll do most of your work |
| GitLab | Provides access to CDMS code repositories |
| Globus | Big data transfers |
Confluence Account, and SLUO/SLAC Computing Account
Naturally, you need access to the CDMS documentation and communication media. For this purpose you must obtain a SLAC/SLUO (SLAC Users Organization) account.
The SLUO account is the administrative layer through which non-SLAC employees get SLAC accounts, and the SLAC account is also used for Confluence and JIRA access.
To apply for the accounts, follow the instructions on:
How to register as a SLAC user and get a Computer account
This link will cover:
- How to obtain your SLAC ID number (SLUO account)
- How to obtain your SLAC Computing Account
- How to send the forms to the CDMS computing coordinator, Concetta Cartaro (Tina), who will create the accounts
- How to get a Confluence Account (You need to request specific access to CDMS Confluence pages)
For the form you are directed to, you should know:
- Our funding is from the DOE
- We will not (generally) be physically working at SLAC, but we will be using SLAC computing resources.
- Under “Current Institution” on the second page of the application form, for “Department” put “Astronomy” (this could change)
- For “SLAC Group/Department,” our group is called “PAC Cryo Dark Matter Search”
- For the ‘Ending Date’ in ‘SLAC Information’ put in a date six years from now (assuming your goal is a PhD)
- For “PhD Thesis on SLAC research?” put ‘yes’ (though group members have put ‘no’ before).
- For “SLAC Spokesperson/Sponsor/Supervisor who will confirm your information” enter ‘Richard Partridge’.
If you only need to access Confluence or JIRA, you can create just a SLAC Crowd Account and skip creating a SLAC Computing Account. Note that you must still be a CDMS Collaboration member. You will also need to complete some training courses, using your ID number and a password that you must request from esh-training. Instructions for this will be provided in the account request process.
Finally, you need access to certain CDMS documentation websites which are private, restricted to CDMS users only. Contact your CDMS group leader to obtain access.
Examples of such sites are:
GitLab Account
If you will be doing almost anything for CDMS, you will need access to the CDMS GitLab, where we store the code.
First, you need to create an account in GitLab. Then follow the instructions here to request access to the SuperCDMS group.
You can use SSH keys to communicate with GitLab. Here is the link to the instructions.
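As a quick sketch, generating and displaying an SSH key pair looks like the following. The file name and email comment are placeholders; in practice you would usually keep the key under `~/.ssh/` and follow GitLab’s instructions for the exact steps.

```shell
# Generate an Ed25519 key pair with no passphrase (placeholder path and comment;
# for real use, put the key in ~/.ssh/ and consider setting a passphrase).
ssh-keygen -t ed25519 -C "you@tamu.edu" -f ./gitlab_key -N ""

# Print the public key; paste this into GitLab under Preferences -> SSH Keys.
cat ./gitlab_key.pub
```

After adding the public key in GitLab, you can verify connectivity with `ssh -T git@gitlab.com`.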
Getting started with the main software packages
Beyond the tools covered in our HPRC New Users and Getting Started Guide, there are some CDMS-specific tools you may need. We describe them below.
CDMS Software Packages
There are several tools developed by/for the CDMS experiment that you may use. The general workflow is to obtain the code from a Git repository, source its setup, and then run your own code.
- matCAP: for analyzing processed data in MATLAB.
- cdmsBats: for processing raw data or Monte Carlo pulses.
- cdmsTools: for analyzing processed data in MATLAB.
- Simulations: tools for simulating events and running simulations.
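The clone–source–run pattern described above can be sketched as follows. This uses a local dummy repository so it runs anywhere; for real work you would clone the relevant package (e.g. cdmsBats) from the CDMS GitLab, and the setup-script name here is an assumption, not the package’s real file.

```shell
# Create a stand-in for a cloned package (in real use: git clone <CDMS GitLab URL>).
git init -q demo_pkg && cd demo_pkg

# Stand-in for the package's environment-setup script (name is hypothetical).
printf 'export DEMO_PKG_READY=yes\n' > setup.sh

# Source the setup to configure your environment, then run your own code.
source setup.sh
echo "package ready: $DEMO_PKG_READY"
```

The same three steps (clone, source the setup, run) apply to each of the packages listed above.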
Running and Submitting Jobs
Terra and Grace both use SLURM as their batch system. See the following link for more information about the batch system:
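As a starting point, a minimal SLURM batch script might look like the following. The job name, resource requests, and payload command are placeholders; adjust them for your actual job on Terra or Grace.

```shell
#!/bin/bash
#SBATCH --job-name=cdms_example          # placeholder job name
#SBATCH --time=01:00:00                  # walltime; adjust for your job
#SBATCH --ntasks=1                       # number of tasks
#SBATCH --mem=4G                         # memory request
#SBATCH --output=cdms_example.%j.out     # stdout/stderr file (%j = job ID)

# Replace the line below with your actual processing or analysis command.
echo "Running on $(hostname)"
```

Submit the script with `sbatch myjob.slurm` and check its status with `squeue -u $USER`.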