<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>HPCC – Overview</title><link>https://hpcc.ucr.edu/about/overview/</link><description>Recent content in Overview on HPCC</description><generator>Hugo -- gohugo.io</generator><atom:link href="https://hpcc.ucr.edu/about/overview/index.xml" rel="self" type="application/rss+xml"/><item><title>About: Welcome to the HPC Center (HPCC)</title><link>https://hpcc.ucr.edu/about/overview/introduction/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/introduction/</guid><description>
&lt;p>&lt;img align="right" title="hpclogo" src="../../img/background_small.jpg">&lt;/p>
&lt;h2 id="overview">Overview&lt;/h2>
&lt;p>The High-Performance Computing Center (HPCC) provides state-of-the-art research
computing infrastructure and training accessible to all UCR researchers and
affiliates at low cost. Currently, it supports over 150 research groups with
more than 800 active users. Its resources are also heavily used for instructing
undergraduate and graduate classes in a wide range of computational,
statistical, life science and engineering disciplines.&lt;/p>
&lt;div class="alert alert-primary" role="alert">
&lt;h4 class="alert-heading">News&lt;/h4>
This year the HPCC was awarded an NSF MRI equipment grant (#2215705) for the acquisition of a Big Data HPC Cluster in the total amount of $942,829. For details see &lt;a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=2215705&amp;HistoricalAwards=false">here&lt;/a>.
&lt;/div>
&lt;h3 id="quick-start">Quick start&lt;/h3>
&lt;p>The following lists the most frequently visited pages of the HPCC site. They can also be accessed via the navigation system outlined below.&lt;/p>
&lt;h4 id="navigating-and-searching-this-site">Navigating and searching this site&lt;/h4>
&lt;ul>
&lt;li>&lt;strong>Top menu&lt;/strong> located in the bar at the top of each page provides links to the main content categories&lt;/li>
&lt;li>&lt;strong>Section menu&lt;/strong> to the left links to the subpages of each main category&lt;/li>
&lt;li>&lt;strong>Table of contents&lt;/strong> to the right links to sections within each page&lt;/li>
&lt;/ul>
&lt;h4 id="gain-access">Gain access&lt;/h4>
&lt;ul>
&lt;li>&lt;a href="https://hpcc.ucr.edu/about/overview/access/">User account creation&lt;/a> for accessing HPCC&amp;rsquo;s infrastructure&lt;/li>
&lt;li>&lt;a href="https://hpcc.ucr.edu/about/overview/rates/">Latest recharging rates&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://hpcc.ucr.edu/manuals/access/login/">Log in instructions&lt;/a>&lt;/li>
&lt;/ul>
&lt;h4 id="hpc-usage">HPC usage&lt;/h4>
&lt;ul>
&lt;li>&lt;a href="https://hpcc.ucr.edu/manuals/">Usage instructions&lt;/a> are provided in the manual section&lt;/li>
&lt;li>To efficiently navigate the Manuals pages, use the &lt;a href="https://raw.githubusercontent.com/ucr-hpcc/ucr-hpcc.github.io/master/static/img/Manual_Navigation.png">Manual dropdown&lt;/a> in the menu bar at the top of this site.&lt;/li>
&lt;li>&lt;a href="https://hpcc.ucr.edu/events/events/">Event schedule&lt;/a> for workshops and user meetings&lt;/li>
&lt;/ul>
&lt;h4 id="infrastructure-description">Infrastructure description&lt;/h4>
&lt;ul>
&lt;li>&lt;a href="https://hpcc.ucr.edu/about/hardware/overview/">Infrastructure description&lt;/a> of HPCC&amp;rsquo;s clusters and parallel storage systems&lt;/li>
&lt;li>&lt;a href="https://goo.gl/43eOwQ">Facility description document&lt;/a> for grant applications and related purposes&lt;/li>
&lt;/ul>
&lt;h4 id="help-and-contacts">Help and contacts&lt;/h4>
&lt;ul>
&lt;li>For support, please consider asking on &lt;a href="https://community.hpcc.ucr.edu/">our Forums&lt;/a>.&lt;/li>
&lt;li>For requesting user accounts, password help, or software requests, please email &lt;a href="mailto:support@hpcc.ucr.edu">support@hpcc.ucr.edu&lt;/a>.&lt;/li>
&lt;li>You can also join our Slack at &lt;a href="https://ucr-hpcc.slack.com/">ucr-hpcc&lt;/a>.&lt;/li>
&lt;li>&lt;a href="https://hpcc.ucr.edu/about/overview/people/">Contact information&lt;/a>&lt;/li>
&lt;/ul></description></item><item><title>About: Access</title><link>https://hpcc.ucr.edu/about/overview/access/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/access/</guid><description>
&lt;h2 id="user-account-requests">User account requests&lt;/h2>
&lt;p>To create new user or lab accounts, please follow these instructions:&lt;/p>
&lt;ul>
&lt;li>Please email user account requests to &lt;a href="mailto:support@hpcc.ucr.edu">support@hpcc.ucr.edu&lt;/a>. Include the full name, NetID and email address of both the user(s) and the PI. Users need to be members of the PI&amp;rsquo;s group. Preferably, user account requests should come directly from the corresponding PI. If the request comes from a new user, the PI needs to be CC&amp;rsquo;ed in the email exchange.&lt;/li>
&lt;li>If a PI&amp;rsquo;s lab is not registered yet, please provide in the same email the COA (formerly FAU) that will be used to pay the annual subscription fee, the email of your financial advisor, and optionally any additional data storage needs (see &lt;a href="https://hpcc.ucr.edu/about/overview/access/#recharging-rates">here&lt;/a>). If additional storage is needed, mention how much and the COA to be used for the additional recharge.&lt;/li>
&lt;/ul>
&lt;p>After receiving the access information for a new account, users should follow the login instructions &lt;a href="../../manuals/login">here&lt;/a>.&lt;/p>
&lt;h2 id="recharging-rates">Recharging rates&lt;/h2>
&lt;p>HPCC&amp;rsquo;s recharging rate structure is outlined below. A more formal summary is available in the most recent &lt;em>Recharging Rate PDF&lt;/em> &lt;a href="../../about/facility/rates">here&lt;/a>.&lt;/p>
&lt;h2 id="pi-based-registration-fee">PI-based Registration Fee&lt;/h2>
&lt;p>An annual registration fee of $1,000 gives all members of a UCR lab access to our high-performance computing infrastructure.
The registration provides access to the following resources:&lt;/p>
&lt;ul>
&lt;li>Over 16,000 CPU cores (60% AMD and 40% Intel), ~230,000 CUDA cores (Nvidia A100, P100 and K80 GPUs), ~5PB of parallel GPFS-based disk space, 512GB-1TB of memory per node, etc. More details are available on the &lt;a href="https://hpcc.ucr.edu/about/hardware/">hardware pages&lt;/a>.&lt;/li>
&lt;li>Over 1000 software packages and community databases. Details are available on the software page.&lt;/li>
&lt;li>Free attendance of workshops offered by HPCC staff&lt;/li>
&lt;li>Free consultation services (up to 1 hour per month)&lt;/li>
&lt;li>Note: there is no extra charge for CPU usage, but each user and each lab has a CPU quota of 384 and 768 CPU cores, respectively. Computing jobs exceeding these quotas can be submitted but will stay in a queued state until resources within the quota limits become available (see the example batch script after this list).&lt;/li>
&lt;/ul>
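&lt;p>For orientation, a minimal Slurm batch script that stays within these quotas might look like the sketch below. The partition name, resource amounts, module and program names are placeholders chosen for illustration, not HPCC-specific settings; please consult the manuals for the partitions and limits that apply to your account.&lt;/p>
&lt;pre>&lt;code>#!/bin/bash
#SBATCH --job-name=quota_demo
#SBATCH --partition=batch        # assumption: replace with a partition you can submit to
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=32       # stays well below the 384-core per-user quota
#SBATCH --mem=64G
#SBATCH --time=02:00:00

# Load any software modules the job needs, then run the workload.
module load my_software          # hypothetical module name
srun my_program                  # hypothetical executable
&lt;/code>&lt;/pre>
&lt;p>Such a script is submitted with &lt;code>sbatch jobscript.sh&lt;/code> and monitored with &lt;code>squeue -u $USER&lt;/code>; jobs that would push a user or lab above its quota are not rejected but simply wait in the queue until resources within the limits free up.&lt;/p>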
&lt;h2 id="big-data-storage">Big data storage&lt;/h2>
&lt;p>For data storage the HPCC uses a central parallel GPFS storage system that
scales to many thousands of TBs. This high-availability storage system is
directly attached (mounted) to all its CPU and GPU nodes, meaning users can
immediately process their data with high-performance computing hardware without
moving them from one location (&lt;em>e.g.&lt;/em> a data archival system) to another.&lt;/p>
&lt;ul>
&lt;li>
&lt;p>Rented big data storage&lt;/p>
&lt;ul>
&lt;li>
&lt;p>Standard user accounts have a storage quota of 20 GB. To gain access to much larger storage pools, PIs have the option to rent or own storage space.&lt;/p>
&lt;/li>
&lt;li>
&lt;p>Storage rental option&lt;/p>
&lt;ul>
&lt;li>$1000 per 10TB of usable and backed up storage space per year. Smaller units than 10TB are also available (&lt;em>e.g.&lt;/em> 100GB units). For details see &lt;a href="https://hpcc.ucr.edu/about/overview/rates/">here&lt;/a>. In comparison, the maintenance cost for the same amount of owned storage is $260 per year (see below).&lt;/li>
&lt;li>Since the HPCC backs up all user data and uses snapshotting as an additional data security measure, 10TB of usable backed up space is the equivalent of almost 30TB of raw disk space. Thus, the cost for rented storage is $33.33 for 1TB/yr raw disk space.&lt;/li>
&lt;li>User account and big data backups are performed monthly and stored long-term, i.e. for as long as users maintain their storage subscriptions and/or their owned hard drives are not older than seven years. To prevent the costly accumulation of unwanted data, any data deleted by users in their user account or bigdata directory will also be removed from the backup system. To undo recent unwanted changes, previous snapshots can be used, provided the data existed before the snapshot was taken. This allows retrieval of recently deleted files (see the sketch after this list). More info on snapshots can be found &lt;a href="https://hpcc.ucr.edu/manuals/hpc_cluster/storage/#automatic-backups-and-snapshots">here&lt;/a>.&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>
&lt;p>The rented storage pool can be shared among all user accounts of a registered lab.&lt;/p>
&lt;/li>
&lt;/ul>
&lt;/li>
&lt;/ul>
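&lt;p>As a rough illustration of how a recently deleted file might be recovered from a snapshot, the shell sketch below assumes a GPFS-style &lt;code>.snapshots&lt;/code> directory inside the affected directory; the actual paths and snapshot names on HPCC&amp;rsquo;s storage may differ, so treat this as a sketch and follow the storage manual linked above for the authoritative procedure.&lt;/p>
&lt;pre>&lt;code># List available snapshots (the .snapshots location is an assumption; see the storage manual).
ls ~/.snapshots/

# Browse one snapshot (SNAPSHOT_NAME is a placeholder) and copy the deleted file back.
ls ~/.snapshots/SNAPSHOT_NAME/
cp ~/.snapshots/SNAPSHOT_NAME/results/analysis.csv ~/results/analysis.csv
&lt;/code>&lt;/pre>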
&lt;h2 id="ownership-models">Ownership models&lt;/h2>
&lt;ul>
&lt;li>
&lt;p>Owned big data storage&lt;/p>
&lt;ul>
&lt;li>A lab/PI purchases storage hardware (&lt;em>e.g.&lt;/em> hard drives) according to the specifications of the facility. Owned hard drives will be added to the facility&amp;rsquo;s parallel GPFS storage systems including production and backup storage. There is no extra charge for the additional storage infrastructure required for operation, including hard drive enclosures (servers) and high-speed network. The annual support fee for owned disk storage is $260 per 10TB of usable and backed up storage space. Since we back everything up to a secondary server room and use snapshotting as an additional data security measure, 10TB of usable backed up space is the equivalent of almost 30TB of raw disk space. Thus, the maintenance cost for owned storage is $8.67 for 1TB/yr raw disk space. Note: owned storage space is only available to the users of a PI&amp;rsquo;s group or to those a PI wishes to give access to.&lt;/li>
&lt;li>The owned storage pool can be shared among all user accounts of a registered lab.&lt;/li>
&lt;li>Owned storage can be attractive for labs with storage needs above 40TBs. For smaller amounts the rental option is often a better and more flexible choice (&lt;em>e.g.&lt;/em> available within a few days).&lt;/li>
&lt;li>Due to the pooled nature of our storage, PIs should not expect to be able to retrieve the individual drives purchased for cluster use, nor should they expect their data to be stored on those exact drives. GPFS is a shared pool of storage, and data will end up stored across multiple disks.&lt;/li>
&lt;/ul>
&lt;/li>
&lt;/ul>
&lt;ul>
&lt;li>
&lt;p>Computer nodes&lt;/p>
&lt;ul>
&lt;li>A lab/PI purchases compatible computer nodes (e.g. with supported network cards). Examples of popular high-density architecture are quad node systems shown &lt;a href="https://www.thinkmate.com/systems/servers/hdx">here&lt;/a>. A quad node system includes 4 nodes where each node can be configured, for example, with two 64 core AMD or Intel chips (providing 128 cores per node or 512 cores per quad node system), 1,024GB of RAM, 2TB SSD and NDR-IB interconnect (&lt;a href="https://www.gigabyte.com/us/Enterprise/High-Density-Server">additional example&lt;/a>). Similar options are available for &lt;a href="https://www.thinkmate.com/systems/servers/gpx">GPU&lt;/a> nodes.&lt;/li>
&lt;li>Nodes are administered under a priority queueing system that gives users from an owner lab priority and also increases that lab&amp;rsquo;s overall CPU quota (see above) by the number of owned CPU cores.&lt;/li>
&lt;li>Owned computer nodes are an attractive solution for labs requiring 24/7 access to hundreds of CPU cores with no or only minor waiting times in queue.&lt;/li>
&lt;/ul>
&lt;/li>
&lt;/ul>
&lt;h2 id="software-install">Software install&lt;/h2>
&lt;ul>
&lt;li>Registered users can email software install requests to HPCC&amp;rsquo;s issue tracking system at &lt;a href="mailto:support@hpcc.ucr.edu">support@hpcc.ucr.edu&lt;/a>. Install requests are addressed in the order received. Simple installs are usually completed within one to a few days; complex installs may take longer.&lt;/li>
&lt;/ul>
&lt;!---
## Startup packages for new PIs
Startup packages are available for variable numbers and architectures of HPC nodes and storage amounts. This includes the following components:
Standard startup packages in the amount of $20K (N=1), $30K (N=2), $40K (N=3)
and so on are available. Note: N refers to the number of HPC nodes below. The
cost for these packages can be covered by the initial complement of new PIs.
* N HPC node(s): owned by lab for 5 yrs and administered under priority queueing model. After this time the node becomes part of the shared HPCC cluster resources.
* Each node with 32* Intel CPU cores (64* logical cores), 512GB RAM and Infiniband interconnect. *The core numbers might nearly double when newer and less expensive Intel chip sets will be released this year. However, the per node cost may be subject to rapid changes (e.g the cost of RAM has increased by several fold in last year).
* Alternative node architecture (_e.g._ GPU) are available upon request
* Owned HPC nodes with various CPU/GPU architectures, RAM and SSD specifications. Pricing is comptetitive, but will greatly depend on the current market value of HPC components, custom configurations and discounts provided by vendors.
* Rented big data storage @ $1000 for 10TB per yr covered for 5 yrs; or owned disk storage when storage needs are above 20TB
* HPCC subscription fee of $1000/yr covered for 5 yrs
To configure a startup HPC package, please contact the facility staff directly.
-->
&lt;h2 id="department-cluster-membership-with-owned-computing-nodes">Department cluster membership with owned computing nodes&lt;/h2>
&lt;p>This option addresses the need for department-level HPC access where the standard
PI-based membership is not practical, &lt;em>e.g.&lt;/em> providing cluster access to a large number of undergraduate
students in classes. Under this model a department purchases computing nodes
that are administered similarly to what is described above under the &lt;em>Ownership
models&lt;/em> section. Due to the large number of expected users from departments, the
CPU quota per user is usually lower than under the PI-based model.&lt;/p>
&lt;h2 id="using-hpcc-cluster-for-classes">Using HPCC cluster for classes&lt;/h2>
&lt;p>To use the HPCC cluster for teaching UCR classes, please coordinate with the
systems administrators (&lt;a href="mailto:support@hpcc.ucr.edu">support@hpcc.ucr.edu&lt;/a>) at least 4 weeks prior to the
start of a class so that there is enough time for planning. Details that need
to be discussed include the number of user accounts required, special software
requirements, creation of a class-specific Slurm partition, data storage
reservations, as well as other needs that may vary for different classes.&lt;/p>
&lt;h2 id="external-user-accounts">External user accounts&lt;/h2>
&lt;p>Accounts for external customers can only be granted if a lab has a strong
affiliation with UC Riverside, such as a research collaboration with UCR
researchers. Both the corresponding UCR PI and external collaborator need to
maintain an HPCC subscription. External accounts are subject to an annual
review and approval process. To be approved, the external and internal PIs have
to complete this &lt;a href="https://bit.ly/32O1lC9">External Usage Justification&lt;/a>.&lt;/p>
&lt;h2 id="trial-accounts">Trial Accounts&lt;/h2>
&lt;p>If you&amp;rsquo;re not sure whether the HPCC will work for your use case, we can offer limited-compute
accounts that allow you to test the performance and compatibility of your software.
A limited number of trial accounts can be created for a lab, with access to limited compute
compared to a standard account. Please reach out to HPCC support if you&amp;rsquo;re interested in a
trial account.&lt;/p>
&lt;h2 id="facility-description">Facility description&lt;/h2>
&lt;ul>
&lt;li>The latest hardware/facility description (&lt;em>e.g.&lt;/em> for grant applications) is available &lt;a href="https://goo.gl/43eOwQ">here&lt;/a>.&lt;/li>
&lt;/ul></description></item><item><title>About: Activity Report</title><link>https://hpcc.ucr.edu/about/overview/activity/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/activity/</guid><description>
&lt;h2 id="summary-report">Summary report&lt;/h2>
&lt;!--
The following activity report is generated by `Grafana` and refreshed on this page every 10 minutes.
It summarizes CPU cluster activity on HPCC's computing resources including Intel, Batch, Highmem and GPU partitions:
-->
&lt;p>Note that resources appearing to be idle does not necessarily mean they can be immediately allocated to a job. Many factors, such as priority, requested runtimes, and the number of resources requested, determine how jobs are queued. If you&amp;rsquo;re concerned about your jobs being queued for extended periods of time, please see the &lt;a href="https://hpcc.ucr.edu/manuals/hpc_cluster/queue/">Queue Policies&lt;/a> page or reach out to support.&lt;/p>
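&lt;p>If a job stays pending, Slurm itself can report why. The commands below are standard Slurm tools rather than anything HPCC-specific, and the output format string is just one possible choice.&lt;/p>
&lt;pre>&lt;code># Show your queued and running jobs; the last column (%R) gives the reason a pending job is waiting.
squeue -u $USER -o "%.18i %.9P %.8j %.8T %.10M %R"

# Show the priority factors Slurm has computed for a specific pending job
# (JOBID is a placeholder for the numeric job ID).
sprio -j JOBID
&lt;/code>&lt;/pre>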
&lt;iframe src="https://data.hpcc.ucr.edu/public-dashboards/68d2e4f765fd4bcc825f3e7fb423eb3f" style="text-align:center;width:100%;height:75vh;">&lt;/iframe>
&lt;!--
&lt;p style="text-align: center;">&lt;font color="red">Click image to enlarge!&lt;/font>&lt;/p>
&lt;div>
&lt;a href="https://cluster.hpcc.ucr.edu/activity-report/">
&lt;img alt="intel_part" border="0" src="https://cluster.hpcc.ucr.edu/activity-report/pane2.png" style="display:block;margin-right:auto;margin-left:auto;text-align:center;">
&lt;img alt="batch_part" border="0" src="https://cluster.hpcc.ucr.edu/activity-report/pane3.png" style="display:block;margin-right:auto;margin-left:auto;text-align:center;">
&lt;img alt="highmem_part" border="0" src="https://cluster.hpcc.ucr.edu/activity-report/pane4.png" style="display:block;margin-right:auto;margin-left:auto;text-align:center;">
&lt;img alt="gpu_part" border="0" src="https://cluster.hpcc.ucr.edu/activity-report/pane5.png" style="display:block;margin-right:auto;margin-left:auto;text-align:center;">
&lt;/a>
&lt;/div>
--></description></item><item><title>About: Rates</title><link>https://hpcc.ucr.edu/about/overview/rates/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/rates/</guid><description>
&lt;h2 id="facility-description">Facility description&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://goo.gl/43eOwQ">Facility description&lt;/a> (&lt;em>e.g.&lt;/em> for grant applications)&lt;/li>
&lt;/ul>
&lt;h2 id="recharging-rates">Recharging rates&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://docs.google.com/document/d/19MTJSkKeqhz6QVmOrGfkuK7gvwRbRjTEpnZgx8EFQDM/edit?usp=sharing">Recharging rates: 2025/2026&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://rebrand.ly/da3b3z3">Recharging rates: 2024/2025&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://rb.gy/wszgj">Recharging rates: 2023/2024&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://bit.ly/3IZUAQQ">Recharging rates: 2022/2023&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://bit.ly/3iPkbiv">Recharging rates: 2021/2022&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://bit.ly/3jeK3nF">Recharging rates: 2020/2021&lt;/a>&lt;/li>
&lt;li>&lt;a href="http://bit.ly/2ZWbND7">Recharging rates: 2019/2020&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://goo.gl/1mVfLM">Recharging rates: 2018/2019&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://goo.gl/QjJgzu">Recharging rates: 2017/2018&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://goo.gl/jJWpon">Recharging rates: 2016/2017&lt;/a>&lt;/li>
&lt;/ul>
&lt;h2 id="purchased-disk-information">Purchased Disk Information&lt;/h2>
&lt;p>To achieve our desired balance of performance and density in our storage systems
we must place some restrictions on the lifetime of purchased storage. Details can
be found in our &lt;a href="https://docs.google.com/document/d/1Up48pPWidYAN0wHsFiiqGQ676C-lPKinqBV_kXKSF0k/edit">Owned Storage Information&lt;/a>
document.&lt;/p>
&lt;h2 id="ownership-models">Ownership Models&lt;/h2>
&lt;p>For more information relating to owned storage, see the &lt;a href="https://hpcc.ucr.edu/about/overview/access/#ownership-models">Ownership models&lt;/a> section of the Access page.&lt;/p>
&lt;h2 id="external-user-accounts">External user accounts&lt;/h2>
&lt;p>Accounts for external customers can only be granted if a lab has a strong
affiliation with UC Riverside, such as a research collaboration with UCR
researchers. Both the corresponding UCR PI and external collaborator need to
maintain an HPCC subscription. External accounts are subject to an annual
review and approval process. To be approved, the external and internal PIs have
to complete this &lt;a href="https://bit.ly/32O1lC9">External Usage Justification&lt;/a>.&lt;/p></description></item><item><title>About: Contact</title><link>https://hpcc.ucr.edu/about/overview/contact/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/contact/</guid><description>
&lt;h2 id="support">Support&lt;/h2>
&lt;p>For cluster support, see the &lt;a href="https://hpcc.ucr.edu/about/overview/introduction/#help-and-contacts">Help and Contacts&lt;/a> section of our Introduction.&lt;/p>
&lt;h2 id="facility">Facility&lt;/h2>
&lt;p>Please do not contact the System Administrators directly for general cluster/software support. Using our support email (support [at] hpcc.ucr.edu) ensures that the most eyes see the ticket and prevents it from getting lost in cluttered email inboxes.&lt;/p>
&lt;ul>
&lt;li>&lt;a href="mailto:aleon008@ucr.edu">Austin Leong&lt;/a>, Sr. HPC Systems Administrator&lt;/li>
&lt;li>&lt;a href="mailto:emerson.jacobson@ucr.edu">Emerson Jacobson&lt;/a>, HPC Systems Administrator&lt;/li>
&lt;li>&lt;a href="http://girke.bioinformatics.ucr.edu">Thomas Girke&lt;/a>, Director of HPC Center&lt;/li>
&lt;/ul>
&lt;h2 id="advisory-board-executive-committee">Advisory Board (executive committee)&lt;/h2>
&lt;p>The responsibilities of the Advisory Board are outlined &lt;a href="https://goo.gl/X3p1VK">here&lt;/a>.&lt;/p>
&lt;ul>
&lt;li>
&lt;p>Faculty members&lt;/p>
&lt;ul>
&lt;li>Jason E Stajich (Microbiology &amp;amp; Plant Pathology)&lt;/li>
&lt;li>Wenxiu Ma (Statistics)&lt;/li>
&lt;li>Stefano Lonardi (CSE)&lt;/li>
&lt;li>Mark Alber (Mathematics)&lt;/li>
&lt;li>Adam Godzik (Biomedical Sciences)&lt;/li>
&lt;li>Laura Sales (Physics)&lt;/li>
&lt;li>Ahmed Eldawy (CSE)&lt;/li>
&lt;li>Xinping Cui (Statistics)&lt;/li>
&lt;li>Leonard Mueller (Chemistry)&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>
&lt;p>HPC expert staff members from UCR&lt;/p>
&lt;ul>
&lt;li>Keith Richards-Dinger (Earth Sciences)&lt;/li>
&lt;li>Victor Hill (CS)&lt;/li>
&lt;li>Bill Strossman (C&amp;amp;C)&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>
&lt;p>External members from academia and industry&lt;/p>
&lt;ul>
&lt;li>One of each to be added here.&lt;/li>
&lt;/ul>
&lt;/li>
&lt;/ul>
&lt;h2 id="office-location-and-mailing-address">Office location and mailing address&lt;/h2>
&lt;p>1208/1207 Genomics Building (&lt;a href="https://goo.gl/OVKyxv">Google Map&lt;/a>)&lt;br>
3401 Watkins Drive&lt;br>
University of California&lt;br>
Riverside, CA 92521&lt;/p>
&lt;h2 id="server-rooms">Server rooms&lt;/h2>
&lt;h3 id="genomics">Genomics&lt;/h3>
&lt;p>HPCC&amp;rsquo;s main server room is in the Genomics Building, Rm 1120A.&lt;/p>
&lt;h3 id="colo-server-room">CoLo server room&lt;/h3>
&lt;p>School of Medicine CoLo&lt;/p>
&lt;h2 id="help">Help&lt;/h2>
&lt;p>For questions or requesting new user accounts please email &lt;a href="mailto:support@hpcc.ucr.edu">support@hpcc.ucr.edu&lt;/a>.&lt;/p></description></item><item><title>About: Acknowledgement of Facility</title><link>https://hpcc.ucr.edu/about/overview/acknowledgement/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/acknowledgement/</guid><description>
&lt;h2 id="acknowledgement-in-publications">Acknowledgement in Publications&lt;/h2>
&lt;p>We appreciate that you have chosen our facility to support your research and
would like to remind you to add the following statement to acknowledge the
High-Performance Computing Center in your publications and presentations.&lt;/p>
&lt;div class="pageinfo pageinfo-primary">
&lt;p>Computations were performed using the computer clusters and data storage
resources of the HPCC, which were funded by grants from NSF (MRI-2215705, MRI-1429826) and
NIH (1S10OD016290-01A1).&lt;/p>
&lt;/div>
&lt;p>Your success is important to us, and we would appreciate it if you could share with us the references or URLs of any
publications or presentations that used our facility.&lt;/p>
&lt;h2 id="grants">Grants&lt;/h2>
&lt;ul>
&lt;li>&lt;a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=2215705">NSF MRI-2215705&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=1429826">NSF MRI-1429826&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://federalreporter.nih.gov/Projects/Details/?projectId=624283&amp;amp;ItemNum=881394&amp;amp;totalItems=892504&amp;amp;searchId=b850241613a74a58962c0bd1a1edd5d4&amp;amp;searchMode=Smart&amp;amp;page=8814&amp;amp;pageSize=100&amp;amp;sortField=Ic&amp;amp;sortOrder=asc&amp;amp;filters=&amp;amp;navigation=True">NIH 1S10OD016290-01A1&lt;/a>&lt;/li>
&lt;/ul></description></item><item><title>About:</title><link>https://hpcc.ucr.edu/about/overview/slides_backup/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://hpcc.ucr.edu/about/overview/slides_backup/</guid><description>
&lt;iframe src="https://docs.google.com/presentation/d/e/2PACX-1vQIuy-2Z50zj3wr5dNjyes5tnUjGP84vUBn2vFxM5y5qb_kCOpWfjKu_G-F9a-46JniTsgVWWmQn_9m/embed?start=false&amp;loop=false&amp;delayms=3000" frameborder="0" width="960" height="569" allowfullscreen="true" mozallowfullscreen="true" webkitallowfullscreen="true">&lt;/iframe></description></item></channel></rss>