Edit the file /etc/ssh/sshd_config using a text editor such as vim or nano, and set the following options:

PasswordAuthentication no
ChallengeResponseAuthentication no

Then restart the sshd service:

systemctl restart sshd.service
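The edit above can also be scripted. A minimal sketch, assuming root privileges and a stock sshd_config (on some distributions the service is named ssh rather than sshd):

```shell
# Sketch: disable password logins non-interactively (requires root).
# Back up the original (.bak), then set both options whether they are
# currently commented out or set to another value.
sed -i.bak -E \
  -e 's/^#?PasswordAuthentication .*/PasswordAuthentication no/' \
  -e 's/^#?ChallengeResponseAuthentication .*/ChallengeResponseAuthentication no/' \
  /etc/ssh/sshd_config
# Validate the config before restarting, so a typo cannot lock you out
sshd -t && systemctl restart sshd.service
```

Keep an existing SSH session open while you restart, so you can revert from the .bak copy if key-based login does not work.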
A cloud project account will be created and login credentials provided as soon as the registration has been approved on the Cloud Database. To register for cloud resources, please follow the link https://openstackusers.nicis.ac.za/
An initial VM user account will be set up with sudo privileges.
User accounts/usernames are set up using the standard CHPC user account creation policy: a username consists of the first letter of your first name followed by your surname (for example, Jane Smith would become jsmith).
By default, projects have the limited resources specified below, unless a request for more resources has been submitted and approved:

The initial computing resources are determined by the selected resource flavor. Each project is allowed a maximum of 400 GB of storage by default.

The storage options are Ceph and local storage. Local storage comes with (is derived from) the flavor selected when launching a virtual machine (VM). Ceph storage is obtained by creating a separate volume and attaching it to your VM. **Note: Ceph = 400 GB - local storage**
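As an illustration of the note above (the 400 GB quota is the documented default; the 80 GB flavor disk is a made-up example):

```shell
# Illustrative quota arithmetic: Ceph volume capacity is the 400 GB
# default project quota minus the local disk of the chosen flavor.
QUOTA_GB=400        # default per-project storage quota
LOCAL_DISK_GB=80    # hypothetical flavor root-disk size
CEPH_GB=$((QUOTA_GB - LOCAL_DISK_GB))
echo "Ceph volume capacity available: ${CEPH_GB} GB"   # prints 320 for this example
```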
Should more resources be required once the project has reached its usage limit, the project owner should apply for a resource top-up by uploading a motivation to the Cloud Database for a series of approvals.

Please note: the allocation of additional resources is subject to resource availability and is therefore not guaranteed.
VM snapshots are provided on request.
For security reasons, since passwords are subject to compromise, please add/use SSH keys. For instructions on adding SSH keys to your VM, follow the Getting Started guide.
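One common way to set this up from your own workstation is sketched below, assuming an OpenSSH client; the key path id_ed25519_chpc and the address user@vm-ip are placeholders you should replace:

```shell
# Generate a keypair if one does not already exist at this path.
# (-N "" gives an empty passphrase for non-interactive use; a real
# passphrase is recommended in practice.)
mkdir -p "$HOME/.ssh"
[ -f "$HOME/.ssh/id_ed25519_chpc" ] || \
  ssh-keygen -t ed25519 -f "$HOME/.ssh/id_ed25519_chpc" -N "" -q
# Copy the public key into the VM's authorized_keys (replace user@vm-ip):
#   ssh-copy-id -i ~/.ssh/id_ed25519_chpc.pub user@vm-ip
# Test key-based login BEFORE disabling password authentication:
#   ssh -i ~/.ssh/id_ed25519_chpc user@vm-ip
```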
For paying customers/users, invoices for the previous calendar month are sent on the 5th of every month. If no invoice has been received by the 5th, please email the CHPC at firstname.lastname@example.org
The CHPC does not provide Windows licences; users requesting Windows images must provide their own Windows licence.
To minimise/prevent data loss, the CHPC has implemented Ceph storage on the CHPC production cloud, configured to keep three replicas of each set of data across the storage nodes.