Note: This document is based on Jelastic version 4.7
The Mount Points functionality is intended for managing the data storage containers connected to a server. In the same-named section of the built-in Configuration Manager, you can define directories that are physically located on other, remote node(s) but should be accessible from the current container.
Any of the containers within your account (where the best option is a Dedicated Storage Container) or an external NFS server can be used as storage for your data. When a Jelastic container is chosen, all of the needed configurations (i.e. installation of the required NFS software and setting of the export parameters for the specified files) are applied automatically, without any manual operations required. This enables a great number of scenarios and possibilities for advanced data structuring and control.
- legacy containers built on top of the CentOS 5 template (i.e. created before Jelastic 2.0) do not support the data storage server/client functionality
- Docker containers based on Alpine or Ubuntu (below 14.04) OS distributions can only be used as a data storage client, since the NFS server functionality is not supported on them
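For reference, when a Jelastic container is selected as storage, the platform performs roughly the equivalent of the following manual NFS setup. This is a hedged sketch only: the `/data` path, the mount target, and the host name are illustrative assumptions, not actual Jelastic internals.

```shell
# --- On the storage container (NFS server side) ---
# Export the shared directory; /data is an example path
echo "/data *(rw,sync,no_subtree_check)" >> /etc/exports
exportfs -ra    # re-read /etc/exports and apply the export list

# --- On the client container ---
# storage-node.example.com is a placeholder host name
mkdir -p /mnt/data
mount -t nfs storage-node.example.com:/data /mnt/data
```

With Jelastic, these steps (including the installation of the NFS packages themselves) are applied automatically when a mount point is created through the dashboard.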
So, in order to mount data to a container, click the Config button next to the chosen container or layer. In the opened Configuration Manager tab, switch to the Mount Points section within the menu on the left:
Then click the Mount button on the top pane and choose one of the following operation types (refer to the corresponding document below for further details):
- mount points functionality is provided for all Jelastic nodes; however, when operating Docker-based containers, you can also use the Volumes section to manage your mounts, where the additional option of local storage is available. In this case, the data remains unaffected throughout the container's lifecycle, even though it is stored within the same container
- within the Exports section (one item down in the left-hand menu), you can manage the list of folders and review the files that are shared by the current server (i.e. the data that other nodes have access to)
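Under the hood, such shares are listed in the standard `/etc/exports` file of the exporting server. A minimal entry (with an assumed example directory and client network, not values taken from Jelastic) looks like this:

```shell
# /etc/exports - one line per shared directory
# /data           : example directory being shared
# 192.168.0.0/24  : example client network allowed to mount it
# rw,sync,no_subtree_check : writable, synchronous, standard NFS options
/data 192.168.0.0/24(rw,sync,no_subtree_check)
```

Running `exportfs -ra` re-applies the file, and `exportfs -v` lists the directories currently exported by the server.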