[one-users] Remotes Directory
Greg Stabler
gstable at clemson.edu
Thu Apr 5 10:06:31 PDT 2012
I'm trying to add our high-performance cluster to our OpenNebula 3.0
deployment so that we can run VMs on it. Due to security requirements on
the cluster, we have had to write several wrapper scripts for things such
as virsh, ovs-vsctl (for Open vSwitch), and brctl to ensure that no one
misconfigures a cluster node and crashes the system.
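To give a rough idea of what I mean by a wrapper (this is a simplified
illustration, not one of our actual scripts, and the whitelist is made up),
a virsh wrapper might look something like:

    #!/usr/bin/env ruby
    # Simplified illustration only -- not our real wrapper. The idea is to
    # allow a small whitelist of virsh subcommands and refuse everything
    # else before delegating to the real binary. ALLOWED is a placeholder.
    ALLOWED = %w[list dominfo start shutdown]

    cmd = ARGV.first
    unless ALLOWED.include?(cmd)
      STDERR.puts "virsh wrapper: subcommand '#{cmd}' is not allowed on cluster nodes"
      exit 1
    end
    exec '/usr/bin/virsh', *ARGV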
Therefore, the current set of OpenNebula scripts in /var/remotes will not
work on the cluster. The quick-and-dirty solution we came up with is to
create a second copy of the remotes scripts that uses our wrappers and
place it in /var/remotes_cluster. Whenever the remote scripts are copied
to or updated on a cluster host, we would use this directory instead of
/var/remotes.
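In other words, the logic I'd like somewhere on the front end is roughly
the following (the /var/remotes* paths are ours; the hostname pattern is
just a placeholder, not how we actually identify cluster nodes):

    # Sketch only: decide which local remotes tree gets pushed to a host.
    CLUSTER_HOSTS = /\.cluster\.example\.edu\z/   # placeholder pattern

    def remotes_dir_for(hostname)
      hostname =~ CLUSTER_HOSTS ? '/var/remotes_cluster' : '/var/remotes'
    end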
I've been looking through the source code, trying to figure out where I
might add some code that checks whether the hostname belongs to our
cluster and, if so, uses the remotes_cluster directory, falling back to
the normal remotes directory otherwise. I thought update_remotes() in
either one/libs/mads/one_im_exec.rb or one/lib/ruby/CommandManager.rb
might be the right place, but I'm having trouble with that. As a test, I
added a command to the method that creates a simple directory, just to
see whether that was the correct spot in the code, but I never see the
directory created on my nodes. Any ideas? Again, we're using OpenNebula 3.0.
Thank you,
Greg Stabler
Computer Science/MS Candidate
School of Computing, McAdams 120
Clemson University, Clemson, SC 29634
gstable at clemson.edu