I'm trying to add our high-performance cluster to our OpenNebula 3.0
setup so we can run VMs on its nodes. Due to security requirements on
the cluster, we have had to write wrapper scripts around commands such
as virsh, ovs-vsctl (for Open vSwitch), and brctl to ensure that no one
misconfigures a cluster node and crashes the system.
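
For context, our wrappers are roughly of the following shape; the
whitelist, messages, and paths below are illustrative placeholders,
not our actual policy:

#!/usr/bin/env ruby
# Illustrative virsh wrapper: permit only a whitelist of subcommands
# and refuse everything else. The whitelist and the path to the real
# binary are placeholders.
ALLOWED = %w[list dominfo start shutdown].freeze

cmd = ARGV.first
unless ALLOWED.include?(cmd)
  warn "virsh wrapper: subcommand '#{cmd}' is not permitted on cluster nodes"
  exit 1
end

# Hand off to the real binary with the original arguments.
exec('/usr/bin/virsh', *ARGV)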

Therefore, the current set of OpenNebula scripts in /var/remotes will
not work on the cluster. The quick-and-dirty solution we came up with
is to maintain a second version of the remotes scripts that uses our
wrappers and place it in /var/remotes_cluster. Whenever the remote
scripts are copied to or updated on a cluster node, we would push this
directory instead of /var/remotes.

I've been looking through the source code, trying to figure out where
I could add a check: if a hostname belongs to our cluster, use the
remotes_cluster directory; otherwise, use the normal remotes directory.
I thought update_remotes() in either one/lib/mads/one_im_exec.rb or
one/lib/ruby/CommandManager.rb might be the right place, but I'm having
trouble with that. As a sanity check, I added a command to the method
that creates a simple test directory, just to see whether I had found
the right spot in the code, but the directory never appears on my
nodes. Any ideas? Again, we're using OpenNebula 3.0.
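
For reference, this is the kind of selection logic I was hoping to slot
into update_remotes(); the constants, hostname pattern, and copy command
here are just guesses to show the idea, not a working patch against the
3.0 internals:

# Illustrative only: pick the remotes tree based on the target hostname.
# CLUSTER_HOSTS, REMOTES_DIR, and CLUSTER_REMOTES_DIR are placeholders.
CLUSTER_HOSTS       = /\Anode\d+\.cluster\.example\.edu\z/
REMOTES_DIR         = '/var/remotes'
CLUSTER_REMOTES_DIR = '/var/remotes_cluster'

def remotes_dir_for(hostname)
  hostname =~ CLUSTER_HOSTS ? CLUSTER_REMOTES_DIR : REMOTES_DIR
end

# e.g., wherever the scripts are pushed to a host, copy from the
# selected tree instead of a hard-coded one (the destination path is
# a placeholder too):
# src = remotes_dir_for(hostname)
# system("scp -r #{src}/. #{hostname}:/var/tmp/one")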

Thank you,

Greg Stabler
Computer Science/MS Candidate
School of Computing, McAdams 120
Clemson University, Clemson, SC 29634
gstable@clemson.edu