[one-users] econe-server problems with 3.2.1
Ulrich Schwickerath
ulrich.schwickerath at cern.ch
Wed Feb 8 03:59:53 PST 2012
Hi, Daniel,
sure, here it is. I have 3 of these guys now. Never seen that before.
[lsfadmin at oneadmin02 ~]$ onevm show 22976 -x
<VM>
<ID>22976</ID>
<UID>7</UID>
<GID>102</GID>
<UNAME>lsfadmin</UNAME>
<GNAME>batch</GNAME>
<NAME>LXBATCH</NAME>
<PERMISSIONS>
<OWNER_U>1</OWNER_U>
<OWNER_M>1</OWNER_M>
<OWNER_A>0</OWNER_A>
<GROUP_U>0</GROUP_U>
<GROUP_M>0</GROUP_M>
<GROUP_A>0</GROUP_A>
<OTHER_U>0</OTHER_U>
<OTHER_M>0</OTHER_M>
<OTHER_A>0</OTHER_A>
</PERMISSIONS>
<LAST_POLL>0</LAST_POLL>
<STATE>3</STATE>
<LCM_STATE>0</LCM_STATE>
<STIME>1328463781</STIME>
<ETIME>0</ETIME>
<DEPLOY_ID/>
<MEMORY>0</MEMORY>
<CPU>0</CPU>
<NET_TX>0</NET_TX>
<NET_RX>0</NET_RX>
<TEMPLATE>
<CONTEXT>
<AFS><![CDATA[on]]></AFS>
<AFSCACHE><![CDATA[vdc]]></AFSCACHE>
<EC2_IMID><![CDATA[ami-00000023]]></EC2_IMID>
<EC2_USER_DATA><![CDATA[RUMyX1NFQ1JFVF9LRVk9OWNkNWU5Mzg5NjVjM2M0MzcyMDNhMTA1ODkzMmU4YTViYjY5MzdjZQpFQzJfVVJMPWh0dHBzOi8vb25lYWRtaW4wMi5jZXJuLmNoOjg0NDMKRUMyX0FDQ0VTU19LRVk9bHNmYWRtaW4K]]></EC2_USER_DATA>
<EC2_VMID><![CDATA[i-22976]]></EC2_VMID>
<FILES><![CDATA[/home/lsfadmin/contextualization/common/opennebula.conf
/home/lsfadmin/contextualization/common/prolog.sh
/home/lsfadmin/contextualization/common/epilog.sh
/home/lsfadmin/contextualization/common/etchosts
/home/lsfadmin/contextualization/common/etcsysconfigifcfg
/home/lsfadmin/contextualization/context.lxbatch/lsfcontext.conf
/home/lsfadmin/contextualization/common/etcsysconfignetwork
/home/lsfadmin/contextualization/common/etcsysconfigafs]]></FILES>
<GOLDENNODE><![CDATA[vm64slc5]]></GOLDENNODE>
<POOL><![CDATA[vdd]]></POOL>
<TARGET><![CDATA[vdb]]></TARGET>
<TTL><![CDATA[48]]></TTL>
<VMID><![CDATA[22976]]></VMID>
</CONTEXT>
<DISK>
<BUS><![CDATA[virtio]]></BUS>
<CLONE><![CDATA[YES]]></CLONE>
<DISK_ID><![CDATA[0]]></DISK_ID>
<IMAGE><![CDATA[SLC5 glExec WN]]></IMAGE>
<IMAGE_ID><![CDATA[23]]></IMAGE_ID>
<READONLY><![CDATA[NO]]></READONLY>
<SAVE><![CDATA[NO]]></SAVE>
<SOURCE><![CDATA[/dev/xen_vg/glExecWN_slc5_x86_64_kvm]]></SOURCE>
<TARGET><![CDATA[vda]]></TARGET>
<TYPE><![CDATA[DISK]]></TYPE>
</DISK>
<DISK>
<BUS><![CDATA[virtio]]></BUS>
<DISK_ID><![CDATA[1]]></DISK_ID>
<READONLY><![CDATA[no]]></READONLY>
<SOURCE><![CDATA[/test-dev/xen_vg/afscache-]]></SOURCE>
<TARGET><![CDATA[vdc]]></TARGET>
<TYPE><![CDATA[block]]></TYPE>
</DISK>
<DISK>
<BUS><![CDATA[virtio]]></BUS>
<DISK_ID><![CDATA[2]]></DISK_ID>
<READONLY><![CDATA[no]]></READONLY>
<SOURCE><![CDATA[/test-dev/xen_vg/pool-]]></SOURCE>
<TARGET><![CDATA[vdd]]></TARGET>
<TYPE><![CDATA[block]]></TYPE>
</DISK>
<DISK>
<BUS><![CDATA[virtio]]></BUS>
<DISK_ID><![CDATA[3]]></DISK_ID>
<READONLY><![CDATA[no]]></READONLY>
<SOURCE><![CDATA[/test-dev/xen_vg/cvmfs-]]></SOURCE>
<TARGET><![CDATA[vde]]></TARGET>
<TYPE><![CDATA[block]]></TYPE>
</DISK>
<IMAGE_ID><![CDATA[ami-00000023]]></IMAGE_ID>
<INSTANCE_TYPE><![CDATA[batchslc5.small]]></INSTANCE_TYPE>
<MEMORY><![CDATA[2560]]></MEMORY>
<NAME><![CDATA[LXBATCH]]></NAME>
<NIC>
<BRIDGE><![CDATA[br0]]></BRIDGE>
<IP><![CDATA[128.142.135.102]]></IP>
<MAC><![CDATA[00:16:3e:00:4b:5d]]></MAC>
<MODEL><![CDATA[virtio]]></MODEL>
<NETWORK><![CDATA[LXBATCHT]]></NETWORK>
<NETWORK_ID><![CDATA[3]]></NETWORK_ID>
<VLAN><![CDATA[NO]]></VLAN>
</NIC>
<OS>
<BOOTLOADER><![CDATA[/usr/bin/pygrub]]></BOOTLOADER>
</OS>
<RANK><![CDATA[FREEMEM]]></RANK>
<RAW>
<DATA><![CDATA[
<devices>
<serial type="pty">
<target port="0"/>
</serial>
<console type="pty">
<target port="0"/>
</console>
<input type='mouse' bus='ps2'/>
<graphics type='vnc' port='5905' autoport='yes' keymap='en-us'/>
<video>
<model type='cirrus' vram='9216' heads='1'/>
<alias name='video0'/>
</video>
</devices>]]></DATA>
<TYPE><![CDATA[kvm]]></TYPE>
</RAW>
<REQUIREMENTS><![CDATA[MACS="*00:16:3e:00:4b:5d*"]]></REQUIREMENTS>
<VCPU><![CDATA[1]]></VCPU>
<VMID><![CDATA[22976]]></VMID>
</TEMPLATE>
<HISTORY_RECORDS>
<HISTORY>
<SEQ>0</SEQ>
<HOSTNAME>lxbst0541.cern.ch</HOSTNAME>
<VM_DIR>/opt/opennebula</VM_DIR>
<HID>47</HID>
<STIME>0</STIME>
<ETIME>0</ETIME>
<VMMMAD>vmm_kvm</VMMMAD>
<VNMMAD>dummy</VNMMAD>
<TMMAD>tm_lvm</TMMAD>
<PSTIME>0</PSTIME>
<PETIME>0</PETIME>
<RSTIME>0</RSTIME>
<RETIME>0</RETIME>
<ESTIME>0</ESTIME>
<EETIME>0</EETIME>
<REASON>0</REASON>
</HISTORY>
</HISTORY_RECORDS>
</VM>
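The notable combination in this dump is STATE 3 (ACTIVE) together with LCM_STATE 0 (LCM_INIT) and an empty DEPLOY_ID: the VM is flagged active but its life-cycle state never progressed. A minimal Ruby sketch (illustrative names only, not the actual econe-server source) of how such an unmapped state can produce the reported "undefined method `[]' for nil:NilClass":

```ruby
# Hypothetical mapping from an ACTIVE VM's LCM_STATE to an EC2 state.
# LCM_STATE 0 (LCM_INIT) is deliberately absent -- it normally never
# appears for a VM that is already ACTIVE.
LCM_TO_EC2 = {
  3 => { :code => 16, :name => 'running' },   # LCM_STATE 3 = RUNNING
}

def render_state(lcm_state)
  LCM_TO_EC2[lcm_state][:name]   # nil[:name] raises for unmapped states
end

puts render_state(3)             # => running
begin
  render_state(0)                # LCM_STATE 0 has no mapping -> nil lookup
rescue NoMethodError => e
  puts e.class                   # the error class seen in the econe log
end
```

If the real render_state in EC2QueryServer.rb does an unguarded lookup of this kind, a single VM stuck in STATE 3/LCM_STATE 0 would crash every describe_instances call, which would match both symptoms at once.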
mysql> SELECT * FROM vm_pool WHERE oid=22976;
+-------+---------+------------+-----+-----+-----------+-------+-----------+---------+---------+---------+
| oid   | name    | body       | uid | gid | last_poll | state | lcm_state | owner_u | group_u | other_u |
+-------+---------+------------+-----+-----+-----------+-------+-----------+---------+---------+---------+
| 22976 | LXBATCH | <VM>…</VM> |   7 | 102 |         0 |     3 |         0 |       1 |       0 |       0 |
+-------+---------+------------+-----+-----+-----------+-------+-----------+---------+---------+---------+
(the body column holds the same XML document as the onevm show output above)
1 row in set (0.00 sec)
mysql>
Thanks a lot for your support!
Ulrich
On 02/08/2012 12:17 PM, Daniel Molina wrote:
> On 8 February 2012 08:57, Ulrich Schwickerath
> <ulrich.schwickerath at cern.ch> wrote:
>> Hi, Ruben,
>>
>> I confirm I get the same timing when I do NOT use the SSL proxy:
>>
>> (...)
>>
>> <td>rack.url_scheme</td> <td class="code"><div>http</div></td> </tr> <tr>
>> <td>rack.version</td> <td class="code"><div>[1, 0]</div></td> </tr> <tr>
>> <td>sinatra.error</td> <td class="code"><div>#<NoMethodError: undefined
>> method `[]' for nil:NilClass></div></td> </tr> </table> <div
>> class="clear"></div> </div> <!-- /RACK ENV --> <p id="explanation">You're
>> seeing this error because you have enabled the <code>show_exceptions</code>
>> setting.</p> </div> <!-- /WRAP --> </body></html>
>>
>> real 1m8.893s
>> user 0m0.263s
>> sys 0m0.051s
>> [lsfadmin at lxadm10 private]$ euca-describe-images
>>
>> IMAGE ami-00000023 glExecWN_slc5_x86_64_kvm lsfadmin
>> available private i386 machine
>> IMAGE ami-00000024 glExecWN_slc6_x86_64_kvm lsfadmin
>> available private i386 machine
>> [lsfadmin at lxadm10 private]$ echo $EC2_URL
>> http://oneadmin02.cern.ch:4567
>>
>> I have something else which is strange: there are 2 VMs in my list which do
>> not have a "state" defined. I cannot delete them either:
>>
>> $ onevm list | grep -v runn
>> ID USER GROUP NAME STAT CPU MEM HOSTNAME
>> TIME
>> 22976 lsfadmin batch LXBATCH 0 0K lxbst0541.cern. 02
>> 14:12:07
>> 23467 lsfadmin batch LXBATCH 0 0K lxbst0511.cern. 00
>> 04:33:07
>> $ onevm delete 22976
>> $ onevm delete 23467
>> $ onevm list | grep -v runn
>> ID USER GROUP NAME STAT CPU MEM HOSTNAME
>> TIME
>> 22976 lsfadmin batch LXBATCH 0 0K lxbst0541.cern. 02
>> 14:12:23
>> 23467 lsfadmin batch LXBATCH 0 0K lxbst0511.cern. 00
>> 04:33:23
>>
>> Now I wonder if that might be related... I suppose I will need to
>> mickey-mouse in my mysql db to get rid of those?
>>
> Would you mind sending the output of
> * onevm show 22976 -x
> * SELECT * FROM vm_pool WHERE oid=22976;
>
> Maybe that is the root of the problem.
>
> I have just written this patch, which improves the describe_instances
> performance. It is not thoroughly tested, but if it works I will
> prepare a commit and upload it to the master branch:
> https://gist.github.com/aee5654cbe0b44bbbd51
>
> Cheers.
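Independent of what the linked gist actually changes, the hard crash itself can be avoided with a nil-safe lookup. A hedged sketch (illustrative names, not the real patch): VMs whose state has no EC2 mapping get a safe default instead of taking down the whole request.

```ruby
# Fallback-based variant of the hypothetical state lookup: VMs with an
# unmapped state are reported as 'pending' rather than raising.
LCM_TO_EC2 = {
  3 => { :code => 16, :name => 'running' },   # LCM_STATE 3 = RUNNING
}
UNKNOWN = { :code => 0, :name => 'pending' }  # assumed safe default

def render_state(lcm_state)
  (LCM_TO_EC2[lcm_state] || UNKNOWN)[:name]
end

puts render_state(0)   # => pending -- a stuck VM no longer crashes the server
```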
>
>
>
>> Thanks a lot for your support!
>>
>> Cheers,
>> Ulrich
>>
>>
>>
>> On 02/07/2012 10:50 PM, Ruben S. Montero wrote:
>>> Hi Ulrich
>>>
>>> Those in the log are not error messages but log messages. If you take
>>> a look, they log an HTTP 200 return code (SUCCESS), processed in 0.8 secs.
>>>
>>> Could you confirm the times when accessing the econe-server directly... We
>>> believe this is a configuration issue, as we cannot reproduce it.
>>> Please also *do not* revert the patches from Daniel.
>>>
>>> Thanks
>>>
>>> Ruben
>>>
>>> On Mon, Feb 6, 2012 at 5:43 PM, Ulrich Schwickerath
>>> <ulrich.schwickerath at cern.ch> wrote:
>>>> Hi,
>>>>
>>>> sure. I've changed
>>>> #:server: localhost
>>>> :server: oneadmin02.cern.ch
>>>> :port: 4567
>>>>
>>>> #SSL Proxy
>>>> #:ssl_server: https://oneadmin02.cern.ch:8443/
>>>>
>>>> in /etc/one/econe.conf and restarted the server. Then I changed
>>>> export EC2_URL=http://oneadmin02.cern.ch:4567
>>>>
>>>> Access works:
>>>> $ euca-describe-images
>>>> IMAGE ami-00000023 glExecWN_slc5_x86_64_kvm lsfadmin
>>>> available private i386 machine
>>>> IMAGE ami-00000024 glExecWN_slc6_x86_64_kvm lsfadmin
>>>> available private i386 machine
>>>>
>>>> which gives
>>>> [root at oneadmin02 ~]# cat /var/log/one/econe-server.log
>>>> --------------------------------------
>>>> Server configuration
>>>> --------------------------------------
>>>> {:template_location=>"/etc/one/ec2query_templates",
>>>> :views=>"/usr/lib/one/ruby/cloud/econe/views",
>>>> :instance_types=>
>>>> {:"batchslc5.small"=>{:template=>"batchslc5.small.erb"},
>>>> :"m1.small"=>{:template=>"m1.small.erb"},
>>>> :"batchslc5.medium"=>{:template=>"batchslc5.medium.erb"},
>>>> :"m1.medium"=>{:template=>"m1.medium.erb"},
>>>> :"batchslc5.large"=>{:template=>"batchslc5.large.erb"},
>>>> :"m1.large"=>{:template=>"m1.large.erb"},
>>>> :"batchslc6.small"=>{:template=>"batchslc6.small.erb"},
>>>> :"m1.xlarge"=>{:template=>"m1.xlarge.erb"},
>>>> :"batchslc6.medium"=>{:template=>"batchslc6.medium.erb"},
>>>> :"m1.huge"=>{:template=>"m1.huge.erb"},
>>>> :"batchslc6.large"=>{:template=>"batchslc6.large.erb"}},
>>>> :auth=>"ec2",
>>>> :one_xmlrpc=>"http://localhost:2633/RPC2",
>>>> :core_auth=>"cipher",
>>>> :server=>"oneadmin02.cern.ch",
>>>> :port=>4567}
>>>> 137.138.5.252 - - [06/Feb/2012 17:35:30] "POST / HTTP/1.1" 200 742 0.8725
>>>> 137.138.5.252 - - [06/Feb/2012 17:35:30] "POST / HTTP/1.1" 200 742 0.7949
>>>>
>>>> NoMethodError - undefined method `[]' for nil:NilClass:
>>>> /usr/lib/one/ruby/cloud/econe/EC2QueryServer.rb:216:in `render_state'
>>>> (erb):20:in `describe_instances'
>>>> /usr/lib/one/ruby/OpenNebula/XMLUtils.rb:326:in `call'
>>>> /usr/lib/one/ruby/OpenNebula/XMLUtils.rb:326:in `each_element'
>>>>
>>>> /usr/lib/ruby/gems/1.8/gems/nokogiri-1.4.3.1/lib/nokogiri/xml/node_set.rb:239:in
>>>> `each'
>>>>
>>>> /usr/lib/ruby/gems/1.8/gems/nokogiri-1.4.3.1/lib/nokogiri/xml/node_set.rb:238:in
>>>> `upto'
>>>>
>>>> /usr/lib/ruby/gems/1.8/gems/nokogiri-1.4.3.1/lib/nokogiri/xml/node_set.rb:238:in
>>>> `each'
>>>> /usr/lib/one/ruby/OpenNebula/XMLUtils.rb:324:in `each_element'
>>>> /usr/lib/one/ruby/OpenNebula/Pool.rb:100:in `each'
>>>> (erb):14:in `describe_instances'
>>>> /usr/lib/one/ruby/cloud/econe/EC2QueryServer.rb:181:in
>>>> `describe_instances'
>>>> /usr/lib/one/ruby/cloud/econe/econe-server.rb:176:in `do_http_request'
>>>> /usr/lib/one/ruby/cloud/econe/econe-server.rb:158:in `POST /'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1151:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1151:in
>>>> `compile!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:724:in
>>>> `instance_eval'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:724:in
>>>> `route_eval'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:708:in
>>>> `route!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:758:in
>>>> `process_route'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:755:in
>>>> `catch'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:755:in
>>>> `process_route'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:707:in
>>>> `route!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:706:in
>>>> `each'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:706:in
>>>> `route!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:843:in
>>>> `dispatch!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:644:in
>>>> `call!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in
>>>> `instance_eval'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in
>>>> `invoke'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in
>>>> `catch'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in
>>>> `invoke'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:644:in
>>>> `call!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:629:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/head.rb:9:in `call'
>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/commonlogger.rb:18:in
>>>> `call'
>>>>
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/showexceptions.rb:21:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/methodoverride.rb:24:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1272:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1303:in
>>>> `synchronize'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1272:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/content_length.rb:13:in
>>>> `call'
>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/chunked.rb:15:in `call'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:84:in
>>>> `pre_process'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:82:in
>>>> `catch'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:82:in
>>>> `pre_process'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:57:in
>>>> `process'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:42:in
>>>> `receive_data'
>>>>
>>>> /usr/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in
>>>> `run_machine'
>>>>
>>>> /usr/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in
>>>> `run'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/backends/base.rb:61:in
>>>> `start'
>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/server.rb:159:in `start'
>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/handler/thin.rb:14:in
>>>> `run'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1234:in
>>>> `run!'
>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/main.rb:25
>>>> /usr/lib/one/ruby/cloud/econe/econe-server.rb:165
>>>>
>>>> Weird. These errors are actually new. I didn't have them right after the
>>>> upgrade last week, when I mentioned the long response times.
>>>>
>>>> One thing that might give a clue: I have a cron job which queries the
>>>> system and counts the number of running VMs. If not all leases are
>>>> full, some new batch VMs are started. This "sometimes" seems to work;
>>>> however, if it does not, or if it takes too long to respond, there are
>>>> concurrent queries to the system (from up to 3 different client
>>>> machines).
>>>>
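The overlapping cron runs described above can at least be serialized per client machine. A sketch with a non-blocking exclusive file lock (assumptions: the lock-file path is arbitrary, and this only serializes runs on one host, not across the three client machines):

```ruby
require 'tmpdir'

# Guard the periodic VM-count query with a non-blocking exclusive lock,
# so a slow run makes the next cron cycle skip instead of piling up.
lock_path = File.join(Dir.tmpdir, 'one-query.lock')  # arbitrary path
acquired = File.open(lock_path, File::RDWR | File::CREAT, 0644) do |f|
  if f.flock(File::LOCK_EX | File::LOCK_NB)  # false if already locked
    # ... run the euca-describe-instances / onevm list query here ...
    true
  else
    false   # another run still holds the lock; skip this cycle
  end
end
puts acquired ? 'query ran' : 'skipped: previous run still active'
```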
>>>> From the output above I don't think the SSL proxy is causing the
>>>> problem. Could it be that one of my gems is too old or buggy?
>>>>
>>>> Thanks for your help!
>>>> Ulrich
>>>>
>>>>
>>>>
>>>>
>>>> On 02/06/2012 04:33 PM, Ruben S. Montero wrote:
>>>>> Hi
>>>>>
>>>>> Could you try interacting directly with the server (i.e. without the
>>>>> proxy part)? We are not able to reproduce this. The server makes a
>>>>> call equivalent to a onevm list and returns the output; in our
>>>>> installation with ~500 VMs we are seeing a 1s overhead because of the
>>>>> HTTP processing, but the overall response time is < 2s for the EC2
>>>>> interface and < 1s from the CLI.
>>>>>
>>>>> Cheers
>>>>>
>>>>> Ruben
>>>>>
>>>>> On Mon, Feb 6, 2012 at 11:42 AM, Ulrich Schwickerath
>>>>> <ulrich.schwickerath at cern.ch> wrote:
>>>>>> Hi, all,
>>>>>>
>>>>>> we are still experiencing problems after the upgrade to ONE 3.2.1. The
>>>>>> EC2 access is very, very slow and unstable. This morning, I noticed a
>>>>>> lot of blocked requests to econe, and error messages in the
>>>>>> econe-server log files:
>>>>>>
>>>>>>
>>>>>> NoMethodError - undefined method `[]' for nil:NilClass:
>>>>>> /usr/lib/one/ruby/cloud/econe/EC2QueryServer.rb:216:in `render_state'
>>>>>> (erb):20:in `describe_instances'
>>>>>> /usr/lib/one/ruby/OpenNebula/XMLUtils.rb:326:in `call'
>>>>>> /usr/lib/one/ruby/OpenNebula/XMLUtils.rb:326:in `each_element'
>>>>>> /usr/lib/ruby/gems/1.8/gems/nokogiri-1.4.3.1/lib/nokogiri/xml/node_set.rb:239:in `each'
>>>>>> /usr/lib/ruby/gems/1.8/gems/nokogiri-1.4.3.1/lib/nokogiri/xml/node_set.rb:238:in `upto'
>>>>>> /usr/lib/ruby/gems/1.8/gems/nokogiri-1.4.3.1/lib/nokogiri/xml/node_set.rb:238:in `each'
>>>>>> /usr/lib/one/ruby/OpenNebula/XMLUtils.rb:324:in `each_element'
>>>>>> /usr/lib/one/ruby/OpenNebula/Pool.rb:100:in `each'
>>>>>> (erb):14:in `describe_instances'
>>>>>> /usr/lib/one/ruby/cloud/econe/EC2QueryServer.rb:181:in `describe_instances'
>>>>>> /usr/lib/one/ruby/cloud/econe/econe-server.rb:176:in `do_http_request'
>>>>>> /usr/lib/one/ruby/cloud/econe/econe-server.rb:158:in `POST /'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1151:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1151:in `compile!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:724:in `instance_eval'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:724:in `route_eval'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:708:in `route!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:758:in `process_route'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:755:in `catch'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:755:in `process_route'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:707:in `route!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:706:in `each'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:706:in `route!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:843:in `dispatch!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:644:in `call!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in `instance_eval'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in `invoke'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in `catch'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:808:in `invoke'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:644:in `call!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:629:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/head.rb:9:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/commonlogger.rb:18:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/showexceptions.rb:21:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/methodoverride.rb:24:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1272:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1303:in `synchronize'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1272:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/content_length.rb:13:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/chunked.rb:15:in `call'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:84:in `pre_process'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:82:in `catch'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:82:in `pre_process'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:57:in `process'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/connection.rb:42:in `receive_data'
>>>>>> /usr/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run_machine'
>>>>>> /usr/lib/ruby/gems/1.8/gems/eventmachine-0.12.10/lib/eventmachine.rb:256:in `run'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/backends/base.rb:61:in `start'
>>>>>> /usr/lib/ruby/gems/1.8/gems/thin-1.2.8/lib/thin/server.rb:159:in `start'
>>>>>> /usr/lib/ruby/gems/1.8/gems/rack-1.1.0/lib/rack/handler/thin.rb:14:in `run'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/base.rb:1234:in `run!'
>>>>>> /usr/lib/ruby/gems/1.8/gems/sinatra-1.2.6/lib/sinatra/main.rb:25
>>>>>> /usr/lib/one/ruby/cloud/econe/econe-server.rb:165
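The trace points at `render_state` calling `[]` on nil. A plausible failure mode (an assumption, not verified against the 3.2.1 source) is a state-to-label lookup table with no entry for an unexpected STATE/LCM_STATE combination, so an intermediate lookup returns nil before `[]` is applied to it:

```ruby
# Minimal reproduction of this class of error. The shape of the lookup
# table is an assumed illustration, not the actual EC2QueryServer.rb
# code: a two-level state table with a missing entry yields nil, and
# nil[...] raises NoMethodError: undefined method `[]' for nil:NilClass.
EC2_STATES = {
  3 => { 3 => 'running' }   # only one STATE/LCM_STATE pair mapped here
}

def render_state(state, lcm_state)
  EC2_STATES[state][lcm_state]          # raises when state is unmapped
end

def render_state_safe(state, lcm_state)
  (EC2_STATES[state] || {})[lcm_state] || 'unknown'  # defensive variant
end
```

Notably, the stuck VM shown earlier in the thread (`onevm show 22976 -x`) has STATE 3 with LCM_STATE 0, which is exactly the sort of combination such a table might fail to cover.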
>>>>>>
>>>>>> What I mean by slow: getting a list of O(400) virtual machines takes
>>>>>> the system O(5-10) minutes, while local queries with onevm list take
>>>>>> ~2 seconds.
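A gap this large (minutes vs. seconds) is worth measuring per layer. A minimal Ruby timing helper, using the standard Benchmark module, lets each path be compared separately (the commands in the comment are illustrative; substitute the real EC2 endpoint and credentials):

```ruby
require 'benchmark'

# Sketch for narrowing down where the time goes: wrap each query path
# (direct CLI, econe server without the proxy, proxied EC2 endpoint)
# and compare wall-clock times. Benchmark.realtime is Ruby stdlib.
def timed(label)
  elapsed = Benchmark.realtime { yield }
  puts format('%-25s %7.2fs', label, elapsed)
  elapsed
end

# Example usage (hypothetical commands):
#   timed('onevm list')      { system('onevm list > /dev/null') }
#   timed('econe direct')    { system('econe-describe-instances > /dev/null') }
```

Comparing these separates the XML-RPC/core time (fast here, ~2s) from whatever the HTTP/SSL path adds on top.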
>>>>>>
>>>>>> We are currently bitten badly by this problem because we use this
>>>>>> mechanism to refill our virtual batch farm. Is there a downgrade path?
>>>>>>
>>>>>> Any ideas?
>>>>>>
>>>>>> Thanks,
>>>>>> Ulrich
>>>>>>
>>>>>> --
>>>>>> --------------------------------------
>>>>>> Dr. Ulrich Schwickerath
>>>>>> CERN IT/PES-PS
>>>>>> 1211 Geneva 23
>>>>>> e-mail: ulrich.schwickerath at cern.ch
>>>>>> phone: +41 22 767 9576
>>>>>> mobile: +41 76 487 5602
>>>>>>
>>>>>> _______________________________________________
>>>>>> Users mailing list
>>>>>> Users at lists.opennebula.org
>>>>>> http://lists.opennebula.org/listinfo.cgi/users-opennebula.org
>>>>>
>>>>>
>>>
>>
>
>
--
--------------------------------------
Dr. Ulrich Schwickerath
CERN IT/PES-PS
1211 Geneva 23
e-mail: ulrich.schwickerath at cern.ch
phone: +41 22 767 9576
mobile: +41 76 487 5602