[one-users] libvirt unstable with OpenNebula

Javier Fontan jfontan at gmail.com
Tue Mar 29 09:34:17 PDT 2011


Unfortunately I don't have a clue about what could be happening. What I can
tell you is that OpenNebula monitors both hosts and VMs using libvirt
commands, so maybe one of those commands makes libvirt unhappy. Can you try
executing the commands manually?

virsh --connect qemu:///system --readonly dominfo <vm>
virsh --connect qemu:///system --readonly list
virsh --connect qemu:///system --readonly dumpxml <vm>
virsh --connect qemu:///system --readonly domifstat <vm> <interface>

virsh -c qemu:///system nodeinfo
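
If you want to reproduce what the monitoring driver does, you can also run
the same probes in a loop. This is just a rough sketch: "one-29" and "vnet0"
are placeholders, substitute the domain name that "virsh list" reports and
the interface name from the dumpxml output:

# Repeat the read-only probes OpenNebula runs on each monitoring cycle.
# VM and IF are placeholders for your domain and its network interface.
VM=one-29
IF=vnet0
while true; do
    virsh --connect qemu:///system --readonly list
    virsh --connect qemu:///system --readonly dominfo "$VM"
    virsh --connect qemu:///system --readonly dumpxml "$VM"
    virsh --connect qemu:///system --readonly domifstat "$VM" "$IF"
    virsh -c qemu:///system nodeinfo
    sleep 10
done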

On Fri, Mar 25, 2011 at 7:07 AM, christophe bonnaud <takyon77 at gmail.com> wrote:
> Hi,
>
> I have installed OpenNebula version 2.0.1-1 with libvirt version 0.7.7.
>
> When I create a virtual machine, it works fine at first, but a few minutes
> later libvirt crashes.
>
> If I don't run any virtual machine, libvirt has no problem.
>
> If I start a machine manually using virsh:
>    - if OpenNebula is running: libvirt has the same problem.
>    - if OpenNebula is stopped: libvirt has no problem.
>
> It seems that every 10 minutes, when OpenNebula tries to monitor the
> virtual machine, libvirt crashes at that exact moment.
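>
> A small watch loop makes it easy to match the crash time against oned.log
> (just a sketch):
>
> # prints the time at the moment libvirtd disappears
> while true; do
>     pgrep libvirtd >/dev/null || { date; echo "libvirtd died"; break; }
>     sleep 1
> done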
>
> In oned.log I have:
>
> Fri Mar 25 14:59:02 2011 [ReM][D]: HostPoolInfo method invoked
> Fri Mar 25 14:59:02 2011 [ReM][D]: VirtualMachinePoolInfo method invoked
> Fri Mar 25 14:59:16 2011 [VMM][I]: Monitoring VM 29.
> Fri Mar 25 14:59:32 2011 [ReM][D]: HostPoolInfo method invoked
> Fri Mar 25 14:59:32 2011 [ReM][D]: VirtualMachinePoolInfo method invoked
>
> And if I run libvirtd under gdb with debug logging enabled:
>
> 14:59:16.135: debug : virEventRunOnce:592 : Poll on 8 handles 0x2aaaac00b390 timeout -1
> 14:59:16.135: debug : virEventRunOnce:594 : Poll got 1 event
> 14:59:16.135: debug : virEventDispatchTimeouts:404 : Dispatch 2
> 14:59:16.135: debug : virEventDispatchHandles:449 : Dispatch 8
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=0 w=1
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=1 w=2
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=2 w=3
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=3 w=4
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=4 w=5
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=5 w=6
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=6 w=7
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=7 w=8
> 14:59:16.135: debug : virEventDispatchHandles:476 : Dispatch n=7 f=15 w=8 e=4 0x8aa160
> 14:59:16.135: debug : virEventUpdateHandleImpl:146 : Update handle w=8 e=1
> 14:59:16.135: debug : virEventInterruptLocked:663 : Skip interrupt, 1 1084229952
> 14:59:16.135: debug : virEventCleanupTimeouts:494 : Cleanup 2
> 14:59:16.135: debug : virEventCleanupHandles:535 : Cleanupo 8
> 14:59:16.135: debug : virEventCleanupTimeouts:494 : Cleanup 2
> 14:59:16.135: debug : virEventCleanupHandles:535 : Cleanupo 8
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=0 w=1, f=5 e=1
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=1 w=2, f=7 e=1
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=2 w=3, f=12 e=25
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=3 w=4, f=13 e=25
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=4 w=5, f=14 e=1
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=5 w=6, f=10 e=25
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=6 w=7, f=9 e=25
> 14:59:16.135: debug : virEventMakePollFDs:372 : Prepare n=7 w=8, f=15 e=1
> 14:59:16.135: debug : virEventCalculateTimeout:313 : Calculate expiry of 2 timers
> 14:59:16.135: debug : virEventCalculateTimeout:343 : Timeout at 0 due in -1 ms
> 14:59:16.135: debug : virEventRunOnce:592 : Poll on 8 handles 0x2aaaac0078f0 timeout -1
> 14:59:16.135: debug : virEventRunOnce:594 : Poll got 1 event
> 14:59:16.135: debug : virEventDispatchTimeouts:404 : Dispatch 2
> 14:59:16.135: debug : virEventDispatchHandles:449 : Dispatch 8
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=0 w=1
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=1 w=2
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=2 w=3
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=3 w=4
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=4 w=5
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=5 w=6
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=6 w=7
> 14:59:16.135: debug : virEventDispatchHandles:463 : i=7 w=8
> 14:59:16.135: debug : virEventDispatchHandles:476 : Dispatch n=7 f=15 w=8 e=1 0x8aa160
> 14:59:16.135: debug : virEventUpdateHandleImpl:146 : Update handle w=8 e=1
> 14:59:16.135: debug : virEventInterruptLocked:663 : Skip interrupt, 1 1084229952
> 14:59:16.135: debug : virEventUpdateHandleImpl:146 : Update handle w=8 e=1
> 14:59:16.135: debug : virEventInterruptLocked:663 : Skip interrupt, 1 1084229952
> 14:59:16.136: debug : virEventCleanupTimeouts:494 : Cleanup 2
> 14:59:16.136: debug : virEventCleanupHandles:535 : Cleanupo 8
> 14:59:16.136: debug : remoteDispatchClientRequest:368 : prog=536903814 ver=1 type=0 satus=0 serial=6 proc=122
> 14:59:16.136: debug : virEventCleanupTimeouts:494 : Cleanup 2
> 14:59:16.136: debug : virEventCleanupHandles:535 : Cleanupo 8
> 14:59:16.136: debug : virEventMakePollFDs:372 : Prepare n=0 w=1, f=5 e=1
> 14:59:16.136: debug : virEventMakePollFDs:372 : Prepare n=1 w=2, f=7 e=1
> 14:59:16.136: debug : virEventMakePollFDs:372 : Prepare n=2 w=3, f=12 e=25
> 14:59:16.136: debug : virEventMakePollFDs:372 : Prepare n=3 w=4, f=13 e=25
>
> Program received signal SIGSEGV, Segmentation fault.
> [Switching to Thread 0x43204940 (LWP 32689)]
> 0x00000033c0a79a30 in strlen () from /lib64/libc.so.6
> (gdb) backtrace
> #0  0x00000033c0a79a30 in strlen () from /lib64/libc.so.6
> #1  0x000000000043d861 in qemudNodeGetSecurityModel (conn=<value optimized out>, secmodel=0x43203c20) at qemu/qemu_driver.c:4910
> #2  0x0000003972458589 in virNodeGetSecurityModel (conn=0x0, secmodel=0x0) at libvirt.c:5118
> #3  0x000000000041ef1b in remoteDispatchNodeGetSecurityModel (server=<value optimized out>, client=<value optimized out>, conn=0x2aaaac007740, hdr=<value optimized out>, rerr=0x43203f30, args=<value optimized out>, ret=0x43203e80) at remote.c:1306
> #4  0x000000000041fbc1 in remoteDispatchClientCall (server=0x8aa160, client=0x8ad920, msg=0x2aaaac03e700) at dispatch.c:506
> #5  0x000000000041ff62 in remoteDispatchClientRequest (server=0x8aa160, client=0x8ad920, msg=0x2aaaac03e700) at dispatch.c:388
> #6  0x0000000000416ad7 in qemudWorker (data=<value optimized out>) at libvirtd.c:1528
> #7  0x00000033c160673d in start_thread () from /lib64/libpthread.so.0
> #8  0x00000033c0ad3f6d in clone () from /lib64/libc.so.6
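>
> If I read the backtrace right, frame #1 shows qemudNodeGetSecurityModel
> handing strlen() what looks like a NULL security model string, and as far
> as I can tell "virsh dominfo" is the command that reaches
> virNodeGetSecurityModel. So looping over just that probe should show
> whether it is the trigger (again a sketch; "one-29" is a placeholder):
>
> while true; do
>     virsh --connect qemu:///system --readonly dominfo one-29 || break
>     sleep 2
> done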
>
> (I can provide more logs if necessary.)
>
> Has anyone met this situation, or does anyone have any clues about the
> origin of this problem?
>
> Cheers,
>
> Chris.
>
>
> --
> ------------------------------------------------------
> Bonnaud Christophe
> GSDC
> Korea Institute of Science and Technology Information
> Fax. +82-42-869-0789
> Tel. +82-42-869-0660
> Mobile +82-10-4664-3193
>



-- 
Javier Fontan, Grid & Virtualization Technology Engineer/Researcher
DSA Research Group: http://dsa-research.org
Globus GridWay Metascheduler: http://www.GridWay.org
OpenNebula Virtual Infrastructure Engine: http://www.OpenNebula.org


