ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_mount.yml ******************************************************
1 plays in /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml

PLAY [Basic mount snapshot test] ***********************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:2
Saturday 15 March 2025 18:22:26 -0400 (0:00:00.051) 0:00:00.051 ********
ok: [managed-node4]
META: ran handlers

TASK [Setup] *******************************************************************
task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:46
Saturday 15 March 2025 18:22:27 -0400 (0:00:01.238) 0:00:01.289 ********
included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml for managed-node4

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:10
Saturday 15 March 2025 18:22:27 -0400 (0:00:00.167) 0:00:01.457 ********
ok: [managed-node4] => { "changed": false, "stat": { "exists": false } }

TASK [Set mount parent] ********************************************************
task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:15
Saturday 15 March 2025 18:22:28 -0400 (0:00:00.750) 0:00:02.207 ********
ok: [managed-node4] => { "ansible_facts": { "test_mnt_parent": "/mnt" }, "changed": false }

TASK [Run the storage role install base packages] ******************************
task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:19
Saturday 15 March 2025 18:22:28 -0400 (0:00:00.064) 0:00:02.272 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path:
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 15 March 2025 18:22:28 -0400 (0:00:00.054) 0:00:02.326 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 15 March 2025 18:22:28 -0400 (0:00:00.047) 0:00:02.374 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 15 March 2025 18:22:28 -0400 (0:00:00.065) 0:00:02.440 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 15 March 2025 18:22:28 -0400 (0:00:00.148) 0:00:02.589 ******** ok: [managed-node4] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 15 March 2025 18:22:29 -0400 (0:00:00.347) 0:00:02.937 ******** ok: [managed-node4] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 15 March 2025 18:22:29 -0400 (0:00:00.053) 0:00:02.990 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 15 March 2025 18:22:29 -0400 (0:00:00.023) 0:00:03.014 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 15 March 2025 18:22:29 -0400 (0:00:00.023) 0:00:03.037 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 15 March 2025 18:22:29 -0400 (0:00:00.091) 0:00:03.129 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 15 March 2025 18:22:30 -0400 (0:00:01.577) 0:00:04.706 ******** ok: [managed-node4] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 15 March 2025 18:22:30 -0400 (0:00:00.048) 0:00:04.755 ******** ok: [managed-node4] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 15 March 2025 18:22:31 -0400 (0:00:00.048) 0:00:04.803 ******** ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 15 March 2025 18:22:31 -0400 (0:00:00.638) 0:00:05.442 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 15 March 2025 18:22:31 -0400 (0:00:00.090) 0:00:05.532 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 15 March 2025 18:22:31 -0400 (0:00:00.021) 0:00:05.554 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 15 March 2025 18:22:31 -0400 (0:00:00.024) 0:00:05.579 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 15 March 2025 18:22:31 -0400 (0:00:00.033) 0:00:05.612 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 15 March 2025 18:22:32 -0400 (0:00:00.582) 0:00:06.195 ******** ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:112.service": { "name": "lvm2-pvscan@8:112.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:128.service": { "name": "lvm2-pvscan@8:128.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:144.service": { "name": "lvm2-pvscan@8:144.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:16.service": { "name": "lvm2-pvscan@8:16.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:32.service": { "name": "lvm2-pvscan@8:32.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:48.service": { "name": "lvm2-pvscan@8:48.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:64.service": { "name": "lvm2-pvscan@8:64.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:80.service": { "name": "lvm2-pvscan@8:80.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:96.service": { "name": "lvm2-pvscan@8:96.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": 
"plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": 
"systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 15 March 2025 18:22:33 -0400 (0:00:01.073) 0:00:07.269 ******** ok: [managed-node4] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 15 March 2025 18:22:33 -0400 (0:00:00.052) 0:00:07.322 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 15 March 2025 18:22:33 -0400 (0:00:00.031) 0:00:07.353 ******** ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.485) 0:00:07.839 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.043) 0:00:07.883 ******** ok: [managed-node4] => { "changed": false, "stat": { "atime": 1742077193.3850884, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1742077181.3190467, "dev": 51713, "device_type": 
0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263705, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1742077181.3190467, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744072355367375", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.323) 0:00:08.206 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.088) 0:00:08.295 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.049) 0:00:08.345 ******** ok: [managed-node4] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.052) 0:00:08.397 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.055) 0:00:08.453 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.067) 0:00:08.520 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.083) 0:00:08.603 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.070) 0:00:08.674 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 15 March 2025 18:22:34 -0400 (0:00:00.087) 0:00:08.761 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 15 March 2025 18:22:35 -0400 (0:00:00.054) 0:00:08.816 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 15 March 2025 18:22:35 -0400 (0:00:00.048) 0:00:08.864 ******** ok: [managed-node4] => { "changed": false, "stat": { "atime": 1742077104.3967779, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 15 March 2025 18:22:35 -0400 (0:00:00.333) 0:00:09.197 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 15 March 2025 18:22:35 -0400 (0:00:00.038) 0:00:09.236 ******** ok: [managed-node4] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:25 Saturday 15 March 2025 18:22:36 -0400 (0:00:00.840) 0:00:10.077 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml for managed-node4 TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:5 Saturday 15 March 2025 18:22:36 -0400 (0:00:00.172) 0:00:10.249 ******** ok: [managed-node4] => { "changed": false, "stat": { "exists": false } } TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:10 Saturday 15 March 2025 18:22:36 -0400 (0:00:00.438) 0:00:10.687 ******** ok: [managed-node4] => { "ansible_facts": { "__snapshot_is_ostree": false }, "changed": false } TASK [Ensure test packages] **************************************************** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:14 Saturday 15 March 2025 18:22:36 -0400 (0:00:00.072) 0:00:10.760 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: util-linux TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:23 Saturday 15 March 2025 18:22:37 -0400 (0:00:00.753) 0:00:11.514 ******** ok: [managed-node4] => { "changed": false, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdj\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdk\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdl\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:31 Saturday 15 March 2025 18:22:38 -0400 (0:00:00.963) 0:00:12.477 ******** ok: [managed-node4] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ] }, "changed": false } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:36 Saturday 15 March 2025 18:22:38 -0400 (0:00:00.089) 0:00:12.567 ******** ok: [managed-node4] => { "unused_disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ] } TASK [Print info from find_unused_disk] **************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:44 Saturday 15 March 2025 18:22:38 -0400 (0:00:00.066) 0:00:12.633 ******** skipping: [managed-node4] => {} TASK [Show disk information] *************************************************** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:49 Saturday 15 March 2025 18:22:38 -0400 (0:00:00.052) 0:00:12.686 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:58 Saturday 15 March 2025 18:22:38 -0400 (0:00:00.039) 0:00:12.725 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create LVM logical volumes under volume groups] ************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:31 Saturday 15 March 2025 18:22:38 -0400 (0:00:00.035) 0:00:12.761 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.130) 0:00:12.892 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.075) 0:00:12.967 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.056) 0:00:13.024 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.144) 0:00:13.168 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.084) 0:00:13.253 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.050) 0:00:13.303 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.067) 0:00:13.371 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.060) 0:00:13.431 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 15 March 2025 18:22:39 -0400 (0:00:00.160) 0:00:13.592 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 15 March 2025 18:22:41 -0400 (0:00:01.838) 0:00:15.430 ******** ok: [managed-node4] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "test_vg1", "volumes": [ { "fs_type": "xfs", "name": "lv1", "size": "15%" }, { "fs_type": "xfs", "name": "lv2", "size": "50%" } ] }, { "disks": [ "sdd", "sde", "sdf" ], "name": "test_vg2", "volumes": [ { "fs_type": "xfs", "name": "lv3", "size": "10%" }, { "fs_type": "xfs", "name": "lv4", "size": "20%" } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "name": "test_vg3", "volumes": [ { "fs_type": "xfs", "name": "lv5", "size": "30%" }, { "fs_type": "xfs", "name": "lv6", "size": "25%" }, { "fs_type": "xfs", "name": "lv7", "size": "10%" }, { "fs_type": "xfs", "name": "lv8", "size": "10%" } ] } ] } TASK 
[fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 15 March 2025 18:22:41 -0400 (0:00:00.108) 0:00:15.539 ******** ok: [managed-node4] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 15 March 2025 18:22:41 -0400 (0:00:00.153) 0:00:15.692 ******** ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2", "xfsprogs" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 15 March 2025 18:22:46 -0400 (0:00:04.328) 0:00:20.021 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 15 March 2025 18:22:46 -0400 (0:00:00.098) 0:00:20.120 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 15 March 2025 18:22:46 -0400 (0:00:00.055) 0:00:20.175 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 15 March 2025 18:22:46 -0400 (0:00:00.052) 0:00:20.228 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 15 March 2025 18:22:46 -0400 (0:00:00.045) 0:00:20.273 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "xfsprogs-4.5.0-22.el7.x86_64 providing xfsprogs is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 xfsprogs TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 15 March 2025 18:22:47 -0400 (0:00:00.823) 0:00:21.097 ******** ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": 
"arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": 
"static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:112.service": { "name": "lvm2-pvscan@8:112.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:128.service": { "name": "lvm2-pvscan@8:128.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:144.service": { "name": "lvm2-pvscan@8:144.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:16.service": { "name": "lvm2-pvscan@8:16.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:32.service": { "name": "lvm2-pvscan@8:32.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:48.service": { "name": "lvm2-pvscan@8:48.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:64.service": { "name": "lvm2-pvscan@8:64.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:80.service": { "name": "lvm2-pvscan@8:80.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:96.service": { "name": "lvm2-pvscan@8:96.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": 
"systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 15 March 2025 18:22:48 -0400 (0:00:01.211) 0:00:22.309 ******** ok: 
[managed-node4] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 15 March 2025 18:22:48 -0400 (0:00:00.122) 0:00:22.432 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 15 March 2025 18:22:48 -0400 (0:00:00.056) 0:00:22.488 ******** changed: [managed-node4] => { "actions": [ { "action": "create format", "device": "/dev/sdj", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdi", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdh", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdg", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/test_vg3", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv8", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv7", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv6", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv5", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sdf", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sde", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdd", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/test_vg2", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/test_vg2-lv4", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg2-lv4", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg2-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg2-lv3", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sdc", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/test_vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/test_vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sdk", "/dev/sdl", "/dev/xvda1", "/dev/mapper/test_vg1-lv1", "/dev/mapper/test_vg1-lv2", "/dev/mapper/test_vg2-lv3", "/dev/mapper/test_vg2-lv4", "/dev/mapper/test_vg3-lv5", "/dev/mapper/test_vg3-lv6", "/dev/mapper/test_vg3-lv7", "/dev/mapper/test_vg3-lv8" ], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, 
"encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg1-lv1", "_kernel_device": "/dev/dm-7", "_mount_id": "/dev/mapper/test_vg1-lv1", "_raw_device": "/dev/mapper/test_vg1-lv1", "_raw_kernel_device": "/dev/dm-7", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "15%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg1-lv2", "_kernel_device": "/dev/dm-6", "_mount_id": "/dev/mapper/test_vg1-lv2", "_raw_device": "/dev/mapper/test_vg1-lv2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg2-lv3", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/test_vg2-lv3", "_raw_device": "/dev/mapper/test_vg2-lv3", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg2-lv4", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/test_vg2-lv4", "_raw_device": "/dev/mapper/test_vg2-lv4", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv4", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "20%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg3", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg3-lv5", "_kernel_device": "/dev/dm-3", "_mount_id": "/dev/mapper/test_vg3-lv5", "_raw_device": "/dev/mapper/test_vg3-lv5", "_raw_kernel_device": "/dev/dm-3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv5", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "30%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv6", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/test_vg3-lv6", "_raw_device": "/dev/mapper/test_vg3-lv6", "_raw_kernel_device": 
"/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv6", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/test_vg3-lv7", "_raw_device": "/dev/mapper/test_vg3-lv7", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv7", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv8", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/test_vg3-lv8", "_raw_device": "/dev/mapper/test_vg3-lv8", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv8", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 15 March 2025 18:22:56 -0400 (0:00:07.558) 0:00:30.046 ******** ok: [managed-node4] => { "changed": false, "cmd": [ "udevadm", "trigger", "--subsystem-match=block" ], "delta": "0:00:00.010426", "end": "2025-03-15 18:22:56.829307", "rc": 0, 
"start": "2025-03-15 18:22:56.818881" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 15 March 2025 18:22:56 -0400 (0:00:00.682) 0:00:30.729 ******** ok: [managed-node4] => { "changed": false, "stat": { "atime": 1742077193.3850884, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1742077181.3190467, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263705, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1742077181.3190467, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744072355367375", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 15 March 2025 18:22:57 -0400 (0:00:00.317) 0:00:31.046 ******** ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 15 March 2025 18:22:57 -0400 (0:00:00.600) 0:00:31.646 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 15 March 2025 18:22:57 -0400 (0:00:00.051) 0:00:31.698 ******** ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sdj", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdi", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdh", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdg", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/test_vg3", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv8", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv8", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv7", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv6", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg3-lv5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg3-lv5", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sdf", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sde", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdd", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/test_vg2", "fs_type": null }, { "action": "create device", "device": 
"/dev/mapper/test_vg2-lv4", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg2-lv4", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg2-lv3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg2-lv3", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sdc", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/test_vg1", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/test_vg1-lv2", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg1-lv2", "fs_type": "xfs" }, { "action": "create device", "device": "/dev/mapper/test_vg1-lv1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/test_vg1-lv1", "fs_type": "xfs" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sdk", "/dev/sdl", "/dev/xvda1", "/dev/mapper/test_vg1-lv1", "/dev/mapper/test_vg1-lv2", "/dev/mapper/test_vg2-lv3", "/dev/mapper/test_vg2-lv4", "/dev/mapper/test_vg3-lv5", "/dev/mapper/test_vg3-lv6", "/dev/mapper/test_vg3-lv7", "/dev/mapper/test_vg3-lv8" ], "mounts": [], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg1-lv1", "_kernel_device": "/dev/dm-7", "_mount_id": "/dev/mapper/test_vg1-lv1", "_raw_device": "/dev/mapper/test_vg1-lv1", "_raw_kernel_device": "/dev/dm-7", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "15%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg1-lv2", "_kernel_device": "/dev/dm-6", "_mount_id": "/dev/mapper/test_vg1-lv2", "_raw_device": "/dev/mapper/test_vg1-lv2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 
0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg2-lv3", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/test_vg2-lv3", "_raw_device": "/dev/mapper/test_vg2-lv3", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg2-lv4", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/test_vg2-lv4", "_raw_device": "/dev/mapper/test_vg2-lv4", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv4", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "20%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg3", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg3-lv5", "_kernel_device": "/dev/dm-3", "_mount_id": "/dev/mapper/test_vg3-lv5", "_raw_device": "/dev/mapper/test_vg3-lv5", "_raw_kernel_device": "/dev/dm-3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv5", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "30%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv6", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/test_vg3-lv6", "_raw_device": "/dev/mapper/test_vg3-lv6", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv6", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/test_vg3-lv7", "_raw_device": "/dev/mapper/test_vg3-lv7", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv7", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv8", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/test_vg3-lv8", "_raw_device": "/dev/mapper/test_vg3-lv8", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv8", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.200) 0:00:31.898 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg1-lv1", "_kernel_device": "/dev/dm-7", "_mount_id": "/dev/mapper/test_vg1-lv1", "_raw_device": "/dev/mapper/test_vg1-lv1", "_raw_kernel_device": "/dev/dm-7", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "15%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg1-lv2", "_kernel_device": "/dev/dm-6", "_mount_id": "/dev/mapper/test_vg1-lv2", "_raw_device": "/dev/mapper/test_vg1-lv2", "_raw_kernel_device": "/dev/dm-6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "50%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg2-lv3", "_kernel_device": "/dev/dm-5", "_mount_id": "/dev/mapper/test_vg2-lv3", "_raw_device": "/dev/mapper/test_vg2-lv3", "_raw_kernel_device": "/dev/dm-5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg2-lv4", "_kernel_device": "/dev/dm-4", "_mount_id": "/dev/mapper/test_vg2-lv4", "_raw_device": "/dev/mapper/test_vg2-lv4", "_raw_kernel_device": "/dev/dm-4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv4", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "20%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": 
false, "name": "test_vg3", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg3-lv5", "_kernel_device": "/dev/dm-3", "_mount_id": "/dev/mapper/test_vg3-lv5", "_raw_device": "/dev/mapper/test_vg3-lv5", "_raw_kernel_device": "/dev/dm-3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv5", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "30%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv6", "_kernel_device": "/dev/dm-2", "_mount_id": "/dev/mapper/test_vg3-lv6", "_raw_device": "/dev/mapper/test_vg3-lv6", "_raw_kernel_device": "/dev/dm-2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv6", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "25%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv7", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/test_vg3-lv7", "_raw_device": "/dev/mapper/test_vg3-lv7", "_raw_kernel_device": "/dev/dm-1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv7", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv8", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/test_vg3-lv8", "_raw_device": "/dev/mapper/test_vg3-lv8", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "mount_user": null, "name": "lv8", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "10%", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.247) 0:00:32.146 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.058) 0:00:32.205 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.050) 0:00:32.256 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.087) 0:00:32.343 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.053) 0:00:32.396 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.052) 0:00:32.449 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 15 March 2025 18:22:58 -0400 (0:00:00.049) 0:00:32.499 ******** ok: [managed-node4] => { "changed": false, "stat": { "atime": 1742077104.3967779, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 15 March 2025 18:22:59 -0400 (0:00:00.309) 0:00:32.808 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 15 March 2025 18:22:59 -0400 (0:00:00.034) 0:00:32.843 ******** ok: [managed-node4] TASK [Run the snapshot role to create snapshot LVs] **************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:49 Saturday 15 March 2025 18:22:59 -0400 (0:00:00.720) 0:00:33.563 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:22:59 -0400 (0:00:00.122) 0:00:33.686 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:00 -0400 (0:00:00.093) 0:00:33.780 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:00 -0400 (0:00:00.068) 0:00:33.849 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:00 -0400 (0:00:00.046) 0:00:33.895 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:00 -0400 (0:00:00.044) 0:00:33.940 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional 
result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:00 -0400 (0:00:00.103) 0:00:34.044 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module snapshot] ******* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:00 -0400 (0:00:00.617) 0:00:34.661 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:03 -0400 (0:00:02.948) 0:00:37.609 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:03 -0400 (0:00:00.042) 0:00:37.652 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:03 -0400 (0:00:00.041) 0:00:37.694 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:03 -0400 (0:00:00.038) 0:00:37.732 ******** skipping: [managed-node4] => {} TASK [Verify the snapshot LVs are created] ************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:58 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.044) 0:00:37.776 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.128) 0:00:37.905 ******** included: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.239) 0:00:38.144 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.071) 0:00:38.216 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.084) 0:00:38.300 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.061) 0:00:38.362 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:04 -0400 (0:00:00.150) 0:00:38.512 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module check] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:05 -0400 (0:00:00.673) 0:00:39.186 ******** ok: [managed-node4] => { "changed": false, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:07 -0400 (0:00:01.661) 0:00:40.847 ******** ok: [managed-node4] => { 
"snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.042) 0:00:40.890 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.041) 0:00:40.931 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.035) 0:00:40.967 ******** skipping: [managed-node4] => {} TASK [Mount the snapshot for lv1] ********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:67 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.037) 0:00:41.005 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.091) 0:00:41.096 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.062) 0:00:41.158 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.045) 0:00:41.204 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.035) 0:00:41.240 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.036) 0:00:41.277 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", 
"skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:07 -0400 (0:00:00.090) 0:00:41.367 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module mount] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:08 -0400 (0:00:00.589) 0:00:41.957 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:08 -0400 (0:00:00.530) 0:00:42.487 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:08 -0400 (0:00:00.074) 0:00:42.562 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:08 -0400 (0:00:00.042) 0:00:42.605 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:08 -0400 (0:00:00.038) 0:00:42.644 ******** skipping: [managed-node4] => {} TASK [Assert changes for mount] ************************************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:78 Saturday 15 March 2025 18:23:08 -0400 (0:00:00.036) 0:00:42.680 ******** ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Mount the snapshot for lv2] ********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:82 Saturday 15 March 2025 18:23:08 -0400 
(0:00:00.042) 0:00:42.723 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:09 -0400 (0:00:00.114) 0:00:42.837 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:09 -0400 (0:00:00.063) 0:00:42.901 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:09 -0400 (0:00:00.046) 0:00:42.947 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:09 -0400 (0:00:00.036) 0:00:42.984 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:09 -0400 (0:00:00.047) 0:00:43.031 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:09 -0400 (0:00:00.121) 0:00:43.152 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module mount] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:10 -0400 (0:00:00.743) 0:00:43.896 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", 
"return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:10 -0400 (0:00:00.579) 0:00:44.475 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:10 -0400 (0:00:00.069) 0:00:44.545 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:10 -0400 (0:00:00.103) 0:00:44.648 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.126) 0:00:44.775 ******** skipping: [managed-node4] => {} TASK [Mount the snapshot for lv7] ********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:93 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.118) 0:00:44.893 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.248) 0:00:45.142 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.075) 0:00:45.218 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.070) 0:00:45.289 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.054) 0:00:45.343 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.053) 
0:00:45.397 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:11 -0400 (0:00:00.158) 0:00:45.556 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module mount] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:12 -0400 (0:00:00.825) 0:00:46.382 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:13 -0400 (0:00:00.707) 0:00:47.089 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:13 -0400 (0:00:00.066) 0:00:47.155 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:13 -0400 (0:00:00.081) 0:00:47.236 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:13 -0400 (0:00:00.076) 0:00:47.313 ******** skipping: [managed-node4] => {} TASK [Mount the origin for lv6] ************************************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:104 Saturday 15 March 2025 18:23:13 -0400 (0:00:00.113) 0:00:47.426 
******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:14 -0400 (0:00:00.494) 0:00:47.921 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:14 -0400 (0:00:00.149) 0:00:48.070 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:14 -0400 (0:00:00.182) 0:00:48.253 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:14 -0400 (0:00:00.137) 0:00:48.390 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:14 -0400 (0:00:00.061) 0:00:48.452 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:14 -0400 (0:00:00.161) 0:00:48.613 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module mount] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:15 -0400 (0:00:00.769) 0:00:49.383 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK 
[fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.738) 0:00:50.121 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.086) 0:00:50.208 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.061) 0:00:50.269 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.063) 0:00:50.333 ******** skipping: [managed-node4] => {} TASK [Mount the snapshot for lv1 again for idempotence] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:116 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.058) 0:00:50.392 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.177) 0:00:50.569 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.069) 0:00:50.639 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.049) 0:00:50.689 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:16 -0400 (0:00:00.069) 0:00:50.759 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:17 -0400 (0:00:00.114) 0:00:50.874 ******** 
skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:17 -0400 (0:00:00.214) 0:00:51.088 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module mount] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:18 -0400 (0:00:00.911) 0:00:51.999 ******** ok: [managed-node4] => { "changed": false, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.958) 0:00:52.958 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.114) 0:00:53.072 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.083) 0:00:53.156 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.062) 0:00:53.219 ******** skipping: [managed-node4] => {} TASK [Assert no changes for mount] ********************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:127 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.059) 0:00:53.278 ******** ok: [managed-node4] 
=> { "changed": false } MSG: All assertions passed TASK [Umount the snapshot for lv1] ********************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:131 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.050) 0:00:53.329 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.128) 0:00:53.457 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.068) 0:00:53.525 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.072) 0:00:53.598 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.085) 0:00:53.683 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:19 -0400 (0:00:00.060) 0:00:53.744 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:20 -0400 (0:00:00.169) 0:00:53.913 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot 
module umount] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:21 -0400 (0:00:01.064) 0:00:54.978 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.795) 0:00:55.773 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.081) 0:00:55.855 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.120) 0:00:55.975 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.072) 0:00:56.048 ******** skipping: [managed-node4] => {} TASK [Assert changes for umount] *********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:141 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.075) 0:00:56.123 ******** ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Umount again to check idempotence] *************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:145 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.058) 0:00:56.181 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.148) 0:00:56.330 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.062) 0:00:56.392 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.048) 0:00:56.441 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } 
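
The umount and idempotence steps being exercised at this point in the log are typically driven by test tasks of roughly the following shape. This is a sketch only, not the actual contents of tests_mount.yml: the role variable names (snapshot_lvm_action, snapshot_lvm_vg, snapshot_lvm_lv, snapshot_lvm_mountpoint) are taken from the public snapshot role documentation and are assumptions here, and the mountpoint path is a placeholder.

# Illustrative sketch (assumed variable names, placeholder mountpoint)
- name: Umount the snapshot for lv1
  include_role:
    name: fedora.linux_system_roles.snapshot
  vars:
    snapshot_lvm_action: umount          # unmount the snapshot of test_vg1/lv1
    snapshot_lvm_vg: test_vg1
    snapshot_lvm_lv: lv1
    snapshot_lvm_mountpoint: /mnt/lv1_mnt   # placeholder path

- name: Assert changes for umount
  assert:
    that: snapshot_cmd["changed"]        # first umount should report changed=true

- name: Umount again to check idempotence
  include_role:
    name: fedora.linux_system_roles.snapshot
  vars:
    snapshot_lvm_action: umount
    snapshot_lvm_vg: test_vg1
    snapshot_lvm_lv: lv1
    snapshot_lvm_mountpoint: /mnt/lv1_mnt   # placeholder path

- name: Assert no changes for umount
  assert:
    that: not snapshot_cmd["changed"]    # repeat umount should be a no-op

In the log itself this is exactly what happens: the first "Run snapshot module umount" reports changed=true and the repeated run reports changed=false, which is what the two assertions verify via the snapshot_cmd fact set by the role.
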
TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.038) 0:00:56.480 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.039) 0:00:56.519 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:22 -0400 (0:00:00.091) 0:00:56.610 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module umount] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:23 -0400 (0:00:00.727) 0:00:57.338 ******** ok: [managed-node4] => { "changed": false, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:24 -0400 (0:00:00.883) 0:00:58.222 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:24 -0400 (0:00:00.128) 0:00:58.350 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:24 -0400 (0:00:00.113) 0:00:58.464 ******** skipping: [managed-node4] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:24 -0400 (0:00:00.063) 0:00:58.528 ******** skipping: [managed-node4] => {} TASK [Assert no changes for umount] ******************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:155 Saturday 15 March 2025 18:23:24 -0400 (0:00:00.049) 0:00:58.577 ******** ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Umount the snapshot for lv2] ********************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:159 Saturday 15 March 2025 18:23:24 -0400 (0:00:00.068) 0:00:58.645 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:25 -0400 (0:00:00.168) 0:00:58.814 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:25 -0400 (0:00:00.061) 0:00:58.876 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:25 -0400 (0:00:00.048) 0:00:58.925 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:25 -0400 (0:00:00.037) 0:00:58.962 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:25 -0400 (0:00:00.047) 0:00:59.011 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, 
"item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:25 -0400 (0:00:00.175) 0:00:59.187 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module umount] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:26 -0400 (0:00:00.683) 0:00:59.871 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:26 -0400 (0:00:00.538) 0:01:00.409 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:26 -0400 (0:00:00.065) 0:01:00.474 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:26 -0400 (0:00:00.065) 0:01:00.540 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:26 -0400 (0:00:00.058) 0:01:00.598 ******** skipping: [managed-node4] => {} TASK [Umount the snapshot for lv7] ********************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:169 Saturday 15 March 2025 18:23:26 -0400 (0:00:00.059) 0:01:00.657 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:27 -0400 (0:00:00.230) 0:01:00.888 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:27 -0400 (0:00:00.099) 0:01:00.988 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if 
system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:27 -0400 (0:00:00.079) 0:01:01.067 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:27 -0400 (0:00:00.064) 0:01:01.132 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:27 -0400 (0:00:00.060) 0:01:01.193 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:27 -0400 (0:00:00.170) 0:01:01.364 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module umount] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:28 -0400 (0:00:00.855) 0:01:02.219 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.642) 0:01:02.862 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.062) 0:01:02.924 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, 
"changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.060) 0:01:02.985 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.059) 0:01:03.044 ******** skipping: [managed-node4] => {} TASK [Umount the origin for lv6] *********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:179 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.076) 0:01:03.120 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.371) 0:01:03.492 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.184) 0:01:03.676 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:29 -0400 (0:00:00.078) 0:01:03.754 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:30 -0400 (0:00:00.116) 0:01:03.871 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:30 -0400 (0:00:00.108) 0:01:03.979 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:30 -0400 (0:00:00.153) 0:01:04.133 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module umount] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:31 -0400 (0:00:01.020) 0:01:05.153 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:31 -0400 (0:00:00.490) 0:01:05.644 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:31 -0400 (0:00:00.042) 0:01:05.687 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:31 -0400 (0:00:00.054) 0:01:05.741 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.043) 0:01:05.785 ******** skipping: [managed-node4] => {} TASK [Run the snapshot role remove the snapshot LVs] *************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:189 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.112) 0:01:05.897 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.310) 0:01:06.208 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.107) 0:01:06.315 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is 
ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.085) 0:01:06.400 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.060) 0:01:06.461 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.055) 0:01:06.517 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:32 -0400 (0:00:00.143) 0:01:06.661 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module remove] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:33 -0400 (0:00:00.902) 0:01:07.563 ******** changed: [managed-node4] => { "changed": true, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:36 -0400 (0:00:02.618) 0:01:10.182 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:36 -0400 (0:00:00.106) 0:01:10.289 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": true, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } 
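[Note on the two removal passes recorded here: the "Run snapshot module remove" task above reports changed=true, and the next test task, "Use the snapshot_lvm_verify option to make sure remove is done", re-runs the role and, further below, comes back ok with changed=false. For readers reconstructing the test from this log, the driving tasks in tests_mount.yml presumably look roughly like the sketch below. This is a minimal, hypothetical reconstruction: the variable names snapshot_lvm_action, snapshot_lvm_verify_only and the snapset name are assumptions inferred from the task names in this log, not quoted from the test source.]

    # Hypothetical sketch of the test tasks around tests_mount.yml:189 and :196.
    # All variable names and values below are assumed, not copied from the source.
    - name: Run the snapshot role remove the snapshot LVs
      include_role:
        name: fedora.linux_system_roles.snapshot
      vars:
        snapshot_lvm_action: remove              # assumed action keyword
        snapshot_lvm_snapset_name: snapset1      # placeholder snapshot-set name

    - name: Use the snapshot_lvm_verify option to make sure remove is done
      include_role:
        name: fedora.linux_system_roles.snapshot
      vars:
        snapshot_lvm_action: remove
        snapshot_lvm_verify_only: true           # assumed flag: verify only, change nothing
        snapshot_lvm_snapset_name: snapset1

[Even if the real variable names differ, the shape is what matters: the same action is issued twice, and the second, verify-only pass finishing with ok/changed=false is what the log below records.]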
TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:36 -0400 (0:00:00.087) 0:01:10.376 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:36 -0400 (0:00:00.063) 0:01:10.440 ******** skipping: [managed-node4] => {} TASK [Use the snapshot_lvm_verify option to make sure remove is done] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:196 Saturday 15 March 2025 18:23:36 -0400 (0:00:00.046) 0:01:10.486 ******** TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:3 Saturday 15 March 2025 18:23:36 -0400 (0:00:00.189) 0:01:10.675 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.snapshot : Ensure ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:36 -0400 (0:00:00.090) 0:01:10.765 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Check if system is ostree] ********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:10 Saturday 15 March 2025 18:23:37 -0400 (0:00:00.089) 0:01:10.854 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:15 Saturday 15 March 2025 18:23:37 -0400 (0:00:00.064) 0:01:10.919 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/set_vars.yml:19 Saturday 15 March 2025 18:23:37 -0400 (0:00:00.057) 0:01:10.976 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__snapshot_packages": [ "lvm2", "util-linux" ], "__snapshot_python": "/usr/bin/python" }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Ensure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Saturday 15 March 2025 18:23:37 -0400 (0:00:00.203) 0:01:11.180 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: lvm2 util-linux TASK [fedora.linux_system_roles.snapshot : Run snapshot module remove] ********* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 Saturday 15 March 2025 18:23:38 -0400 (0:00:00.939) 0:01:12.120 ******** ok: [managed-node4] => { "changed": false, "errors": "", "message": "", "return_code": 0 } TASK [fedora.linux_system_roles.snapshot : Print out response] ***************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:40 Saturday 15 March 2025 18:23:39 -0400 (0:00:01.171) 0:01:13.291 ******** ok: [managed-node4] => { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } } TASK [fedora.linux_system_roles.snapshot : Set result] ************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:45 Saturday 15 March 2025 18:23:39 -0400 (0:00:00.070) 0:01:13.362 ******** ok: [managed-node4] => { "ansible_facts": { "snapshot_cmd": { "changed": false, "errors": "", "failed": false, "message": "", "msg": "", "return_code": 0 } }, "changed": false } TASK [fedora.linux_system_roles.snapshot : Set snapshot_facts to the JSON results] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:49 Saturday 15 March 2025 18:23:39 -0400 (0:00:00.087) 0:01:13.450 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.snapshot : Show errors] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:54 Saturday 15 March 2025 18:23:39 -0400 (0:00:00.105) 0:01:13.555 ******** skipping: [managed-node4] => {} TASK [Cleanup] ***************************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:204 Saturday 15 March 2025 18:23:39 -0400 (0:00:00.060) 0:01:13.615 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml for managed-node4 TASK [Remove storage volumes] ************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml:7 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.217) 0:01:13.833 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.111) 0:01:13.944 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure 
ansible_facts used by role] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.113) 0:01:14.057 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.097) 0:01:14.155 ******** skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.240) 0:01:14.395 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.090) 0:01:14.486 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.079) 0:01:14.566 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.065) 0:01:14.631 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 15 March 2025 18:23:40 -0400 (0:00:00.057) 0:01:14.689 ******** included: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 15 March 2025 18:23:41 -0400 (0:00:00.133) 0:01:14.822 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 15 March 2025 18:23:42 -0400 (0:00:01.476) 0:01:16.299 ******** ok: [managed-node4] => { "storage_pools": [ { "disks": [ "sda", "sdb", "sdc" ], "name": "test_vg1", "state": "absent", "volumes": [ { "name": "lv1", "state": "absent" }, { "name": "lv2", "state": "absent" } ] }, { "disks": [ "sdd", "sde", "sdf" ], "name": "test_vg2", "state": "absent", "volumes": [ { "name": "lv3", "state": "absent" }, { "name": "lv4", "state": "absent" } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "name": "test_vg3", "state": "absent", "volumes": [ { "name": "lv5", "state": "absent" }, { "name": "lv6", "state": "absent" }, { "name": "lv7", "state": "absent" }, { "name": "lv8", "state": "absent" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 15 March 2025 18:23:42 -0400 (0:00:00.117) 0:01:16.416 ******** ok: [managed-node4] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 15 March 2025 18:23:42 -0400 (0:00:00.090) 0:01:16.507 ******** ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 15 March 2025 18:23:48 -0400 (0:00:05.407) 0:01:21.915 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 15 March 2025 18:23:48 -0400 (0:00:00.085) 0:01:22.000 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 15 March 2025 18:23:48 -0400 (0:00:00.051) 0:01:22.052 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 15 March 2025 18:23:48 -0400 (0:00:00.048) 0:01:22.101 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Saturday 15 March 2025 18:23:48 -0400 (0:00:00.053) 0:01:22.154 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Saturday 15 March 2025 18:23:49 -0400 (0:00:00.826) 0:01:22.980 ******** ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:112.service": { "name": "lvm2-pvscan@8:112.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:128.service": { "name": "lvm2-pvscan@8:128.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:144.service": { "name": "lvm2-pvscan@8:144.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:16.service": { "name": "lvm2-pvscan@8:16.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:32.service": { "name": "lvm2-pvscan@8:32.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:48.service": { "name": "lvm2-pvscan@8:48.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:64.service": { "name": "lvm2-pvscan@8:64.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:80.service": { "name": "lvm2-pvscan@8:80.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "lvm2-pvscan@8:96.service": { "name": "lvm2-pvscan@8:96.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": 
"plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { 
"name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Saturday 15 March 2025 18:23:50 -0400 (0:00:01.213) 0:01:24.193 ******** ok: [managed-node4] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Saturday 15 March 2025 18:23:50 -0400 (0:00:00.088) 0:01:24.282 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Saturday 15 March 2025 18:23:50 -0400 (0:00:00.060) 0:01:24.342 ******** changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv8", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv7", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv6", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv5", "fs_type": null }, { "action": "destroy device", "device": "/dev/test_vg3", "fs_type": null }, { "action": "destroy format", 
"device": "/dev/sdh", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdj", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdg", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdi", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/mapper/test_vg2-lv4", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg2-lv4", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg2-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg2-lv3", "fs_type": null }, { "action": "destroy device", "device": "/dev/test_vg2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdd", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdf", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sde", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/mapper/test_vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/test_vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdk", "/dev/sdl", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/sdj", "/dev/xvda1" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg1-lv1", "_mount_id": "/dev/mapper/test_vg1-lv1", "_raw_device": "/dev/mapper/test_vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 482344960, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg1-lv2", "_mount_id": "/dev/mapper/test_vg1-lv2", "_raw_device": "/dev/mapper/test_vg1-lv2", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1606418432, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg2-lv3", "_mount_id": "/dev/mapper/test_vg2-lv3", "_raw_device": "/dev/mapper/test_vg2-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 322961408, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg2-lv4", "_mount_id": "/dev/mapper/test_vg2-lv4", "_raw_device": "/dev/mapper/test_vg2-lv4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv4", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 641728512, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], 
"encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg3", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg3-lv5", "_mount_id": "/dev/mapper/test_vg3-lv5", "_raw_device": "/dev/mapper/test_vg3-lv5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv5", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1283457024, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv6", "_mount_id": "/dev/mapper/test_vg3-lv6", "_raw_device": "/dev/mapper/test_vg3-lv6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv6", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1069547520, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv7", "_mount_id": "/dev/mapper/test_vg3-lv7", "_raw_device": "/dev/mapper/test_vg3-lv7", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv7", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 427819008, "state": "absent", "thin": 
false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv8", "_mount_id": "/dev/mapper/test_vg3-lv8", "_raw_device": "/dev/mapper/test_vg3-lv8", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv8", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 427819008, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Saturday 15 March 2025 18:24:00 -0400 (0:00:10.319) 0:01:34.661 ******** ok: [managed-node4] => { "changed": false, "cmd": [ "udevadm", "trigger", "--subsystem-match=block" ], "delta": "0:00:00.007145", "end": "2025-03-15 18:24:01.215541", "rc": 0, "start": "2025-03-15 18:24:01.208396" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Saturday 15 March 2025 18:24:01 -0400 (0:00:00.457) 0:01:35.119 ******** ok: [managed-node4] => { "changed": false, "stat": { "atime": 1742077193.3850884, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1742077181.3190467, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263705, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1742077181.3190467, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744072355367375", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Saturday 15 March 2025 18:24:01 -0400 (0:00:00.496) 0:01:35.615 ******** ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 15 March 2025 18:24:02 -0400 (0:00:00.669) 0:01:36.285 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Saturday 15 March 2025 18:24:02 -0400 (0:00:00.100) 0:01:36.386 ******** ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv8", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv8", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv7", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv6", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg3-lv5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg3-lv5", "fs_type": null }, { "action": "destroy device", "device": "/dev/test_vg3", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdh", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdj", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdg", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdi", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/mapper/test_vg2-lv4", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg2-lv4", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg2-lv3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg2-lv3", "fs_type": null }, { "action": "destroy device", "device": "/dev/test_vg2", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdd", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdf", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sde", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/mapper/test_vg1-lv2", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg1-lv2", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/test_vg1-lv1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/test_vg1-lv1", "fs_type": null }, { "action": "destroy device", "device": "/dev/test_vg1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sdb", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sdc", "fs_type": "lvmpv" }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdk", "/dev/sdl", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/sdj", "/dev/xvda1" ], "mounts": [], "packages": [ "e2fsprogs" ], "pools": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg1-lv1", "_mount_id": "/dev/mapper/test_vg1-lv1", "_raw_device": "/dev/mapper/test_vg1-lv1", "cache_devices": 
[], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 482344960, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg1-lv2", "_mount_id": "/dev/mapper/test_vg1-lv2", "_raw_device": "/dev/mapper/test_vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1606418432, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg2-lv3", "_mount_id": "/dev/mapper/test_vg2-lv3", "_raw_device": "/dev/mapper/test_vg2-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 322961408, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": 
"/dev/mapper/test_vg2-lv4", "_mount_id": "/dev/mapper/test_vg2-lv4", "_raw_device": "/dev/mapper/test_vg2-lv4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv4", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 641728512, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg3", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg3-lv5", "_mount_id": "/dev/mapper/test_vg3-lv5", "_raw_device": "/dev/mapper/test_vg3-lv5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv5", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1283457024, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv6", "_mount_id": "/dev/mapper/test_vg3-lv6", "_raw_device": "/dev/mapper/test_vg3-lv6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv6", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1069547520, 
"state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv7", "_mount_id": "/dev/mapper/test_vg3-lv7", "_raw_device": "/dev/mapper/test_vg3-lv7", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv7", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 427819008, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv8", "_mount_id": "/dev/mapper/test_vg3-lv8", "_raw_device": "/dev/mapper/test_vg3-lv8", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv8", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 427819008, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Saturday 15 March 2025 18:24:02 -0400 (0:00:00.208) 0:01:36.594 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg1", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg1-lv1", "_mount_id": "/dev/mapper/test_vg1-lv1", "_raw_device": "/dev/mapper/test_vg1-lv1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 482344960, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg1-lv2", "_mount_id": "/dev/mapper/test_vg1-lv2", "_raw_device": "/dev/mapper/test_vg1-lv2", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda", "sdb", "sdc" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv2", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1606418432, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg2", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg2-lv3", "_mount_id": "/dev/mapper/test_vg2-lv3", "_raw_device": "/dev/mapper/test_vg2-lv3", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdd", "sde", "sdf" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv3", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 322961408, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg2-lv4", "_mount_id": "/dev/mapper/test_vg2-lv4", "_raw_device": "/dev/mapper/test_vg2-lv4", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdd", "sde", "sdf" ], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv4", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 641728512, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] }, { "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "test_vg3", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "absent", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/test_vg3-lv5", "_mount_id": "/dev/mapper/test_vg3-lv5", "_raw_device": "/dev/mapper/test_vg3-lv5", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv5", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1283457024, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv6", "_mount_id": "/dev/mapper/test_vg3-lv6", "_raw_device": "/dev/mapper/test_vg3-lv6", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv6", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 1069547520, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv7", "_mount_id": "/dev/mapper/test_vg3-lv7", "_raw_device": "/dev/mapper/test_vg3-lv7", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv7", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 427819008, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, { "_device": "/dev/mapper/test_vg3-lv8", "_mount_id": "/dev/mapper/test_vg3-lv8", "_raw_device": "/dev/mapper/test_vg3-lv8", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sdg", "sdh", "sdi", "sdj" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "lv8", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 427819008, "state": "absent", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Saturday 15 March 2025 18:24:02 -0400 (0:00:00.140) 0:01:36.734 ******** ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Saturday 15 March 2025 18:24:03 -0400 (0:00:00.138) 0:01:36.873 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Saturday 15 March 2025 18:24:03 -0400 (0:00:00.081) 0:01:36.954 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Saturday 15 March 2025 18:24:03 -0400 (0:00:00.123) 0:01:37.078 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Saturday 15 March 2025 18:24:03 -0400 (0:00:00.093) 0:01:37.171 ******** TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Saturday 15 March 2025 18:24:03 -0400 (0:00:00.148) 0:01:37.319 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Saturday 15 March 2025 18:24:03 -0400 (0:00:00.078) 0:01:37.398 ******** ok: [managed-node4] => { "changed": false, "stat": { "atime": 1742077104.3967779, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Saturday 15 March 2025 18:24:04 -0400 (0:00:00.392) 0:01:37.790 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Saturday 15 March 2025 18:24:04 -0400 (0:00:00.048) 0:01:37.839 ******** ok: [managed-node4] TASK [Save unused_disk_return before verify] *********************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml:30 Saturday 15 March 2025 18:24:05 -0400 (0:00:01.041) 0:01:38.880 ******** ok: [managed-node4] => { "ansible_facts": { "unused_disks_before": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ] }, "changed": false } TASK [Verify that pools/volumes used in test are removed] ********************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml:34 Saturday 15 March 2025 18:24:05 -0400 (0:00:00.083) 0:01:38.964 ******** included: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml for managed-node4 TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:5 Saturday 15 March 2025 18:24:05 -0400 (0:00:00.223) 0:01:39.187 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:10 Saturday 15 March 2025 18:24:05 -0400 
(0:00:00.052) 0:01:39.240 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure test packages] **************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:14 Saturday 15 March 2025 18:24:05 -0400 (0:00:00.050) 0:01:39.290 ******** ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: util-linux TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:23 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.875) 0:01:40.166 ******** ok: [managed-node4] => { "changed": false, "disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdj\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdk\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdl\" TYPE=\"disk\" SIZE=\"1073741824\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:31 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.353) 0:01:40.519 ******** ok: [managed-node4] => { "ansible_facts": { "unused_disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ] }, "changed": false } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:36 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.041) 0:01:40.561 ******** ok: [managed-node4] => { "unused_disks": [ "sda", "sdb", "sdc", "sdd", "sde", "sdf", "sdg", "sdh", "sdi", "sdj" ] } TASK [Print info from find_unused_disk] **************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:44 Saturday 15 March 2025 18:24:06 
-0400 (0:00:00.039) 0:01:40.601 ******** skipping: [managed-node4] => {} TASK [Show disk information] *************************************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:49 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.044) 0:01:40.645 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:58 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.049) 0:01:40.694 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Debug why list of unused disks has changed] ****************************** task path: /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml:40 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.047) 0:01:40.742 ******** skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node4 : ok=172 changed=12 unreachable=0 failed=0 skipped=121 rescued=0 ignored=0 Saturday 15 March 2025 18:24:06 -0400 (0:00:00.018) 0:01:40.760 ******** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.32s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 7.56s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Get required packages --------------- 5.41s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.33s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.snapshot : Run snapshot module snapshot ------- 2.95s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 fedora.linux_system_roles.snapshot : Run snapshot module remove --------- 2.62s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 fedora.linux_system_roles.storage : Make sure blivet is available ------- 1.84s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.snapshot : Run snapshot module check ---------- 1.66s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 fedora.linux_system_roles.storage : Make sure blivet is available ------- 1.58s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.storage : Make sure blivet is available ------- 1.48s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Gathering Facts --------------------------------------------------------- 1.24s 
/tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:2 fedora.linux_system_roles.storage : Get service facts ------------------- 1.21s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 1.21s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.snapshot : Run snapshot module remove --------- 1.17s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 fedora.linux_system_roles.storage : Get service facts ------------------- 1.07s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Update facts ------------------------ 1.04s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 fedora.linux_system_roles.snapshot : Ensure required packages are installed --- 1.02s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6 Find unused disks in the system ----------------------------------------- 0.96s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/tests/snapshot/get_unused_disk.yml:23 fedora.linux_system_roles.snapshot : Run snapshot module mount ---------- 0.96s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:14 fedora.linux_system_roles.snapshot : Ensure required packages are installed --- 0.94s /tmp/collections-Abw/ansible_collections/fedora/linux_system_roles/roles/snapshot/tasks/main.yml:6
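
Note: the "Manage the pools and volumes to match the specified state" cleanup reported above removes three LVM pools (test_vg1 on sda-sdc, test_vg2 on sdd-sdf, test_vg3 on sdg-sdj) and their logical volumes lv1-lv8 by setting state: absent. A minimal sketch of a storage role invocation expressing that cleanup follows; it is reconstructed from the values in this log, not taken from the tests' actual task files, and the play name is illustrative.

- name: Remove the LVM pools and volumes used by the snapshot tests (illustrative sketch)
  hosts: managed-node4
  roles:
    - fedora.linux_system_roles.storage
  vars:
    # Pool, volume, and disk names below mirror the blivet output reported above.
    storage_pools:
      - name: test_vg1
        disks: [sda, sdb, sdc]
        state: absent
        volumes:
          - { name: lv1, state: absent }
          - { name: lv2, state: absent }
      - name: test_vg2
        disks: [sdd, sde, sdf]
        state: absent
        volumes:
          - { name: lv3, state: absent }
          - { name: lv4, state: absent }
      - name: test_vg3
        disks: [sdg, sdh, sdi, sdj]
        state: absent
        volumes:
          - { name: lv5, state: absent }
          - { name: lv6, state: absent }
          - { name: lv7, state: absent }
          - { name: lv8, state: absent }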